Redis Sampler README
====================

Redis Sampler is a utility to check the composition of your Redis dataset.

Using it is as simple as typing:

    ./redis-sampler.rb <host> <port> <db> <samplesize>

The host and port arguments are those of your Redis instance.
The db argument is the database to test: Redis uses database 0 by default,
unless your application writes against a different DB.

The sample size is the number of keys to inspect, picked at random using
RANDOMKEY. For large datasets a sample size of 100000 or more is
recommended.
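
For example, the report in the Example Output section below was produced
with:

    ./redis-sampler.rb localhost 6379 4 100000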

To understand how the program works, you can try a few runs with the
sample size set to just 1000, for fast execution.

It is highly recommended to run the program over the loopback interface or
a fast LAN; otherwise the execution time will be quite high.
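
Conceptually the sampling is just a loop around RANDOMKEY and TYPE,
collecting per-type statistics. A minimal sketch of the idea in Ruby,
reproducing only the TYPES section of the report (it assumes the redis-rb
gem is installed; the actual script is self-contained and may differ in
details):

    require 'redis'

    host, port, db, samplesize = ARGV
    samplesize = samplesize.to_i
    redis = Redis.new(:host => host, :port => port.to_i, :db => db.to_i)

    # Count how many sampled keys belong to each Redis type. Note that
    # RANDOMKEY can return the same key more than once, so the sample
    # is taken with replacement.
    types = Hash.new(0)
    samplesize.times do
        key = redis.randomkey
        break unless key                # empty database
        types[redis.type(key)] += 1
    end

    types.sort_by{|t,c| -c}.each{|type,count|
        printf(" %s: %d (%.2f%%)\n", type, count, 100.0*count/samplesize)
    }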

Example Output
==============

Sampling localhost:6379 DB:4 with 100000 RANDOMKEYS

TYPES
=====
 zset: 87558 (87.56%)     string: 12265 (12.27%)   set: 106 (0.11%)        
 hash: 50 (0.05%)         list: 21 (0.02%)        

STRINGS, SIZE OF VALUES
=======================
 6: 6065 (6.07%)          7: 1611 (1.61%)          13: 835 (0.83%)         
 15: 809 (0.81%)          5: 469 (0.47%)           31: 333 (0.33%)         
 20: 250 (0.25%)          2: 248 (0.25%)           27: 147 (0.15%)         
 42: 117 (0.12%)          159: 81 (0.08%)          47: 70 (0.07%)          
 1: 63 (0.06%)            28: 56 (0.06%)           34: 53 (0.05%)          
 41: 45 (0.04%)           38: 41 (0.04%)           22: 37 (0.04%)          
 139: 33 (0.03%)          29: 31 (0.03%)           55: 31 (0.03%)          
(suppressed 132 items with perc < 0.5% for a total of 88.58%)
Average: 15.61 Standard Deviation: 25.38

LISTS, NUMBER OF ELEMENTS
=========================
 5: 4 (0.00%)             7: 3 (0.00%)             13: 2 (0.00%)           
 11: 2 (0.00%)            8: 2 (0.00%)             12: 2 (0.00%)           
 10: 2 (0.00%)            2: 1 (0.00%)             25: 1 (0.00%)           
 15: 1 (0.00%)            14: 1 (0.00%)           
Average: 9.76 Standard Deviation: 4.84

LISTS, SIZE OF ELEMENTS
=======================
 7: 11 (0.01%)            6: 10 (0.01%)           
Average: 6.52 Standard Deviation: 0.50

SETS, NUMBER OF ELEMENTS
========================
 1: 21764 (21.76%)        2: 10662 (10.66%)        3: 6828 (6.83%)         
 4: 4904 (4.90%)          5: 3515 (3.52%)          6: 2974 (2.97%)         
 7: 2351 (2.35%)          8: 2178 (2.18%)          9: 2012 (2.01%)         
 10: 1773 (1.77%)         11: 1683 (1.68%)         12: 1511 (1.51%)        
 13: 1493 (1.49%)         15: 1286 (1.29%)         14: 1245 (1.25%)        
 16: 1104 (1.10%)         17: 1012 (1.01%)         18: 921 (0.92%)         
 20: 801 (0.80%)          19: 760 (0.76%)          22: 745 (0.74%)         
 21: 673 (0.67%)          23: 584 (0.58%)          24: 565 (0.56%)         
 25: 521 (0.52%)          27: 455 (0.46%)         
(suppressed 896 items with perc < 0.5% for a total of 25.68%)
Average: 1.15 Standard Deviation: 0.63

SETS, SIZE OF ELEMENTS
======================
 19: 93 (0.09%)           3: 8 (0.01%)             5: 2 (0.00%)            
 4: 2 (0.00%)             2: 1 (0.00%)            
Average: 17.08 Standard Deviation: 5.13

SORTED SETS, NUMBER OF ELEMENTS
===============================
 1: 21764 (21.76%)        2: 10662 (10.66%)        3: 6828 (6.83%)         
 4: 4904 (4.90%)          5: 3515 (3.52%)          6: 2974 (2.97%)         
 7: 2351 (2.35%)          8: 2178 (2.18%)          9: 2012 (2.01%)         
 10: 1773 (1.77%)         11: 1683 (1.68%)         12: 1511 (1.51%)        
 13: 1493 (1.49%)         15: 1286 (1.29%)         14: 1245 (1.25%)        
 16: 1104 (1.10%)         17: 1012 (1.01%)         18: 921 (0.92%)         
 20: 801 (0.80%)          19: 760 (0.76%)          22: 745 (0.74%)         
 21: 673 (0.67%)          23: 584 (0.58%)          24: 565 (0.56%)         
 25: 521 (0.52%)          27: 455 (0.46%)         
(suppressed 896 items with perc < 0.5% for a total of 25.68%)
Average: 25.21 Standard Deviation: 107.92

SORTED SETS, SIZE OF ELEMENTS
=============================
 6: 71045 (71.05%)        5: 7638 (7.64%)          4: 6924 (6.92%)         
 3: 1763 (1.76%)          2: 137 (0.14%)           9: 24 (0.02%)           
 1: 8 (0.01%)             8: 2 (0.00%)             30: 2 (0.00%)           
 13: 2 (0.00%)            32: 2 (0.00%)            7: 1 (0.00%)            
 23: 1 (0.00%)            34: 1 (0.00%)            25: 1 (0.00%)           
 39: 1 (0.00%)            33: 1 (0.00%)            48: 1 (0.00%)           
 27: 1 (0.00%)            54: 1 (0.00%)            10: 1 (0.00%)           
(suppressed 1 items with perc < 0.5% for a total of 12.44%)
Average: 5.69 Standard Deviation: 0.81

HASHES, NUMBER OF FIELDS
========================
 1: 24 (0.02%)            12: 16 (0.02%)           11: 9 (0.01%)           
 13: 1 (0.00%)           
Average: 6.56 Standard Deviation: 5.36

HASHES, SIZE OF FIELDS
======================
 17: 24 (0.02%)           22: 17 (0.02%)           13: 9 (0.01%)           
Average: 17.98 Standard Deviation: 3.23

HASHES, SIZE OF VALUES
======================
 13: 10 (0.01%)           3: 10 (0.01%)            408: 4 (0.00%)          
 407: 3 (0.00%)           409: 3 (0.00%)           14: 2 (0.00%)           
 6: 2 (0.00%)             396: 2 (0.00%)           5: 1 (0.00%)            
 2: 1 (0.00%)             354: 1 (0.00%)           12: 1 (0.00%)           
 368: 1 (0.00%)           393: 1 (0.00%)           360: 1 (0.00%)          
 412: 1 (0.00%)           379: 1 (0.00%)           410: 1 (0.00%)          
 405: 1 (0.00%)           411: 1 (0.00%)           378: 1 (0.00%)          
(suppressed 1 items with perc < 0.5% for a total of 99.95%)
Average: 187.34 Standard Deviation: 194.57
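
The Average and Standard Deviation lines are the mean and the (population)
standard deviation of each sampled distribution. As a sanity check, here is
how they can be recomputed from a frequency table of "value: count" pairs
(the stats helper below is illustrative, not part of the script):

    # Mean and population standard deviation of a frequency table
    # mapping each observed value to the number of times it was seen.
    def stats(freq)
        n    = freq.values.inject(0){|a,c| a+c}
        mean = freq.inject(0.0){|a,(v,c)| a + v*c} / n
        var  = freq.inject(0.0){|a,(v,c)| a + c*((v-mean)**2)} / n
        [mean, Math.sqrt(var)]
    end

    # The "LISTS, SIZE OF ELEMENTS" table above: size 7 seen 11 times,
    # size 6 seen 10 times.
    avg, std = stats(7 => 11, 6 => 10)
    printf("Average: %.2f Standard Deviation: %.2f\n", avg, std)
    # => Average: 6.52 Standard Deviation: 0.50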
