A very frequent use case is finding the top N keys by memory usage. For small databases, this is easily done by opening the CSV file and sorting it.
But for databases with more than 64K keys, the CSV approach fails because Excel cannot handle that many rows.
The solution is a script that does a single linear pass over all keys, maintains the top N keys by memory usage, and finally prints them out. Perhaps it could maintain some additional statistics as well.
See discussion on this over here - http://stackoverflow.com/questions/13673058/what-is-the-easiest-way-to-find-the-biggest-objects-in-redis/13681596#comment18794833_13681596
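A minimal sketch of such a linear pass, using a bounded min-heap so memory stays O(N) regardless of key count. The column names (`key`, `size_in_bytes`) are assumptions based on the memory-report CSV format; adjust them if your CSV differs.

```python
import csv
import heapq

def top_n_keys(csv_path, n=10, key_col="key", size_col="size_in_bytes"):
    """Single pass over the memory-report CSV, keeping only the n largest keys.

    Uses a min-heap of (size, key) pairs: the smallest of the current top n
    sits at heap[0], so each row needs at most one comparison and one
    heap operation. Column names are assumed, not guaranteed.
    """
    heap = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            size = int(row[size_col])
            if len(heap) < n:
                heapq.heappush(heap, (size, row[key_col]))
            elif size > heap[0][0]:
                # Evict the smallest of the current top n
                heapq.heapreplace(heap, (size, row[key_col]))
    # Return largest first
    return sorted(heap, reverse=True)
```

This avoids loading the whole file into a spreadsheet, so it works the same for 64K rows or 64M rows.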
I'm using Zoho for opening and sorting the CSV file now, after even Google Docs said the file was too big (their limit is 400,000 cells and 20MB). Zoho hasn't complained yet.