Please help, paging doesn't work, I am getting MAD ! #360

Open

typhoonsimon opened this issue Mar 12, 2020 · 19 comments · May be fixed by #363

Comments

@typhoonsimon

Hi,

I have installed and synced the explorer, and fixed several things that I will submit back to your repo as a pull request. Please help me with this issue because it is driving me mad. The page limit doesn't work, and the requests are killing my server:
http://lcp.altcoinwarz.com

I have isolated the issue to this query in database.js:
```js
get_last_txs_ajax: function(start, length, min, cb) {
  Tx.countDocuments({'total': {$gte: min}}, function(err, count) {
    Tx.find({'total': {$gte: min}}).sort({blockindex: 'desc'}).skip(Number(start)).limit(Number(length)).exec(function(err, txs) {
      if (err) {
        return cb(err);
      } else {
        return cb(txs, count);
      }
    });
  });
},
```

No matter what you pass to that function, it always returns the entire database. Please help me because this is very urgent for me! I am sure everything else is correct; I have already tested on different OSes with the same result.

Thank you,
Cheers,
Simone

@uaktags
Collaborator

uaktags commented Mar 12, 2020

Have you isolated this down by running the actual mongo commands themselves directly? Also, if you have a repo, perhaps you can show it or show the pr just to see if there may be other conflicting issues. Just by the snippet we already know what it's doing, which is by design:

  1. Count the total documents in Tx where the total is >= the min. Pass any error as err and the count as count.
  2. Within Tx, find all documents where the total is >= the min and sort them by blockindex in descending order. However, skip by the number provided in the start variable, and limit by the number provided in the length variable. Pass any error as err and the resulting set of documents as txs.

So here, we know that it should indeed only be returning you a total of "length" documents. I'd console.log the variables just to be sure we know what's actually being passed, so it can be tested against the db.
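
For example, something along these lines in the mongo shell should mirror the snippet above (the 0/0/10 values are just placeholders for min/start/length, not values from this thread):

```js
// Rough mongo-shell equivalent of get_last_txs_ajax; `txes` is the collection
// behind the Tx model, and 0, 0 and 10 stand in for min, start and length.
db.txes.countDocuments({ total: { $gte: 0 } })
db.txes.find({ total: { $gte: 0 } }).sort({ blockindex: -1 }).skip(0).limit(10)

// And inside get_last_txs_ajax, a quick log of what is actually being passed:
console.log('get_last_txs_ajax', { start: start, length: length, min: min });
```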

@uaktags
Collaborator

uaktags commented Mar 12, 2020

Never mind, I think I see where your concern is.

explorer/views/index.pug

Lines 46 to 77 in 56e3752

```js
var rtable = $('#recent-table').dataTable({
  autoWidth: false,
  searching: false,
  ordering: false,
  responsive: false,
  lengthChange: true,
  processing: true,
  serverSide: true,
  iDisplayLength: displayLengthMax,
  lengthMenu: lengthMenuOpts,
  ajax: '/ext/getlasttxsajax/0',
  rowCallback: function(row, data, index) {
    var blockindex = data[0]; //variables for better readability
    var blockhash = data[1]; //variables for better readability
    var txhash = data[2]; //variables for better readability
    var outputs = data[3]; //variables for better readability
    var amount = (data[4] / 100000000).toLocaleString('en',{'minimumFractionDigits':2,'maximumFractionDigits':8,'useGrouping':true}); //variables for better readability
    var amountParts = amount.split('.');
    var amount = amountParts[0] + '.<span class="decimal">' + amountParts[1] + '</span>';
    var timestamp = data[5]; //variables for better readability
    $("td:eq(0)", row).html('<a href="/block/' + blockhash + '">' + blockindex + '</a>');
    $("td:eq(1)", row).html('<a href="/tx/' + txhash + '">' + txhash + '</a>').addClass("d-none d-md-none d-lg-table-cell text-center hidden-xs");
    $("td:eq(2)", row).html(outputs).addClass("d-none d-md-none d-lg-table-cell text-center hidden-xs");
    $("td:eq(3)", row).html(amount);
    $("td:eq(4)", row).html(timestamp);
  },
});
setInterval(function () {
  rtable.api().ajax.reload(null, false);
  stable.api().ajax.reload(null, false);
}, 60000);
});
```

explorer/app.js

Lines 147 to 173 in 56e3752

```js
app.use('/ext/getlasttxsajax/:min', function(req, res) {
  if (typeof req.query.length === 'undefined' || isNaN(req.query.length) || req.query.length > settings.index.last_txs) {
    req.query.length = settings.index.last_txs;
  }
  if (typeof req.query.start === 'undefined' || isNaN(req.query.start) || req.query.start < 0) {
    req.query.start = 0;
  }
  if (typeof req.params.min === 'undefined' || isNaN(req.params.min) || req.params.min < 0) {
    req.params.min = 0;
  } else {
    req.params.min = (req.params.min * 100000000);
  }
  db.get_last_txs_ajax(req.query.start, req.query.length, req.params.min, function(txs, count) {
    var data = [];
    for (i = 0; i < txs.length; i++) {
      var row = [];
      row.push(txs[i].blockindex);
      row.push(txs[i].blockhash);
      row.push(txs[i].txid);
      row.push(txs[i].vout.length);
      row.push((txs[i].total));
      row.push(new Date((txs[i].timestamp) * 1000).toUTCString());
      data.push(row);
    }
    res.json({"data": data, "draw": req.query.draw, "recordsTotal": count, "recordsFiltered": count});
  });
});
```

In reality, the expected (and desired) behavior is that the table would only return the 10 rows (or however many are selected), and then as you page you get the next set. By default right now, the entire dataset is returned and DataTables is configured to page after the fact (rather than in real time). At some point we opted to reduce the calls to a single call rather than multiple, which is great for smaller blockchains but is fairly obviously disastrous for larger ones.
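
For reference, a rough illustration of what server-side paging looks like with the DataTables protocol used above (the numbers are placeholders):

```js
// Request DataTables sends on its own once server-side paging is handled:
//   GET /ext/getlasttxsajax/0?draw=2&start=10&length=10
//
// Response the route is expected to return: only the requested page of rows,
// plus the overall counts so the pager can still be drawn.
var exampleResponse = {
  "draw": 2,                  // echoed back from the request
  "recordsTotal": 1234567,    // total rows matching the query
  "recordsFiltered": 1234567, // same here, since no search filter is applied
  "data": [ /* the 10 requested rows only */ ]
};
```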

@typhoonsimon
Author

Hi guys,

A single query is the right way to go, but it has to be the correct query; selecting millions of records just to filter out 10 is not. I am now studying Mongo and will change the query. We can then think about putting a selector in the settings for the type of query to execute.

Cheers,
Regards,
Simone

@typhoonsimon
Author

It is not only the main list; every other page is also slow and unusable. Do you have a quick fix to let me go back to the old way? It is killing my server completely, and it is a 12-core 9th-generation i7...

@typhoonsimon
Author

No luck. I have downgraded to an old version I had; the new one was so slow that even this powerful machine couldn't cope and mongo crashed altogether. If you have a quick suggestion, I'd give it another try.

@typhoonsimon
Author

typhoonsimon commented Mar 13, 2020

The old version, indexed with the new sync.js, of course doesn't work properly either (after 5 days of syncing):
http://lcp.altcoinwarz.com/address/XFLxtqNoJz3hS8H5YENEDfP4F2i8y94FzK. Transactions are no longer saved into the address object.

If you guys could please find a MongoDB query that can replace this monster (database.js):
```js
Tx.countDocuments({'total': {$gt: min}}, function(err, count) {
  Tx.find({'total': {$gt: min}}).skip(Number(start)).sort({blockindex: 'desc'}).limit(Number(length)).exec(function(err, txs) {
    // ...
  });
});
```

EDIT: I have tried a billion ways, to the point of madness, for 2 days. There is absolutely no way to get anything out of MongoDB other than a query that lasts 20s.

Thank you,
Cheers,
Simone

@uaktags
Collaborator

uaktags commented Mar 13, 2020

I'm not sure where the 20 seconds is coming from, because I can easily hit http://lcp.altcoinwarz.com/ext/getlasttxsajax/0 in less than 5 seconds, regardless...

The issue is that you're dealing with a lot of data to parse through, and with the way that data is stored. You basically have a coin that has reached a point where either the data's storage scheme needs to be greatly altered (and syncing will take dramatically longer for the extra processing) or you'll have to use live RPC calls like Insight does.

Remember, you're querying Transactions. A transactions table will always be the largest part of any scheme. We don't store by blocks (which would possibly make the front page faster), and addressHistory is a new thing, so old versions won't have it.

The "quick fix" would be to make the "Latest Transactions" only be the latest X transactions, and not return everything to the datatable.
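
Something along these lines would be the minimal sketch (untested against this codebase, and assuming settings is in scope in database.js as it is for the route above):

```js
// Sketch only: cap the query at the configured number of recent transactions
// instead of paging through the whole collection; settings.index.last_txs is
// the existing config value already used by /ext/getlasttxsajax.
Tx.find({ total: { $gte: min } })
  .sort({ blockindex: 'desc' })
  .limit(Number(settings.index.last_txs))
  .exec(function (err, txs) {
    if (err) return cb(err);
    return cb(txs, txs.length); // report only the capped count to the table
  });
```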

Hope that helps.

Also, what other fixes have you coded for the PR?

@typhoonsimon
Author

I am using the old version of the explorer now, that's why it works. Otherwise my users would strangle me for bringing all the exchanges down again.

I already fixed one problem, the index view. You guys forgot some indexes on the MongoDB collection, and the query has now been optimized.

Now onto the address page, which is also unusable. I have already found the problem: it is the aggregate query, specifically the sort stage:

```js
AddressTx.aggregate([
  { $match: { a_id: hash } },
  { $sort: { blockindex: -1 } },
  { $skip: Number(start) },
  { $group: { _id: '', balance: { $sum: '$amount' } } },
  // ...
])
```

Can you find a way to work around that? Like writing two queries? The sort over those intermediate results cannot be indexed, and will always be slow.
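
For what it's worth, a rough sketch of the two-query idea I have in mind (a sketch only, not yet tested; it assumes the Mongoose AddressTx model and the hash/start/length variables as used above):

```js
// 1) Balance: no sort is needed just to sum the amounts, so $group alone is enough.
AddressTx.aggregate([
  { $match: { a_id: hash } },
  { $group: { _id: null, balance: { $sum: '$amount' } } }
]).exec(function (err, result) {
  var balance = (result && result.length) ? result[0].balance : 0;
  // ...
});

// 2) Paged history: a plain find() + sort/skip/limit can be served directly
//    by a compound index on { a_id: 1, blockindex: -1 }.
AddressTx.find({ a_id: hash })
  .sort({ blockindex: -1 })
  .skip(Number(start))
  .limit(Number(length))
  .exec(function (err, rows) {
    // ...
  });
```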

Simone

@typhoonsimon
Author

PS: I have made many other fixes, like the PoW + PoS display, and peer sync works properly now (number of peers = peers in the db). I will push back to your repo once I'm done.

Simone

@typhoonsimon
Author

I have solved it all: you must create a compound index for that query on the address transactions, otherwise it makes the CPU go crazy.

```js
db.addresstxes.createIndex({ a_id: 1, blockindex: -1 })
```

After adding this to mongo, it is now lightning fast:
http://lcp.altcoinwarz.com:81/

I activated a test version. I will open a PR with the changes, but you will need to add the DB indexes to the code yourselves, as I don't know where they belong (I'll list them here).

Simone

@uaktags
Collaborator

uaktags commented Mar 13, 2020

Have you tried expanding the front index view back to showing all records, rather than 1000, to see the performance difference for a 1:1 comparison?

You can add the index to https://github.com/iquidus/explorer/blob/master/models/addresstx.js
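
Roughly like this, declared at schema time (a sketch only; the schema variable name and field definitions here are illustrative, so adapt it to whatever models/addresstx.js actually contains):

```js
// Illustrative Mongoose schema for address transactions; only a_id, blockindex
// and amount are taken from the discussion above, everything else is omitted.
var mongoose = require('mongoose');

var AddressTXSchema = new mongoose.Schema({
  a_id: { type: String },
  blockindex: { type: Number, default: 0 },
  amount: { type: Number, default: 0 }
});

// Compound index so the { a_id } match plus the blockindex sort can be served
// by a single index instead of an in-memory sort.
AddressTXSchema.index({ a_id: 1, blockindex: -1 });

module.exports = mongoose.model('AddressTx', AddressTXSchema);
```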

@typhoonsimon
Author

@uaktags yes, it would be flashing fast, so the index worked. But I don't want that for my users; they can search transactions by hand if necessary.

OK, I will add the indexes there. But not only that table: I also had to add at least 2 indexes to the txes table.

Cheers,
Simone

@uaktags
Collaborator

uaktags commented Mar 13, 2020

If you have a test version, I'd love to see it flashing fast with all the millions of records, as showing all documents is the default that everyone has in the current iteration of 1.7.3 (unless your PR includes settings.json updates for configurability).
Please note that a lot of these already have indexes, so I'd be curious to see how your indexes change the ones already in place.
Can't wait!

@uaktags
Collaborator

uaktags commented Mar 13, 2020

http://lcp.altcoinwarz.com:81/ext/getlasttxsajax/500099909999
vs
http://lcp.altcoinwarz.com:81/ext/getlasttxsajax/0
are showing the same results, so it appears either something is broken or you changed something in the getlasttxsajax/:min functionality.

@typhoonsimon
Author

typhoonsimon commented Mar 13, 2020

I have modified that to force-pass 0. For the moment I intend to run it like this, but I will try to run it with the correct options later on.

When I am sure no performance problem can bring it down, I will leave it at full. I cannot allow even 2 minutes of the explorer being down, or the exchanges will immediately suspend all withdrawals/deposits, with all of our users inundating the support group :)

Simone

@typhoonsimon
Author

typhoonsimon commented Mar 14, 2020

Sorry, I don't know how to add the indexes in the code. Once I have pushed to your repo, please add them yourselves; they need to be declared separately from the schema and added afterwards:

```js
// txes
db.txes.createIndex({ total: 1 })
db.txes.createIndex({ total: -1 })
db.txes.createIndex({ blockindex: 1 })
db.txes.createIndex({ blockindex: -1 })

// addresstxes
db.addresstxes.createIndex({ a_id: 1, blockindex: -1 })
```

Simone

@typhoonsimon
Author

Also: countDocuments() should be avoided as much as possible. It is like doing two queries; over large databases it doubles the effort for nothing. It is better to use other means whenever the entire count is needed, like the block count.
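
For example, something in this direction (a sketch only, assuming a Mongoose version that provides estimatedDocumentCount(); this is not the exact code from my PR):

```js
// When no filter is applied, estimatedDocumentCount() reads collection metadata
// instead of scanning every document, so the "second query" becomes nearly free.
var countQuery = (Number(min) === 0)
  ? Tx.estimatedDocumentCount()
  : Tx.countDocuments({ total: { $gte: min } });

countQuery.exec(function (err, count) {
  if (err) return cb(err);
  // ...run the paged find() as before and pass count through to the table
});
```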

I have finished all the changes and verified that it is super fast even over the entire database. I will update my repo and open the PR to your side.

Cheers,
Simone

@uaktags
Collaborator

uaktags commented Mar 14, 2020

`db.txes.createIndex({total: 1})` is not needed, as it's already in the schema; in fact, trying to create it from the mongo CLI should warn/error out.
These should all be added at the time of the schema definition and will go into the models.
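
Illustratively, something like this in the model (the TxSchema name is assumed here, not taken from the repo). Note that a single-field index can be traversed in either direction, so separate {field: 1} and {field: -1} indexes aren't needed:

```js
// Sketch only: declaring the remaining index at schema time rather than from
// the mongo CLI; 'total' already has index: true in the existing schema.
TxSchema.index({ blockindex: -1 });
```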

Still awaiting the PR.

uaktags linked a pull request Mar 14, 2020 that will close this issue
@typhoonsimon
Author

Cheers man. If you set last_txs=0 in the config file, it will show the entire chain.
