Array buffer allocation failed using NeuralNetworkGPU #434
Comments
Can you share any of your scripting? I feel like this is a memory leak. |
Of course. The script can be found here. |
What about the training datasets? And how many are there? |
The samples look like this. The dataset of spam mails I'm using is around 1,389 samples (and the one for legit mails is around 130). |
Are you encoding them? It would be very odd to feed this type of data into |
No, I'm just slapping them into an array and passing that to |
I would normalize or encode the data first, then feed them in. If you are using |
I normalize all the data in my set using the following snippet now:

function normalize(string) {
  const input = [];
  for (let i = 0; i < string.length; i++) {
    input.push(string.charCodeAt(i) / 1000);
  }
  return input;
}

The RAM is still climbing, but not nearly as sharply anymore. |
Unfortunately, after a little while it starts to climb sharply again and crashes with the same error :\ |
I have currently given up on this project, since I cannot find a way around the memory issue even with 128GB of RAM :\ |
Do you have 128GB of GPU memory? |
I consider this a high priority issue, will be giving it more attention when this lands: stackgl/headless-gl#168 |
No, 128GB of system RAM ("CPU RAM") on a machine at work (whose GPU has 8GB). I have tried to look into "the hashing trick" but I can't make sense of it (maybe Brain.JS could do something with it?). |
I'll look at this tomorrow morning, GPU problems are highest priority. |
I'll try to see what happens if I use |
Same with CPU only... After normalization, the samples look like this:
Though the sizes might vary. |
Can I get you to post sample data again? The link expired. |
This line seems to be erroneous, as

// Push the content into our trainingData
trainingData.push({
  input: content,
  output: dataset
}); |
I think I'm on to something, this is a key indicator:
The issue happens in what seems to be the initial setup of the app, which copies data so that it is in a format that will not leak memory, but ironically... it may leak memory. Still investigating. |
If you start out in |
I get nothing but stable nets while the network is running, so this begs the question: how much training data is there, in megabytes? |
With the latest version on my My variables in this piece of code are not really intuitive, but

trainingData.push({
  input: content,
  output: dataset
}); |
What happens in my code:
After this is done, I run After loading up that, it will call I hope I managed to clear up a bit what my code does :) |
How long is the longest email string? |
After this normalization, the longest one (guesstimated by just looking at the biggest samples) is |
I have attached the biggest one for the |
That is the problem. You are creating a net with 95,282 inputs. Have you tried an LSTM network with this type of data? |
To be clear, that'll be 95,282 inputs, 47,641 hidden layers, (and since hidden layers are 2d, that is |
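A back-of-the-envelope sketch of why a net that size cannot fit in memory. This assumes a single hidden layer of 47,641 neurons (the figure above), a 2-neuron output, and 4-byte float weights; `denseParamCount` is just an illustrative helper here, not a brain.js API:

```javascript
// Rough parameter count for a fully connected net: the input->hidden
// weight matrix dominates everything else.
function denseParamCount(inputSize, hiddenSize, outputSize) {
  return inputSize * hiddenSize      // input -> hidden weights
       + hiddenSize * outputSize     // hidden -> output weights
       + hiddenSize + outputSize;    // biases
}

const params = denseParamCount(95282, 47641, 2);
console.log(`${(params * 4 / 2 ** 30).toFixed(1)} GiB of float32 weights`);
// prints "16.9 GiB of float32 weights"
```

Training typically needs additional per-weight buffers (gradients, momentum terms), so the real footprint is a multiple of this.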
You could try something like this: https://medium.com/@tech_fort/classifying-text-with-neural-networks-and-mimir-in-javascript-94c9de20c0ac |
They actually use brain (the older deprecated one) with a bag of words technique. |
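For context, a bag-of-words encoding replaces the per-character input with one fixed-length count vector per email, so the input size no longer depends on email length. This is a minimal sketch with a made-up five-word vocabulary (a real one would be built from the training corpus):

```javascript
// Minimal bag-of-words encoder (illustrative, not the article's exact code).
const vocabulary = ['free', 'money', 'meeting', 'report', 'winner'];

function bagOfWords(text) {
  const tokens = text.toLowerCase().split(/\W+/).filter(Boolean);
  // One fixed-length slot per vocabulary word: the net's input size stays
  // constant no matter how long the email is.
  return vocabulary.map(word => tokens.filter(t => t === word).length);
}

console.log(bagOfWords('FREE money!! Claim your free money now'));
// → [2, 2, 0, 0, 0]
```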
Since we know what is causing the net to grow so large, I'm going to go ahead and close. I'm interested in hearing how other techniques work out, however. |
Oh wow, no wonder that happens then. I'm already looking into BOW along with "the hashing trick" and will report back when I'm ready with that. |
I just counted your longest input (input neurons, or values). Hidden layers (when not defined) are calculated here, and it is a dot-matrix multiplication: the hidden layer matrix size is height (input size) times width (hidden layer size). The output layer I actually got wrong; it is 2, not 1. |
Ah I see. |
saw nearly 10% improvement in speed in debugging #434
Looking forward to it! Maybe we can include some of it in brain.js. |
For the record, here is what I was using to test, and where I actually found a slight performance improvement for the older

const { NeuralNetworkGPU, NeuralNetwork } = require('./src');
const trainingInputSample = '0.065,0.066,0.067,0.032,0.077,0.069,0.078,0.083,0.032,0.072,0.069,0.065,0.076,0.084,0.072,0.032,0.082,0.069,0.08,0.079,0.082,0.084,0.01,0.01,0.01,0.01,0.074,0.117,0.108,0.121,0.044,0.032,0.05,0.053,0.116,0.104,0.032,0.05,0.048,0.049,0.056,0.01,0.069,0.118,0.101,0.11,0.105,0.11,0.103,0.032,0.082,0.101,0.112,0.111,0.114,0.116,0.032,0.045,0.032,0.072,0.101,0.097,0.108,0.116,0.104,0.121,0.032,0.077,0.101,0.11,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.089,0.111,0.117,0.032,0.109,0.105,0.103,0.104,0.116,0.032,0.098,0.101,0.032,0.051,0.048,0.043,0.032,0.105,0.11,0.032,0.097,0.103,0.101,0.032,0.098,0.117,0.116,0.032,0.116,0.104,0.101,0.114,0.101,0.032,0.105,0.115,0.032,0.11,0.111,0.032,0.114,0.101,0.097,0.115,0.111,0.11,0.032,0.121,0.111,0.117,0.032,0.097,0.114,0.101,0.11,0.116,0.032,0.119,0.097,0.107,0.105,0.11,0.103,0.032,0.117,0.112,0.032,0.119,0.105,0.116,0.104,0.032,0.111,0.114,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.116,0.111,0.032,0.115,0.108,0.101,0.101,0.112,0.032,0.119,0.105,0.116,0.104,0.032,0.097,0.032,0.115,0.116,0.105,0.102,0.102,0.045,0.101,0.114,0.101,0.099,0.116,0.105,0.111,0.11,0.046,0.01,0.01,0.01,0.01,0.073,0.116,0.032,0.103,0.101,0.116,0.115,0.032,0.121,0.111,0.117,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.102,0.097,0.115,0.116,0.032,0.097,0.11,0.1,0.032,0.107,0.101,0.101,0.112,0.115,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.097,0.108,0.108,0.032,0.11,0.105,0.103,0.104,0.116,0.032,0.108,0.111,0.11,0.103,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.076,0.111,0.111,0.107,0.032,0.11,0.111,0.119,0.032,0.116,0.111,0.032,0.115,0.101,0.101,0.032,0.116,0.104,0.105,0.115,0.032,0.102,0.111,0.114,0.117,0.109,0.117,0.108,0.097,0.032,0.111,0.11,0.032,0.115,0.104,0.097,0.114,0.107,0.032,0.116,0.097,0.11,0.107,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0
.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12
,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.062,0.01,0.01,0.01,0.01,0.01,0.075,0.101,0.118,0.105,0.11,0.032,0.097,0.11,0.1,0.032,0.077,0.097,0.114,0.107,0.058,0.032,0.089,0.111,0.117,0.032,0.097,0.11,0.1,0.032,0.121,0.111,0.117,0.114,0.032,0.119,0.105,0.102,0.101,0.032,0.119,0.105,0.108,0.108,0.032,0.108,0.111,0.118,0.101,0.032,0.116,0.104,0.105,0.115,0.032,0.115,0.116,0.117,0.102,0.102,0.032,0.045,0.032,0.105,0.116,0.115,0.032,0.117,0.11,0.114,0.101,0.097,0.108,0.032,0.097,0.11,0.1,0.032,0.119,0.111,0.114,0.107,0.115,0.032,0.101,0.118,0.101,0.114,0.121,0.116,0.105,0.109,0.101,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.01,0.06,0.104,0.116,0.116
,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.062,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.089,0.111,0.117,0.032,0.099,0.097,0.11,0.032,0.097,0.108,0.119,0.097,0.121,0.115,0.032,0.1,0.105,0.115,0.099,0.111,0.11,0.116,0.105,0.11,0.117,0.101,0.032,0.066,0.121,0.032,0.071,0.111,0.105,0.11,0.103,0.032,0.072,0.101,0.114,0.101,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108
,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.062,0.01,0.066,0.121,0.032,0.119,0.114,0.105,0.116,0.105,0.11,0.103,0.032,0.121,0.111,0.117,0.032,0.099,0.097,0.11,0.032,0.1,0.111,0.032,0.105,0.116,0.032,0.104,0.101,0.114,0.101,0.032,0.08,0.079,0.032,0.066,0.111,0.12,0.032,0.05,0.054,0.051,0.056,0.051,0.032,0.05,0.055,0.048,0.048,0.032,0.076,0.111,0.117,0.105,0.115,0.105,0.097,0.11,0.097,0.032,0.065,0.1
18,0.101,0.046,0.032,0.083,0.046,0.032,0.077,0.105,0.11,0.11,0.101,0.097,0.112,0.111,0.108,0.105,0.115,0.044,0.032,0.077,0.078,0.032,0.053,0.053,0.052,0.05,0.054,0.01,0.01,0.079,0.114,0.032,0.116,0.104,0.105,0.115,0.032,0.105,0.115,0.032,0.097,0.11,0.111,0.116,0.104,0.101,0.114,0.032,0.119,0.097,0.121,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.05
2,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.062,0.01,0.049,0.057,0.072,0.097,0.114,0.108,0.101,0.109,0.083,0.116,0.035,0.049,0.068,0.111,0.114,0.099,0.104,0.101,0.115,0.116,0.101,0.114,0.077,0.065,0.048,0.05,0.049,0.05,0.049,0.052,0.049,0.049,0.052,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.01,0.01,0.01,0.01,0.084,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.111,0.102,0.102,0.101,0.114,0.115,0.032,0.116,0.104,0.101,0.032,0.112,0.117,0.098,0.108,0.105,0.099,0.032,0.097,0.032,0.103,0.108,0.105,0.109,0.112,0.115,0.101,0.032,0.097,0.116,0.032,0.116,0.104,0.101,0.032,0.099,0.111,0.11,0.102,0.105,0.1,0.102,0.1,0.101,0.11,0.116,0.105,0.097,0.108,0.032,0.1,0.105,0.115,0.099,0.117,0.115,0.115,0.105,0.111,0.11,0.115,0.032,0.098,0.101,0.116,0.119,0.101,0.101,0.11,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.11,0.1,0.032,0.067,0.111,0.104,0.101,0.11,0.044,0.032,0.097,0.11,0.1,0.032,0.105,0.116,0.032,0.099,0.111,0.11,0.102,0.105,0.114,0.109,0.115,0.032,0.116,0.104,0.101,0.032,0.109,0.097,0.11,0.032,0.119,0.104,0.111,0.032,0.11,0.111,0.119,0.032,0.111,0.099,0.099,0.117,0.112,0.105,0.101,0.115,0.032,0.116,0.104,0.101,0.032,0.079,0.118,0.097,0.108,0.032,0.079,0.102,0.102,0.105,0.099,0.101,0.032,0.104,0.097,0.1,0.032,0.099,0.111,0.11,0.116,0.101,0.109,0.11
2,0.111,0.114,0.097,0.11,0.101,0.111,0.117,0.115,0.032,0.107,0.11,0.111,0.119,0.108,0.101,0.1,0.103,0.101,0.032,0.111,0.102,0.032,0.097,0.032,0.112,0.114,0.111,0.112,0.111,0.115,0.097,0.108,0.032,0.116,0.111,0.032,0.098,0.117,0.1,0.121,0.032,0.116,0.104,0.101,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.115,0.116,0.111,0.114,0.121,0.032,0.111,0.102,0.032,0.075,0.097,0.114,0.101,0.11,0.032,0.077,0.099,0.068,0.111,0.117,0.103,0.097,0.108,0.044,0.032,0.097,0.032,0.119,0.111,0.109,0.097,0.11,0.032,0.119,0.104,0.111,0.032,0.104,0.097,0.115,0.032,0.097,0.108,0.108,0.101,0.103,0.101,0.1,0.032,0.115,0.104,0.101,0.032,0.104,0.097,0.1,0.032,0.097,0.11,0.032,0.101,0.12,0.116,0.114,0.097,0.109,0.097,0.114,0.105,0.116,0.097,0.108,0.032,0.097,0.102,0.102,0.097,0.105,0.114,0.032,0.119,0.105,0.116,0.104,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.098,0.111,0.117,0.116,0.032,0.097,0.032,0.1,0.101,0.099,0.097,0.1,0.101,0.032,0.097,0.103,0.111,0.046,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.116,0.111,0.108,0.1,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.098,0.111,0.117,0.116,0.032,0.104,0.105,0.115,0.032,0.112,0.108,0.097,0.11,0.115,0.032,0.116,0.111,0.032,0.115,0.101,0.116,0.032,0.117,0.112,0.032,0.097,0.032,0.099,0.111,0.109,0.112,0.097,0.11,0.121,0.032,0.097,0.11,0.1,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.101,0.032,0.116,0.104,0.101,0.032,0.112,0.117,0.114,0.099,0.104,0.097,0.1,0.115,0.101,0.032,0.111,0.102,0.032,0.116,0.104,0.101,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.032,0.102,0.114,0.111,0.109,0.032,0.065,0.109,0.101,0.114,0.105,0.099,0.097,0.11,0.032,0.077,0.101,0.1,0.105,0.097,0.044,0.032,0.119,0.104,0.105,0.099,0.104,0.032,0.112,0.117,0.098,0.108,0.105,0.115,0.104,0.101,0.115,0.032,0.116,0.104,0.101,0.032,0.078,0.097,0.116,0.105,0.111,0.11,0.097,0.108,0.032,0.069,0.11,0.113,0.117,0.105,0.114,0.101,0.114,0.046,0.032,0.084,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.099,0.097,0.
112,0.116,0.117,0.114,0.101,0.115,0.032,0.119,0.104,0.097,0.116,0.032,0.097,0.112,0.112,0.101,0.097,0.114,0.115,0.032,0.116,0.111,0.032,0.098,0.101,0.032,0.097,0.032,0.114,0.111,0.117,0.116,0.105,0.11,0.101,0.032,0.098,0.117,0.115,0.105,0.11,0.101,0.115,0.115,0.032,0.099,0.111,0.11,0.118,0.101,0.114,0.115,0.097,0.116,0.105,0.111,0.11,0.032,0.111,0.102,0.032,0.115,0.101,0.118,0.101,0.114,0.097,0.108,0.032,0.109,0.097,0.116,0.116,0.101,0.114,0.115,0.032,0.111,0.11,0.032,0.116,0.104,0.101,0.105,0.114,0.032,0.097,0.103,0.101,0.11,0.1,0.097,0.046,0.032,0.084,0.104,0.101,0.032,0.097,0.117,0.1,0.105,0.111,0.032,0.105,0.115,0.032,0.109,0.117,0.1,0.1,0.108,0.101,0.1,0.032,0.097,0.11,0.1,0.032,0.116,0.104,0.101,0.032,0.109,0.101,0.097,0.11,0.105,0.11,0.103,0.032,0.111,0.102,0.032,0.084,0.114,0.117,0.109,0.112,0.039,0.115,0.032,0.117,0.115,0.101,0.032,0.111,0.102,0.032,0.116,0.104,0.101,0.032,0.119,0.111,0.114,0.1,0.032,0.034,0.034,0.099,0.097,0.102,0.115,0.104,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.105,0.115,0.032,0.1,0.105,0.115,0.112,0.117,0.116,0.101,0.1,0.032,0.098,0.121,0.032,0.116,0.104,0.101,0.032,0.116,0.119,0.111,0.032,0.115,0.105,0.1,0.101,0.115,0.046,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.073,0.032,0.11,0.101,0.101,0.1,0.032,0.116,0.111,0.032,0.111,0.112,0.101,0.11,0.032,0.117,0.112,0.032,0.097,0.032,0.099,0.111,0.109,0.112,0.097,0.11,0.121,0.032,0.102,0.111,0.114,0.032,0.116,0.104,0.101,0.032,0.116,0.114,0.097,0.11,0.115,0.102,0.101,0.114,0.032,0.111,0.102,0.032,0.097,0.108,0.108,0.032,0.111,0.102,0.032,0.116,0.104,0.097,0.116,0.032,0.105,0.11,0.102,0.111,0.032,0.114,0.101,0.103,0.097,0.114,0.1,0.105,0.11,0.103,0.032,0.111,0.117,0.114,0.032,0.102,0.114,0.105,0.101,0.11,0.1,0.032,0.068,0.097,0.118,0.105,0.1,0.044,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.115,0.097,0.105,0.1,0.032,0.105,0.11,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.044,0.032,0.108,0.105,0.107,0.101,0.108,0.121,0
.032,0.097,0.032,0.114,0.101,0.102,0.101,0.114,0.101,0.11,0.099,0.101,0.032,0.116,0.111,0.032,0.065,0.109,0.101,0.114,0.105,0.099,0.097,0.11,0.032,0.077,0.101,0.1,0.105,0.097,0.032,0.104,0.101,0.097,0.1,0.032,0.068,0.097,0.118,0.105,0.1,0.032,0.1,0.08,0.101,0.1,0.099,0.107,0.101,0.114,0.046,0.032,0.087,0.104,0.101,0.11,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.105,0.11,0.103,0.032,0.099,0.111,0.109,0.101,0.115,0.032,0.117,0.112,0.032,0.097,0.103,0.097,0.105,0.11,0.032,0.108,0.097,0.116,0.101,0.114,0.032,0.105,0.11,0.032,0.116,0.104,0.101,0.032,0.099,0.111,0.11,0.118,0.101,0.114,0.115,0.097,0.116,0.105,0.111,0.11,0.044,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.105,0.11,0.116,0.101,0.114,0.114,0.117,0.112,0.116,0.115,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.097,0.115,0.107,0.105,0.11,0.103,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.087,0.104,0.097,0.116,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.105,0.11,0.103,0.063,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.097,0.099,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.046,0.032,0.087,0.104,0.101,0.11,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.116,0.101,0.108,0.108,0.115,0.032,0.084,0.114,0.117,0.109,0.112,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.087,0.101,0.038,0.035,0.051,0.057,0.059,0.108,0.108,0.032,0.104,0.097,0.118,0.101,0.032,0.116,0.111,0.032,0.112,0.097,0.121,0.044,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.105,0.115,0.032,0.104,0.101,0.097,0.114,0.1,0.032,0.115,0.097,0.121,0.105,0.11,0.103,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.112,0.097,0.121,0.032,0.119,0.105,0.116,0.104,0.032,0.099,0.1,0.097,0.115,0.104,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.098,0.117,0.116,0.032,0.116,0.104,0.101,0.032,0.097,0.117,0.1,0.105,0.111,0.032,0.105,0.115,0.032,0.109,0.117,0.1,0.1,0.108,0.101,0.1,0.032,0.097,0.11,0.1,0.032,0.105,0.116,0.038,0.035,0.051,0.057,0.059,0.115,0.032,
0.117,0.11,0.099,0.108,0.101,0.097,0.114,0.032,0.119,0.104,0.101,0.116,0.104,0.101,0.114,0.032,0.104,0.101,0.032,0.115,0.117,0.103,0.103,0.101,0.115,0.116,0.115,0.032,0.112,0.097,0.121,0.105,0.11,0.103,0.032,0.119,0.105,0.116,0.104,0.032,0.099,0.097,0.102,0.115,0.104,0.032,0.111,0.114,0.032,0.11,0.111,0.116,0.032,0.112,0.097,0.121,0.105,0.11,0.103,0.046,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.115,0.097,0.121,0.115,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.11,0.111,0.044,0.032,0.11,0.111,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.098,0.117,0.116,0.032,0.105,0.116,0.032,0.105,0.115,0.032,0.11,0.111,0.116,0.032,0.099,0.108,0.101,0.097,0.114,0.032,0.119,0.104,0.097,0.116,0.032,0.105,0.115,0.032,0.115,0.097,0.105,0.1,0.032,0.11,0.101,0.12,0.116,0.046,0.032,0.078,0.111,0.032,0.112,0.097,0.121,0.109,0.101,0.11,0.116,0.032,0.119,0.097,0.115,0.032,0.101,0.118,0.101,0.114,0.032,0.109,0.097,0.1,0.101,0.032,0.102,0.114,0.111,0.109,0.032,0.084,0.114,0.117,0.109,0.112,0.044,0.032,0.082,0.117,0.1,0.121,0.032,0.071,0.105,0.117,0.108,0.105,0.097,0.11,0.105,0.044,0.032,0.116,0.104,0.101,0.032,0.08,0.114,0.101,0.115,0.105,0.1,0.101,0.11,0.116,0.038,0.035,0.051,0.057,0.059,0.115,0.032,0.097,0.116,0.116,0.111,0.114,0.11,0.101,0.121,0.044,0.032,0.104,0.097,0.115,0.032,0.115,0.097,0.105,0.1,0.046,0.032,0.071,0.105,0.117,0.108,0.105,0.097,0.11,0.105,0.032,0.104,0.097,0.115,0.032,0.112,0.114,0.101,0.118,0.105,0.111,0.117,0.115,0.108,0.121,0.032,0.097,0.099,0.107,0.11,0.111,0.119,0.108,0.101,0.1,0.103,0.101,0.1,0.032,0.116,0.104,0.097,0.116,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.101,0.1,0.032,0.1,0.105,0.115,0.099,0.117,0.115,0.115,0.105,0.111,0.11,0.032,0.114,0.101,0.108,0.097,0.116,0.101,0.1,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.098,0.117,0.1,0.102,0.121,0.105,0.11,0.103,0.032,0.116,0.104,0.101,0.032,0.115,0.116,0.111,0.114,0.121,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.046,0.01,0.01,0.01,0.01,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0
.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.01,0.01,0.01,0.01'.split(',').map(v => v * 1);
const trainingData = [];
for (let i = 0; i < 1000; i++) {
  trainingData.push({
    input: shuffleArray(trainingInputSample),
    output: { [Math.random() > 0.51 ? 'spam' : 'legit']: [1] }
    // output: Math.random() > 0.51 ? 'spam' : 'legit'
  });
}
const net = new NeuralNetwork();
const result = net.train(trainingData, { iterations: 10, log: true, callbackPeriod: 1 });
console.log(net.run(shuffleArray(trainingInputSample)));

function shuffleArray(array) {
  array = array.slice(0);
  for (let i = array.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    const temp = array[i];
    array[i] = array[j];
    array[j] = temp;
  }
  return array;
} |
I currently use this piece of code for creating a vector based on this pseudo-code.

// Define some variables
let vectorDimensions = 5;
let vocabulary = ['the', 'quick', 'brown', 'fox'];

// Create a vector
let vector = new Array(vectorDimensions).fill(0);

// Loop over each word in our list
await asyncForEach(vocabulary, async (feature) => {
  let h = fhash(crypto.createHash('sha512').update(feature).digest('hex'));
  let idx = h % vectorDimensions;
  vector[idx] += 1;
});

note that |
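For illustration, here is a self-contained, synchronous take on the same hashing trick, with a simple FNV-1a hash standing in for the unspecified fhash/sha512 combination above (the names here are illustrative, not from the original script):

```javascript
// Hashing trick without external helpers: hash each token to a bucket
// index and count occurrences, collisions and all.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0; // keep it an unsigned 32-bit value
  }
  return h;
}

function hashVector(tokens, dimensions) {
  const vector = new Array(dimensions).fill(0);
  for (const token of tokens) {
    vector[fnv1a(token) % dimensions] += 1;
  }
  return vector;
}

console.log(hashVector(['the', 'quick', 'brown', 'fox'], 5));
```

As with bag-of-words, the vector length is fixed by `dimensions`, so the net's input size stays constant regardless of email length.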
What is wrong?
I'm trying to train a NN using the
NeuralNetworkGPU
class; however, it likes to eat all the memory from my PC (24GiB, of which 19.3GiB is usable for the script), which would be fine... if it were to just work. As the title of the issue says, at some point the script dies with an
Array buffer allocation failed
error. This means that it has run out of RAM, but I wanted to know if there's a way to avoid hitting this issue (considering I already have a decent amount of memory and can't easily add more; hooray, finances).
Where does it happen?
Training a NN using my PC with
NeuralNetworkGPU
How do we replicate the issue?
How important is this (1-5)?
2
Expected behavior (i.e. solution)
The script not crashing (I guess?)
Other Comments
Stacktrace: