Array buffer allocation failed using NeuralNetworkGPU #434

Closed

FinlayDaG33k opened this issue Aug 18, 2019 · 37 comments

@FinlayDaG33k
Contributor


What is wrong?

I'm trying to train a NN using the NeuralNetworkGPU class; however, it likes to eat all the memory on my PC (24GiB, of which 19.3GiB is usable for the script), which would be fine... if it were to just work...
As the title of the issue says, at some point the script dies with an Array buffer allocation failed error.

This means that it has run out of RAM to use, but I wanted to know if there's a way to not run into this issue (considering I already have a decent amount of memory and can't easily add more, hooray finances).

Where does it happen?

Training a NN using my PC with NeuralNetworkGPU

How do we replicate the issue?

  • Train a neural network (I guess?)

How important is this (1-5)?

2

Expected behavior (i.e. solution)

The script not crashing (I guess?)

Other Comments

Stacktrace:

(node:15708) UnhandledPromiseRejectionWarning: RangeError: Array buffer allocation failed
    at new ArrayBuffer (<anonymous>)
    at new Float32Array (<anonymous>)
    at zeros (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\utilities\zeros.js:2:10)
    at NeuralNetworkGPU.initialize (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network.js:104:39)
    at NeuralNetworkGPU.initialize (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network-gpu.js:127:11)
    at NeuralNetworkGPU.verifyIsInitialized (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network.js:295:10)
    at NeuralNetworkGPU.prepTraining (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network-gpu.js:436:10)
    at NeuralNetworkGPU.train (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network.js:467:39)
    at train (F:\FinlayDaG33k\brainspam\src\train.js:107:7)
@robertleeplummerjr
Contributor

Can you share any of your scripting? I feel like this is a memory leak.

@FinlayDaG33k
Contributor Author

Of course.

The script can be found here.

@robertleeplummerjr
Contributor

What about the training datasets? And how many are there?

@FinlayDaG33k
Contributor Author

FinlayDaG33k commented Aug 19, 2019

The samples look like this

The dataset for spam mails I'm using is around 1,389 samples big (and the one for legit mails is 130 samples).

@robertleeplummerjr
Contributor

Are you encoding them? It would be very odd to feed this type of data into NeuralNetworkGPU.

@FinlayDaG33k
Contributor Author

FinlayDaG33k commented Aug 19, 2019

No, I'm just slapping them into an array and passing that to NeuralNetworkGPU

@robertleeplummerjr
Contributor

I would normalize or encode the data first, then feed them in. If you are using NeuralNetworkGPU, all the inputs need to be the same size, and all the outputs need to be the same size. Try with 2.0.0 alpha 6, released earlier today, and let me know if it is any better.
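
For getting same-size inputs, something like this padding helper would do the trick (a sketch, not code from either repo; maxLength is assumed to be the length of your longest sample):

// Pad (or truncate) every normalized sample to one fixed length so the
// net always sees same-size inputs; unused slots stay 0.
function toFixedLength(values, maxLength) {
  const padded = new Float32Array(maxLength); // zero-filled by default
  padded.set(values.slice(0, maxLength));
  return padded;
}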

@FinlayDaG33k
Contributor Author

I normalize all the data in my set using the following snippet now:

function normalize(string) {
  const input = [];
  for (let i = 0; i < string.length; i++) {
    input.push(string.charCodeAt(i) / 1000);
  }
  return input;
}
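
For example, normalize("ABC") gives [0.065, 0.066, 0.067] (character codes 65-67, each divided by 1000).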

The RAM usage is still climbing, but not nearly as sharply anymore.
I'll let this run for a while (with the alpha 6) to see what happens.

@FinlayDaG33k
Contributor Author

Unfortunately, after a little while, it starts to climb hard again and crashes with the same error :\

@FinlayDaG33k
Contributor Author

I have currently given up on this project since I cannot find a way to get around the memory issue with even 128GB of RAM :\

@robertleeplummerjr
Contributor

Do you have 128GB of GPU memory?

@robertleeplummerjr
Contributor

robertleeplummerjr commented Sep 3, 2019

I consider this a high priority issue, will be giving it more attention when this lands: stackgl/headless-gl#168

@FinlayDaG33k
Contributor Author

No, 128GB of system RAM ("CPU RAM") on a machine at work (whose GPU has 8GB).
The weird part is that decreasing the number of training samples doesn't change anything.

I have tried to look into "The Hashing Trick" but I can't make sense of it (maybe Brain.JS could do something with it?).

@robertleeplummerjr
Contributor

I'll look at this tomorrow morning, GPU problems are highest priority.

@FinlayDaG33k
Contributor Author

I'll try to see what happens if I use brain.NeuralNetwork() instead to see if that changes anything.

@FinlayDaG33k
Contributor Author

Same with CPU only...

After normalization, the samples look like this:

0.065,0.066,0.067,0.032,0.077,0.069,0.078,0.083,0.032,0.072,0.069,0.065,0.076,0.084,0.072,0.032,0.082,0.069,0.08,0.079,0.082,0.084,0.01,0.01,0.01,0.01,0.074,0.117,0.108,0.121,0.044,0.032,0.05,0.053,0.116,0.104,0.032,0.05,0.048,0.049,0.056,0.01,0.069,0.118,0.101,0.11,0.105,0.11,0.103,0.032,0.082,0.101,0.112,0.111,0.114,0.116,0.032,0.045,0.032,0.072,0.101,0.097,0.108,0.116,0.104,0.121,0.032,0.077,0.101,0.11,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.089,0.111,0.117,0.032,0.109,0.105,0.103,0.104,0.116,0.032,0.098,0.101,0.032,0.051,0.048,0.043,0.032,0.105,0.11,0.032,0.097,0.103,0.101,0.032,0.098,0.117,0.116,0.032,0.116,0.104,0.101,0.114,0.101,0.032,0.105,0.115,0.032,0.11,0.111,0.032,0.114,0.101,0.097,0.115,0.111,0.11,0.032,0.121,0.111,0.117,0.032,0.097,0.114,0.101,0.11,0.116,0.032,0.119,0.097,0.107,0.105,0.11,0.103,0.032,0.117,0.112,0.032,0.119,0.105,0.116,0.104,0.032,0.111,0.114,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.116,0.111,0.032,0.115,0.108,0.101,0.101,0.112,0.032,0.119,0.105,0.116,0.104,0.032,0.097,0.032,0.115,0.116,0.105,0.102,0.102,0.045,0.101,0.114,0.101,0.099,0.116,0.105,0.111,0.11,0.046,0.01,0.01,0.01,0.01,0.073,0.116,0.032,0.103,0.101,0.116,0.115,0.032,0.121,0.111,0.117,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.102,0.097,0.115,0.116,0.032,0.097,0.11,0.1,0.032,0.107,0.101,0.101,0.112,0.115,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.097,0.108,0.108,0.032,0.11,0.105,0.103,0.104,0.116,0.032,0.108,0.111,0.11,0.103,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.076,0.111,0.111,0.107,0.032,0.11,0.111,0.119,0.032,0.116,0.111,0.032,0.115,0.101,0.101,0.032,0.116,0.104,0.105,0.115,0.032,0.102,0.111,0.114,0.117,0.109,0.117,0.108,0.097,0.032,0.111,0.11,0.032,0.115,0.104,0.097,0.114,0.107,0.032,0.116,0.097,0.11,0.107,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047
,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.062,0.01,0.01,0.01,0.01,0.01,0.075,0.101,0.118,0.105,0.11,0.032,0.097,0.11,0.1,0.032,0.077,0.097,0.114,0.107,0.058,0.032,0.089,0.111,0.117,0.032,0.097,0.11,0.1,0.032,0.121,0.111,0.117,0.114,0.032,0.119,0.105,0.102,0.101,0.032,0.119,0.105,0.108,0.108,0.032,0.108,0.111,0.118,0.101,0.032,0.116,0.104,0.105,0.115,0.032,0.115,0.116,0.117,0.102,0.102,0.032,0.045,0.032,0.105,0.116,0.115,0.032,0.117,0.11,0.114,0.101,0.097,0.108,0.032,0.097,0.11,0.1,0.032,0.119,0.111,0.114,0.107,0.115,0.032,0.101,0.118,0.101,0.114,0.121,0.116,0.105,0.109,0.101,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0
.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.062,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.089,0.111,0.117,0.032,0.099,0.097,0.11,0.032,0.097,0.108,0.119,0.097,0.121,0.115,0.032,0.1,0.105,0.115,0.099,0.111,0.11,0.116,0.105,0.11,0.117,0.101,0.032,0.066,0.121,0.032,0.071,0.111,0.105,0.11,0.103,0.032,0.072,0.101,0.114,0.101,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.062,0.01,0.066,0.121,0.032,0.119,0.114,0.105,0.116,0.105,0.11,0.103,0.032,0.121,0.111,0.117,0.032,0.099,0.097,0.11,0.032,0.1,0.111,0.032,0.105,0.116,0.032,0.104,0.101,0.114,0.101,0.032,0.08,0.079,0.032,0.066,0.111,0.12,0.032,0.05,0.054,0.051,0.056,0.051,0.032,0.05,0.055,0.048,0.048,0.032,0.076,0.111,0.117,0.105,0.115,0.105,0.097,0.11,0.097,0.032,0.065,0.118,0.101,0.046,0.032,0.083,0.046,0.032,0.077,0.105,0.11,0.11,0.101,0.097,0.112,0.111,0.108,0.105,0.115,0.044,0.032,0.077,0.078,0.032,0.053,0.053,0.052,0.05,0.054,0.01,0.01,0.079,0.114,0.032,0.116,0.104,0.105,0.115,0.032,0.105,0.115,0.032,0.097,0.11,0.111,0.116,0.104,0.101,0.114,0.032,0.119,0.097,0.121,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.
12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.062,0.01,0.049,0.057,0.072,0.097,0.114,0.108,0.101,0.109,0.083,0.116,0.035,0.049,0.068,0.111,0.114,0.099,0.104,0.101,0.115,0.116,0.101,0.114,0.077,0.065,0.048,0.05,0.049,0.05,0.049,0.052,0.049,0.049,0.052,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.01,0.01,0.01,0.01,0.084,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.111,0.102,0.102,0.101,0.114,0.115,0.032,0.116,0.104,0.101,0.032,0.112,0.117,0.098,0.108,0.105,0.099,0.032,0.097,0.032,0.103,0.108,0.105,0.109,0.112,0.115,0.101,0.032,0.097,0.116,0.032,0.116,0.104,0.101,0.032,0.099,0.111,0.11,0.102,0.105,0.1,0.102,0.1,0.101,0.11,0.116,0.105,0.097,0.108,0.032,0.1,0.105,0.115,0.099,0.117,0.115,0.115,0.105,0.111,0.11,0.115,0.032,0.098,0.101,0.116,0.119,0.101,0.101,0.11,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.11,0.1,0.032,0.067,0.111,0.104,0.101,0.11,0.044,0.032,0.097,0.11,0.1,0.032,0.105,0.116,0.032,0.099,0.111,0.11,0.102,0.105,0.114,0.109,0.115,0.032,0.116,0.104,0.101,0.032,0.109,0.097,0.11,0.032,0.119,0.104,0.111,0.032,0.11,0.111,0.119,0.032,0.111,0.099,0.099,0.117,0.112,0.105,0.101,0.115,0.032,0.116,0.104,0.101,0.032,0.079,0.118,0.097,0.108,0.032,0.079,0.102,0.102,0.105,0.099,0.101,0.032,0.104,0.097,0.1,0.032,0.099,0.111,0.11,0.116,0.101,0.109,0.112,0.111,0.114,0.097,0.11,0.101,0.111,0.117,0.115,0.032,0.107,0.11,0.111,0.119,0.108,0.101,0.1,0.103,0.101,0.032,0.111,0.102,0.032,0.097,0.032,0.112,0.114,0.111,0.112,0.111,0.115,0.097,0.108,0.032,0.116,0.111,0.032,0.098,0.117,0.1,0.121,0.032,0.116,0
.104,0.101,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.115,0.116,0.111,0.114,0.121,0.032,0.111,0.102,0.032,0.075,0.097,0.114,0.101,0.11,0.032,0.077,0.099,0.068,0.111,0.117,0.103,0.097,0.108,0.044,0.032,0.097,0.032,0.119,0.111,0.109,0.097,0.11,0.032,0.119,0.104,0.111,0.032,0.104,0.097,0.115,0.032,0.097,0.108,0.108,0.101,0.103,0.101,0.1,0.032,0.115,0.104,0.101,0.032,0.104,0.097,0.1,0.032,0.097,0.11,0.032,0.101,0.12,0.116,0.114,0.097,0.109,0.097,0.114,0.105,0.116,0.097,0.108,0.032,0.097,0.102,0.102,0.097,0.105,0.114,0.032,0.119,0.105,0.116,0.104,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.098,0.111,0.117,0.116,0.032,0.097,0.032,0.1,0.101,0.099,0.097,0.1,0.101,0.032,0.097,0.103,0.111,0.046,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.116,0.111,0.108,0.1,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.098,0.111,0.117,0.116,0.032,0.104,0.105,0.115,0.032,0.112,0.108,0.097,0.11,0.115,0.032,0.116,0.111,0.032,0.115,0.101,0.116,0.032,0.117,0.112,0.032,0.097,0.032,0.099,0.111,0.109,0.112,0.097,0.11,0.121,0.032,0.097,0.11,0.1,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.101,0.032,0.116,0.104,0.101,0.032,0.112,0.117,0.114,0.099,0.104,0.097,0.1,0.115,0.101,0.032,0.111,0.102,0.032,0.116,0.104,0.101,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.032,0.102,0.114,0.111,0.109,0.032,0.065,0.109,0.101,0.114,0.105,0.099,0.097,0.11,0.032,0.077,0.101,0.1,0.105,0.097,0.044,0.032,0.119,0.104,0.105,0.099,0.104,0.032,0.112,0.117,0.098,0.108,0.105,0.115,0.104,0.101,0.115,0.032,0.116,0.104,0.101,0.032,0.078,0.097,0.116,0.105,0.111,0.11,0.097,0.108,0.032,0.069,0.11,0.113,0.117,0.105,0.114,0.101,0.114,0.046,0.032,0.084,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.099,0.097,0.112,0.116,0.117,0.114,0.101,0.115,0.032,0.119,0.104,0.097,0.116,0.032,0.097,0.112,0.112,0.101,0.097,0.114,0.115,0.032,0.116,0.111,0.032,0.098,0.101,0.032,0.097,0.032,0.114,0.111,0.117,0.116,0.105,0.11,0.101,0.032,0.098,0.117,0.115,0.105,0.11,0.101,0.115,0.115,0.032,0.099,0.111,0.11,0.118,0.101,0.114,0.115,0.097,0.116,0.105,0.111,0.11,0.032,0.111,0.102,0.032,0.115,0.101,0.118,0.101,0.114,0.097,0.108,0.032,0.109,0.097,0.116,0.116,0.101,0.114,0.115,0.032,0.111,0.11,0.032,0.116,0.104,0.101,0.105,0.114,0.032,0.097,0.103,0.101,0.11,0.1,0.097,0.046,0.032,0.084,0.104,0.101,0.032,0.097,0.117,0.1,0.105,0.111,0.032,0.105,0.115,0.032,0.109,0.117,0.1,0.1,0.108,0.101,0.1,0.032,0.097,0.11,0.1,0.032,0.116,0.104,0.101,0.032,0.109,0.101,0.097,0.11,0.105,0.11,0.103,0.032,0.111,0.102,0.032,0.084,0.114,0.117,0.109,0.112,0.039,0.115,0.032,0.117,0.115,0.101,0.032,0.111,0.102,0.032,0.116,0.104,0.101,0.032,0.119,0.111,0.114,0.1,0.032,0.034,0.034,0.099,0.097,0.102,0.115,0.104,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.105,0.115,0.032,0.1,0.105,0.115,0.112,0.117,0.116,0.101,0.1,0.032,0.098,0.121,0.032,0.116,0.104,0.101,0.032,0.116,0.119,0.111,0.032,0.115,0.105,0.1,0.101,0.115,0.046,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.073,0.032,0.11,0.101,0.101,0.1,0.032,0.116,0.111,0.032,0.111,0.112,0.101,0.11,0.032,0.117,0.112,0.032,0.097,0.032,0.099,0.111,0.109,0.112,0.097,0.11,0.121,0.032,0.102,0.111,0.114,0.032,0.116,0.104,0.101,0.032,0.116,0.114,0.097,0.11,0.115,0.102,0.101,0.114,0.032,0.111,0.102,0.032,0.097,0.108,0.108,0.032,0.111,0.102,0.032,0.116,0.104,0.097,0.116,0.032,0.105,0.11,0.102,0.111,0.032,0.114,0.101,0.103,0.097,0.114,0.1,0.105,0.11,0.103,0.032,0.111,0.117,0.114,0.032,0.102,0.114,0.105,0.101,0.11,0.1,0.032,0.068,0.097,0.118,0.105,0.1,0.044,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.067,0.
111,0.104,0.101,0.11,0.032,0.115,0.097,0.105,0.1,0.032,0.105,0.11,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.044,0.032,0.108,0.105,0.107,0.101,0.108,0.121,0.032,0.097,0.032,0.114,0.101,0.102,0.101,0.114,0.101,0.11,0.099,0.101,0.032,0.116,0.111,0.032,0.065,0.109,0.101,0.114,0.105,0.099,0.097,0.11,0.032,0.077,0.101,0.1,0.105,0.097,0.032,0.104,0.101,0.097,0.1,0.032,0.068,0.097,0.118,0.105,0.1,0.032,0.1,0.08,0.101,0.1,0.099,0.107,0.101,0.114,0.046,0.032,0.087,0.104,0.101,0.11,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.105,0.11,0.103,0.032,0.099,0.111,0.109,0.101,0.115,0.032,0.117,0.112,0.032,0.097,0.103,0.097,0.105,0.11,0.032,0.108,0.097,0.116,0.101,0.114,0.032,0.105,0.11,0.032,0.116,0.104,0.101,0.032,0.099,0.111,0.11,0.118,0.101,0.114,0.115,0.097,0.116,0.105,0.111,0.11,0.044,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.105,0.11,0.116,0.101,0.114,0.114,0.117,0.112,0.116,0.115,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.097,0.115,0.107,0.105,0.11,0.103,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.087,0.104,0.097,0.116,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.105,0.11,0.103,0.063,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.097,0.099,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.046,0.032,0.087,0.104,0.101,0.11,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.116,0.101,0.108,0.108,0.115,0.032,0.084,0.114,0.117,0.109,0.112,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.087,0.101,0.038,0.035,0.051,0.057,0.059,0.108,0.108,0.032,0.104,0.097,0.118,0.101,0.032,0.116,0.111,0.032,0.112,0.097,0.121,0.044,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.105,0.115,0.032,0.104,0.101,0.097,0.114,0.1,0.032,0.115,0.097,0.121,0.105,0.11,0.103,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.112,0.097,0.121,0.032,0.119,0.105,0.116,0.104,0.032,0.099,0.1,0.097,0.115,0.104,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.098,0.117,0.116,0.032,0.116,0.104,0.101,0.032,0.097,0.117,0.1,0.105,0.111,0.032,0.105,0.115,0.032,0.109,0.117,0.1,0.1,0.108,0.101,0.1,0.032,0.097,0.11,0.1,0.032,0.105,0.116,0.038,0.035,0.051,0.057,0.059,0.115,0.032,0.117,0.11,0.099,0.108,0.101,0.097,0.114,0.032,0.119,0.104,0.101,0.116,0.104,0.101,0.114,0.032,0.104,0.101,0.032,0.115,0.117,0.103,0.103,0.101,0.115,0.116,0.115,0.032,0.112,0.097,0.121,0.105,0.11,0.103,0.032,0.119,0.105,0.116,0.104,0.032,0.099,0.097,0.102,0.115,0.104,0.032,0.111,0.114,0.032,0.11,0.111,0.116,0.032,0.112,0.097,0.121,0.105,0.11,0.103,0.046,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.115,0.097,0.121,0.115,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.11,0.111,0.044,0.032,0.11,0.111,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.098,0.117,0.116,0.032,0.105,0.116,0.032,0.105,0.115,0.032,0.11,0.111,0.116,0.032,0.099,0.108,0.101,0.097,0.114,0.032,0.119,0.104,0.097,0.116,0.032,0.105,0.115,0.032,0.115,0.097,0.105,0.1,0.032,0.11,0.101,0.12,0.116,0.046,0.032,0.078,0.111,0.032,0.112,0.097,0.121,0.109,0.101,0.11,0.116,0.032,0.119,0.097,0.115,0.032,0.101,0.118,0.101,0.114,0.032,0.109,0.097,0.1,0.101,0.032,0.102,0.114,0.111,0.109,0.032,0.084,0.114,0.117,0.109,0.112,0.044,0.032,0.082,0.117,0.1,0.121,0.032,0.071,0.105,0.117,0.108,0.105,0.097,0.11,0.105,0.044,0.032,0.116,0.104,0.101,0.032,0.08,0.114,0.101,0.115,0.105,0.1,0.101,0.11,0.116,0.038,0.035,0.051,0.057,0.059,0.115,0.032,0.097,0.116,0.116,0.111,0.114,0.11,0.101,0.121,0.044,0.032,0.104,0.097,0.115,0.032,0.115,0.097,0.105,0.1,0.046,0.032,0.071,0.105,0.117,0.108,0.105,0.097,0.11,0.105
,0.032,0.104,0.097,0.115,0.032,0.112,0.114,0.101,0.118,0.105,0.111,0.117,0.115,0.108,0.121,0.032,0.097,0.099,0.107,0.11,0.111,0.119,0.108,0.101,0.1,0.103,0.101,0.1,0.032,0.116,0.104,0.097,0.116,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.101,0.1,0.032,0.1,0.105,0.115,0.099,0.117,0.115,0.115,0.105,0.111,0.11,0.032,0.114,0.101,0.108,0.097,0.116,0.101,0.1,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.098,0.117,0.1,0.102,0.121,0.105,0.11,0.103,0.032,0.116,0.104,0.101,0.032,0.115,0.116,0.111,0.114,0.121,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.046,0.01,0.01,0.01,0.01,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.01,0.01,0.01,0.01

Though the sizes might vary.

@robertleeplummerjr
Contributor

Can I get you to post sample data again? The link expired.

@robertleeplummerjr
Contributor

This line seems to be erroneous, as dataset seems to be a string, and NeuralNetwork and NeuralNetworkGPU only support arrays of numbers.

// Push the content into our trainingData
trainingData.push({
  input: content,
  output: dataset
});
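
Something along these lines is the shape the trainer expects (a sketch; it reuses the normalize() helper from earlier in the thread and the hash-of-numbers output style):

trainingData.push({
  input: normalize(content),  // array of numbers in [0, 1]
  output: { [dataset]: 1 }    // e.g. { spam: 1 } or { legit: 1 }
});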

@robertleeplummerjr
Contributor

I think I'm on to something, this is a key indicator:

    at new Float32Array (<anonymous>)
    at zeros (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\utilities\zeros.js:2:10)
    at NeuralNetworkGPU.initialize (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network.js:104:39)
    at NeuralNetworkGPU.initialize (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network-gpu.js:127:11)
    at NeuralNetworkGPU.verifyIsInitialized (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network.js:295:10)
    at NeuralNetworkGPU.prepTraining (F:\FinlayDaG33k\brainspam\node_modules\brain.js\src\neural-network-gpu.js:436:10)

The issue happens in what seems to be the initial setup of the app, which copies data into a format that should not leak memory, but ironically... that copy itself may leak memory. Still investigating.

@robertleeplummerjr
Contributor

If you start out in NeuralNetwork or NeuralNetworkGPU with Float32Arrays, it shouldn't need to copy them to new Float32Arrays on initial setup, just FYI.
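
On the input side that looks something like this (a sketch; the charCodeAt scaling comes from the snippet earlier in the thread):

// Build each input as a Float32Array up front so initial setup can use
// it directly instead of copying a plain array.
const input = Float32Array.from(content, c => c.charCodeAt(0) / 1000);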

@robertleeplummerjr
Contributor

I get nothing but stable nets when the network is running, so this begs the question: how much training data is there, in megabytes?

@FinlayDaG33k
Contributor Author

With the latest version on my master branch, the dataset is about 6.2MB for the spam dataset and 952KB for the legit dataset.
This is without the normalization.

With the latest version on my fix-memory-issue branch (which does the normalization), this increases to 23.1MB for spam and 4.34MB for legit.

The variables in this piece of code are not really intuitive, but content is the actual sample and dataset is just whether it's part of the spam or legit dataset.

trainingData.push({
  input: content,
  output: dataset
});

@FinlayDaG33k
Contributor Author

FinlayDaG33k commented Sep 5, 2019

What happens in my code:
I take some spam mail from Thunderbird and some legit mail and put those in datasets/raw/spam and datasets/raw/legit respectively.
After this, I run npm run prepare which fires up src/prepare.js.

prepare.js loops over each file in datasets/raw/spam and datasets/raw/legit, reads and parses the file content, hashes the body of the mail (a lazy way to get unique filenames) and saves the body (basically stripping away anything that is not the body of the mail) to datasets/prepared/<dataset>/<hash>.txt (where <hash> obviously is the hash I just talked about and <dataset> is the respective dataset).
Depending on whether it's the master branch (which doesn't convert each character to a float) or the fix-memory-issue branch (which does), the content of each "prepared" sample is either the raw "human-readable" version of the body or a huge amount of comma-separated floats (as seen in my previous reply).

After this is done, I run npm run train which calls upon the mighty src/train.js.
train.js loops over each file in the prepared datasets and creates a new object with input being the content of the sample (just as-is, no extra magic here) and output being the dataset it belongs to (e.g. spam or legit).
It then pushes this object into the trainingData array.

After loading all that up, it calls net.train() (net is defined as new brain.NeuralNetworkGPU()), passing the trainingData array as well as some settings (like the number of iterations).
This is where I can see my RAM slowly increase (followed by a massive increase) and, as a result, the script crash.

I hope I managed to clear up a bit what my code does :)
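
Condensed, that training step is roughly this (a sketch based on the description above, not the actual train.js; the settings are placeholders):

const brain = require('brain.js');

const net = new brain.NeuralNetworkGPU();
net.train(trainingData, {
  iterations: 20000, // placeholder; the real settings live in train.js
  log: true
});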

@robertleeplummerjr
Contributor

robertleeplummerjr commented Sep 5, 2019

How long is the longest email string?

@FinlayDaG33k
Contributor Author

After this normalization, the longest ones (guesstimated by just looking at the biggest samples) are 546KB (for spam) and 949KB (for legit).
For the sake of simplicity, I'll just go with the dataset that has this "normalization" applied (even though it results in a much bigger number of bytes) from now on.

@robertleeplummerjr
Contributor

That is the problem. You are creating a net with 95,282 inputs. Have you tried an LSTM network with this type of data?
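
A sketch of that route (brain.recurrent.LSTM consumes strings directly, so no fixed-size input vector is needed; the two training samples here are made up):

const brain = require('brain.js');

const net = new brain.recurrent.LSTM();
net.train([
  { input: 'WIN A FREE PRIZE NOW', output: 'spam' },
  { input: 'Meeting notes attached', output: 'legit' },
], { iterations: 100 });

console.log(net.run('free prize')); // ideally 'spam'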

@robertleeplummerjr
Contributor

To be clear, that'll be 95,282 inputs, a hidden layer of 47,641 neurons (and since hidden layers are 2d, that is 95,282 x 47,641 = 4,539,329,762 numbers, wow!) and 1 output.
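
(For scale: at 4 bytes per Float32 value, that single weight matrix alone comes to roughly 18GB, which lines up with the failed allocations even on a 24GiB machine.)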

@robertleeplummerjr
Contributor

They actually use brain (the older deprecated one) with a bag of words technique.

@robertleeplummerjr
Contributor

Since we know what is causing the net to grow so large, I'm going to go ahead and close. I'm interested in hearing how other techniques work out, however.

@FinlayDaG33k
Contributor Author

Oh wow, no wonder that happens then.
How did you come up with that number of inputs and hiddens?

I'm already looking into BOW along with "The Hashing Trick" and will report back when I'm ready with that.
In the meantime, it might be a nice addition to have something that can do BOW in the library itself.
I'm sure there are more people like me who want to do text classification.

@robertleeplummerjr
Contributor

I just counted the longest input you had (input neurons, or values). Hidden layers (when not defined) are calculated here, and it is a matrix dot product: the hidden layer matrix size is height (input size) times width (hidden layer size). The output layer I actually got wrong; it is 2, not 1.
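
For reference, the default sizing works out like this (a sketch; the lower bound is from memory, but the halving matches the numbers above):

// When hiddenLayers is not passed, brain.js picks half the input size
// (with a small lower bound) for the single hidden layer.
const inputSize = 95282;
const hiddenSize = Math.max(3, Math.floor(inputSize / 2)); // 47641
const hiddenWeights = inputSize * hiddenSize; // 4,539,329,762 values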

@FinlayDaG33k
Contributor Author

Ah, I see.
I hope I can get this BOW method to work "soon".

robertleeplummerjr added a commit that referenced this issue Sep 6, 2019
saw nearly 10% improvement in speed in debugging #434
@robertleeplummerjr
Contributor

Looking forward to it! Maybe we can include some of it in brain.js.

@robertleeplummerjr
Contributor

For the record, here is what I was using to test, which actually found a slight performance improvement for the older NeuralNetwork network:

const { NeuralNetworkGPU, NeuralNetwork } = require('./src');
const trainingInputSample = '0.065,0.066,0.067,0.032,0.077,0.069,0.078,0.083,0.032,0.072,0.069,0.065,0.076,0.084,0.072,0.032,0.082,0.069,0.08,0.079,0.082,0.084,0.01,0.01,0.01,0.01,0.074,0.117,0.108,0.121,0.044,0.032,0.05,0.053,0.116,0.104,0.032,0.05,0.048,0.049,0.056,0.01,0.069,0.118,0.101,0.11,0.105,0.11,0.103,0.032,0.082,0.101,0.112,0.111,0.114,0.116,0.032,0.045,0.032,0.072,0.101,0.097,0.108,0.116,0.104,0.121,0.032,0.077,0.101,0.11,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.089,0.111,0.117,0.032,0.109,0.105,0.103,0.104,0.116,0.032,0.098,0.101,0.032,0.051,0.048,0.043,0.032,0.105,0.11,0.032,0.097,0.103,0.101,0.032,0.098,0.117,0.116,0.032,0.116,0.104,0.101,0.114,0.101,0.032,0.105,0.115,0.032,0.11,0.111,0.032,0.114,0.101,0.097,0.115,0.111,0.11,0.032,0.121,0.111,0.117,0.032,0.097,0.114,0.101,0.11,0.116,0.032,0.119,0.097,0.107,0.105,0.11,0.103,0.032,0.117,0.112,0.032,0.119,0.105,0.116,0.104,0.032,0.111,0.114,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.116,0.111,0.032,0.115,0.108,0.101,0.101,0.112,0.032,0.119,0.105,0.116,0.104,0.032,0.097,0.032,0.115,0.116,0.105,0.102,0.102,0.045,0.101,0.114,0.101,0.099,0.116,0.105,0.111,0.11,0.046,0.01,0.01,0.01,0.01,0.073,0.116,0.032,0.103,0.101,0.116,0.115,0.032,0.121,0.111,0.117,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.102,0.097,0.115,0.116,0.032,0.097,0.11,0.1,0.032,0.107,0.101,0.101,0.112,0.115,0.032,0.103,0.111,0.105,0.11,0.103,0.032,0.097,0.108,0.108,0.032,0.11,0.105,0.103,0.104,0.116,0.032,0.108,0.111,0.11,0.103,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.076,0.111,0.111,0.107,0.032,0.11,0.111,0.119,0.032,0.116,0.111,0.032,0.115,0.101,0.101,0.032,0.116,0.104,0.105,0.115,0.032,0.102,0.111,0.114,0.117,0.109,0.117,0.108,0.097,0.032,0.111,0.11,0.032,0.115,0.104,0.097,0.114,0.107,0.032,0.116,0.097,0.11,0.107,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,
0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.062,0.01,0.01,0.01,0.01,0.01,0.075,0.101,0.118,0.105,0.11,0.032,0.097,0.11,0.1,0.032,0.077,0.097,0.114,0.107,0.058,0.032,0.089,0.111,0.117,0.032,0.097,0.11,0.1,0.032,0.121,0.111,0.117,0.114,0.032,0.119,0.105,0.102,0.101,0.032,0.119,0.105,0.108,0.108,0.032,0.108,0.111,0.118,0.101,0.032,0.116,0.104,0.105,0.115,0.032,0.115,0.116,0.117,0.102,0.102,0.032,0.045,0.032,0.105,0.116,0.115,0.032,0.117,0.11,0.114,0.101,0.097,0.108,0.032,0.097,0.11,0.1,0.032,0.119,0.111,0.114,0.107,0.115,0.032,0.101,0.118,0.101,0.114,0.121,0.116,0.105,0.109,0.101,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.076,0.104,0.101,0.106,0.105,0.121,0.113,0.097,0.12,0.101,0.047,0.113,0.106,0.118,0.121,0.114,0.106,0.056,0.048,0.054,0.053,0.105,0.104,0.099,0.111,0.1,0.099,0.105,0.12,0.109,0.047,0.081,0.07,0.053,0.083,0.065,0.105,0.112,0.113,0.095,0.111,0.104,0.1,0.045,0.067,0.111,0.052,0.122,0.086,0.087,0.078,0.113,0.115,0.083,0.076,0.071,0.116,0.09,0.121,0.098,0.074,0.081,0.055,0.097,0.115,0.1,0.117,0.084,0.111,0.09,0.066,0.073,0.107,0.077,0.047,0.085,0.088,0.121,0.067,0.086,0.073,0.107,0.076,0.086,0.075,0.076,0.112,0.097,0.089,0.112,0.108,0.081,0.07,0.069,0.104,0.113,0.089,0.055,0.122,0.048,0.09,0.066,0.081,0.112,0.087,0.052,0.053,0.084,0.111,0.12,0.119,0.106,0.075,0.11,0.051,0.079,0.085,0.052,0.081,0.09,0.12,0.067,0.072,0.089,0.103,0.119,0.09,0.052,0.108,0.088,0.055,0.065,0.072,0.105,0.073,0.121,0.077,0.048,0.045,0.121,0.101,0.102,0.05,0.069,0.08,0.12,0.082,0.076,0.081,0.114,0.073,0.053,0.07,0.099,0.075,0.077,0.12,0.057,0.117,0.113,0.087,0.12,0.067,0.05,0.049,0.087,0.077,0.
076,0.117,0.074,0.077,0.118,0.098,0.066,0.09,0.051,0.116,0.049,0.106,0.067,0.055,0.048,0.062,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.089,0.111,0.117,0.032,0.099,0.097,0.11,0.032,0.097,0.108,0.119,0.097,0.121,0.115,0.032,0.1,0.105,0.115,0.099,0.111,0.11,0.116,0.105,0.11,0.117,0.101,0.032,0.066,0.121,0.032,0.071,0.111,0.105,0.11,0.103,0.032,0.072,0.101,0.114,0.101,0.01,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.062,0.01,0.066,0.121,0.032,0.119,0.114,0.105,0.116,0.105,0.11,0.103,0.032,0.121,0.111,0.117,0.032,0.099,0.097,0.11,0.032,0.1,0.111,0.032,0.105,0.116,0.032,0.104,0.101,0.114,0.101,0.032,0.08,0.079,0.032,0.066,0.111,0.12,0.032,0.05,0.054,0.051,0.056,0.051,0.032,0.05,0.055,0.048,0.048,0.032,0.076,0.111,0.117,0.105,0.115,0.105,0.097,0.11,0.097,0.032,0.065,0.118,0.101,0.046,0.032,0.083,0.046,0.032,0.077,0.105,0.11,0.11,0.101,0.097,0.112,0.111,0.108,0.105,0.115,0.044,0.032,0.077,0.078,0.032,0.053,0.053,0.052,0.05,0.054,0.01,0.01,0.079,0.114,0.032,0.116,0.104,0.105,0.115,0.032,0.105,0.115,0.032,0.097,0.11,0.111,0.116,0.104,0.101,0.114,0.032,0.119,0.097,0.121,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.0
73,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.01,0.06,0.104,0.116,0.116,0.112,0.058,0.047,0.047,0.119,0.119,0.119,0.046,0.109,0.105,0.107,0.101,0.112,0.105,0.112,0.115,0.046,0.1,0.097,0.116,0.101,0.047,0.115,0.098,0.117,0.103,0.047,0.048,0.055,0.067,0.106,0.049,0.116,0.051,0.09,0.066,0.098,0.118,0.077,0.074,0.117,0.076,0.077,0.087,0.049,0.05,0.067,0.12,0.087,0.113,0.117,0.057,0.12,0.077,0.075,0.099,0.07,0.053,0.073,0.114,0.081,0.076,0.082,0.12,0.08,0.069,0.05,0.102,0.101,0.121,0.045,0.048,0.077,0.121,0.073,0.105,0.072,0.065,0.055,0.088,0.108,0.052,0.09,0.119,0.103,0.089,0.072,0.067,0.12,0.09,0.081,0.052,0.085,0.079,0.051,0.11,0.075,0.106,0.119,0.12,0.111,0.084,0.053,0.052,0.087,0.112,0.081,0.066,0.09,0.048,0.122,0.055,0.089,0.113,0.104,0.069,0.07,0.081,0.108,0.112,0.089,0.097,0.112,0.076,0.075,0.086,0.076,0.107,0.073,0.086,0.067,0.121,0.088,0.085,0.046,0.077,0.107,0.073,0.066,0.09,0.111,0.084,0.117,0.1,0.115,0.097,0.055,0.081,0.074,0.098,0.121,0.09,0.116,0.071,0.076,0.083,0.115,0.113,0.078,0.087,0.086,0.122,0.052,0.111,0.067,0.045,0.1,0.104,0.111,0.095,0.113,0.112,0.105,0.065,0.083,0.053,0.07,0.081,0.062,0.01,0.049,0.057,0.072,0.097,0.114,0.108,0.101,0.109,0.083,0.116,0.035,0.049,0.068,0.111,0.114,0.099,0.104,0.101,0.115,0.116,0.101,0.114,0.077,0.065,0.048,0.05,0.049,0.05,0.049,0.052,0.049,0.049,0.052,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.01,0.01,0.01,0.01,0.084,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.111,0.102,0.102,0.101,0.114,0.115,0.032,0.116,0.104,0.101,0.032,0.112,0.117,0.098,0.108,0.105,0.099,0.032,0.097,0.032,0.103,0.108,0.105,0.109,0.112,0.115,0.101,0.032,0.097,0.116,0.032,0.116,0.104,0.101,0.032,0.099,0.111,0.11,0.102,0.105,0.1,0.102,0.1,0.101,0.11,0.116,0.105,0.097,0.108,0.032,0.1,0.105,0.115,0.099,0.117,0.115,0.115,0.105,0.111,0.11,0.115,0.032,0.098,0.101,0.116,0.119,0.101,0.101,0.11,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.11,0.1,0.032,0.067,0.111,0.104,0.101,0.11,0.044,0.032,0.097,0.11,0.1,0.032,0.105,0.116,0.032,0.099,0.111,0.11,0.102,0.105,0.114,0.109,0.115,0.032,0.116,0.104,0.101,0.032,0.109,0.097,0.11,0.032,0.119,0.104,0.111,0.032,0.11,0.111,0.119,0.032,0.111,0.099,0.099,0.117,0.112,0.105,0.101,0.115,0.032,0.116,0.104,0.101,0.032,0.079,0.118,0.097,0.108,0.032,0.079,0.102,0.102,0.105,0.099,0.101,0.032,0.104,0.097,0.1,0.032,0.099,0.111,0.11,0.116,0.101,0.109,0.112,0.111,0.114,0.097,0.11,0.101,0.111,0.117,0.115,0.032,0.107,0.11,0.111,0.119,0.108,0.101,0.1,0.103,0.101,0.032,0.111,0.102,0.032,0.097,0.032,0.112,0.114,0.111,0.112,0.111,0.115,0.097,0.108,0.032,0.116,0.111,0.032,0.098,
0.117,0.1,0.121,0.032,0.116,0.104,0.101,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.115,0.116,0.111,0.114,0.121,0.032,0.111,0.102,0.032,0.075,0.097,0.114,0.101,0.11,0.032,0.077,0.099,0.068,0.111,0.117,0.103,0.097,0.108,0.044,0.032,0.097,0.032,0.119,0.111,0.109,0.097,0.11,0.032,0.119,0.104,0.111,0.032,0.104,0.097,0.115,0.032,0.097,0.108,0.108,0.101,0.103,0.101,0.1,0.032,0.115,0.104,0.101,0.032,0.104,0.097,0.1,0.032,0.097,0.11,0.032,0.101,0.12,0.116,0.114,0.097,0.109,0.097,0.114,0.105,0.116,0.097,0.108,0.032,0.097,0.102,0.102,0.097,0.105,0.114,0.032,0.119,0.105,0.116,0.104,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.098,0.111,0.117,0.116,0.032,0.097,0.032,0.1,0.101,0.099,0.097,0.1,0.101,0.032,0.097,0.103,0.111,0.046,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.116,0.111,0.108,0.1,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.097,0.098,0.111,0.117,0.116,0.032,0.104,0.105,0.115,0.032,0.112,0.108,0.097,0.11,0.115,0.032,0.116,0.111,0.032,0.115,0.101,0.116,0.032,0.117,0.112,0.032,0.097,0.032,0.099,0.111,0.109,0.112,0.097,0.11,0.121,0.032,0.097,0.11,0.1,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.101,0.032,0.116,0.104,0.101,0.032,0.112,0.117,0.114,0.099,0.104,0.097,0.1,0.115,0.101,0.032,0.111,0.102,0.032,0.116,0.104,0.101,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.032,0.102,0.114,0.111,0.109,0.032,0.065,0.109,0.101,0.114,0.105,0.099,0.097,0.11,0.032,0.077,0.101,0.1,0.105,0.097,0.044,0.032,0.119,0.104,0.105,0.099,0.104,0.032,0.112,0.117,0.098,0.108,0.105,0.115,0.104,0.101,0.115,0.032,0.116,0.104,0.101,0.032,0.078,0.097,0.116,0.105,0.111,0.11,0.097,0.108,0.032,0.069,0.11,0.113,0.117,0.105,0.114,0.101,0.114,0.046,0.032,0.084,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.099,0.097,0.112,0.116,0.117,0.114,0.101,0.115,0.032,0.119,0.104,0.097,0.116,0.032,0.097,0.112,0.112,0.101,0.097,0.114,0.115,0.032,0.116,0.111,0.032,0.098,0.101,0.032,0.097,0.032,0.114,0.111,0.117,0.116,0.105,0.11,0.101,0.032,0.098,0.117,0.115,0.105,0.11,0.101,0.115,0.115,0.032,0.099,0.111,0.11,0.118,0.101,0.114,0.115,0.097,0.116,0.105,0.111,0.11,0.032,0.111,0.102,0.032,0.115,0.101,0.118,0.101,0.114,0.097,0.108,0.032,0.109,0.097,0.116,0.116,0.101,0.114,0.115,0.032,0.111,0.11,0.032,0.116,0.104,0.101,0.105,0.114,0.032,0.097,0.103,0.101,0.11,0.1,0.097,0.046,0.032,0.084,0.104,0.101,0.032,0.097,0.117,0.1,0.105,0.111,0.032,0.105,0.115,0.032,0.109,0.117,0.1,0.1,0.108,0.101,0.1,0.032,0.097,0.11,0.1,0.032,0.116,0.104,0.101,0.032,0.109,0.101,0.097,0.11,0.105,0.11,0.103,0.032,0.111,0.102,0.032,0.084,0.114,0.117,0.109,0.112,0.039,0.115,0.032,0.117,0.115,0.101,0.032,0.111,0.102,0.032,0.116,0.104,0.101,0.032,0.119,0.111,0.114,0.1,0.032,0.034,0.034,0.099,0.097,0.102,0.115,0.104,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.105,0.115,0.032,0.1,0.105,0.115,0.112,0.117,0.116,0.101,0.1,0.032,0.098,0.121,0.032,0.116,0.104,0.101,0.032,0.116,0.119,0.111,0.032,0.115,0.105,0.1,0.101,0.115,0.046,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.073,0.032,0.11,0.101,0.101,0.1,0.032,0.116,0.111,0.032,0.111,0.112,0.101,0.11,0.032,0.117,0.112,0.032,0.097,0.032,0.099,0.111,0.109,0.112,0.097,0.11,0.121,0.032,0.102,0.111,0.114,0.032,0.116,0.104,0.101,0.032,0.116,0.114,0.097,0.11,0.115,0.102,0.101,0.114,0.032,0.111,0.102,0.032,0.097,0.108,0.108,0.032,0.111,0.102,0.032,0.116,0.104,0.097,0.116,0.032,0.105,0.11,0.102,0.111,0.032,0.114,0.101,0.103,0.097,0.114,0.1,0.105,0.11,0.103,0.032,0.111,0.117,0.114,0.032,0.102,0.114,0.105,0.101,0.11,0.1,0.032,0.068,0.097,0.118,0.105,0.1,0.044,0.038,0.113,0.117,0.1
11,0.116,0.059,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.115,0.097,0.105,0.1,0.032,0.105,0.11,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.044,0.032,0.108,0.105,0.107,0.101,0.108,0.121,0.032,0.097,0.032,0.114,0.101,0.102,0.101,0.114,0.101,0.11,0.099,0.101,0.032,0.116,0.111,0.032,0.065,0.109,0.101,0.114,0.105,0.099,0.097,0.11,0.032,0.077,0.101,0.1,0.105,0.097,0.032,0.104,0.101,0.097,0.1,0.032,0.068,0.097,0.118,0.105,0.1,0.032,0.1,0.08,0.101,0.1,0.099,0.107,0.101,0.114,0.046,0.032,0.087,0.104,0.101,0.11,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.105,0.11,0.103,0.032,0.099,0.111,0.109,0.101,0.115,0.032,0.117,0.112,0.032,0.097,0.103,0.097,0.105,0.11,0.032,0.108,0.097,0.116,0.101,0.114,0.032,0.105,0.11,0.032,0.116,0.104,0.101,0.032,0.099,0.111,0.11,0.118,0.101,0.114,0.115,0.097,0.116,0.105,0.111,0.11,0.044,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.105,0.11,0.116,0.101,0.114,0.114,0.117,0.112,0.116,0.115,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.097,0.115,0.107,0.105,0.11,0.103,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.087,0.104,0.097,0.116,0.032,0.102,0.105,0.11,0.097,0.11,0.099,0.105,0.11,0.103,0.063,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.097,0.099,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.105,0.11,0.103,0.046,0.032,0.087,0.104,0.101,0.11,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.116,0.101,0.108,0.108,0.115,0.032,0.084,0.114,0.117,0.109,0.112,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.087,0.101,0.038,0.035,0.051,0.057,0.059,0.108,0.108,0.032,0.104,0.097,0.118,0.101,0.032,0.116,0.111,0.032,0.112,0.097,0.121,0.044,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.084,0.114,0.117,0.109,0.112,0.032,0.105,0.115,0.032,0.104,0.101,0.097,0.114,0.1,0.032,0.115,0.097,0.121,0.105,0.11,0.103,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.112,0.097,0.121,0.032,0.119,0.105,0.116,0.104,0.032,0.099,0.1,0.097,0.115,0.104,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.098,0.117,0.116,0.032,0.116,0.104,0.101,0.032,0.097,0.117,0.1,0.105,0.111,0.032,0.105,0.115,0.032,0.109,0.117,0.1,0.1,0.108,0.101,0.1,0.032,0.097,0.11,0.1,0.032,0.105,0.116,0.038,0.035,0.051,0.057,0.059,0.115,0.032,0.117,0.11,0.099,0.108,0.101,0.097,0.114,0.032,0.119,0.104,0.101,0.116,0.104,0.101,0.114,0.032,0.104,0.101,0.032,0.115,0.117,0.103,0.103,0.101,0.115,0.116,0.115,0.032,0.112,0.097,0.121,0.105,0.11,0.103,0.032,0.119,0.105,0.116,0.104,0.032,0.099,0.097,0.102,0.115,0.104,0.032,0.111,0.114,0.032,0.11,0.111,0.116,0.032,0.112,0.097,0.121,0.105,0.11,0.103,0.046,0.032,0.067,0.111,0.104,0.101,0.11,0.032,0.115,0.097,0.121,0.115,0.044,0.032,0.038,0.113,0.117,0.111,0.116,0.059,0.11,0.111,0.044,0.032,0.11,0.111,0.038,0.113,0.117,0.111,0.116,0.059,0.032,0.098,0.117,0.116,0.032,0.105,0.116,0.032,0.105,0.115,0.032,0.11,0.111,0.116,0.032,0.099,0.108,0.101,0.097,0.114,0.032,0.119,0.104,0.097,0.116,0.032,0.105,0.115,0.032,0.115,0.097,0.105,0.1,0.032,0.11,0.101,0.12,0.116,0.046,0.032,0.078,0.111,0.032,0.112,0.097,0.121,0.109,0.101,0.11,0.116,0.032,0.119,0.097,0.115,0.032,0.101,0.118,0.101,0.114,0.032,0.109,0.097,0.1,0.101,0.032,0.102,0.114,0.111,0.109,0.032,0.084,0.114,0.117,0.109,0.112,0.044,0.032,0.082,0.117,0.1,0.121,0.032,0.071,0.105,0.117,0.108,0.105,0.097,0.11,0.105,0.044,0.032,0.116,0.104,0.101,0.032,0.08,0.114,0.101,0.115,0.105,0.1,0.101,0.11,0.116,0.038,0.035,0.051,0.057,0.059,0.115,0.032,0.097,0.116,0.116,0.111,0.114,0.11,0.101,0.121,0.044,0.032,0.104,0.097,0.115,0.032,0.115,0.097,0.105,0.1,0.046,0.032,0.071,0.105,0.117
,0.108,0.105,0.097,0.11,0.105,0.032,0.104,0.097,0.115,0.032,0.112,0.114,0.101,0.118,0.105,0.111,0.117,0.115,0.108,0.121,0.032,0.097,0.099,0.107,0.11,0.111,0.119,0.108,0.101,0.1,0.103,0.101,0.1,0.032,0.116,0.104,0.097,0.116,0.032,0.116,0.104,0.101,0.032,0.114,0.101,0.099,0.111,0.114,0.1,0.101,0.1,0.032,0.1,0.105,0.115,0.099,0.117,0.115,0.115,0.105,0.111,0.11,0.032,0.114,0.101,0.108,0.097,0.116,0.101,0.1,0.032,0.116,0.111,0.032,0.116,0.104,0.101,0.032,0.098,0.117,0.1,0.102,0.121,0.105,0.11,0.103,0.032,0.116,0.104,0.101,0.032,0.115,0.116,0.111,0.114,0.121,0.032,0.114,0.105,0.103,0.104,0.116,0.115,0.046,0.01,0.01,0.01,0.01,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.047,0.092,0.095,0.047,0.095,0.01,0.01,0.01,0.01'.split(',').map(v => v * 1);
const trainingData = [];
for (let i = 0; i < 1000; i++) {
  trainingData.push({
    input: shuffleArray(trainingInputSample),
    output: { [Math.random() > 0.51 ? 'spam' : 'legit']: [1] }
    // output: Math.random() > 0.51 ? 'spam' : 'legit'
  })
}
const net = new NeuralNetwork();
const result = net.train(trainingData, { iterations: 10, log: true, callbackPeriod: 1 });
console.log(net.run(shuffleArray(trainingInputSample)));

function shuffleArray(array) {
  array = array.slice(0);
  for (let i = array.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    const temp = array[i];
    array[i] = array[j];
    array[j] = temp;
  }
  return array;
}

@FinlayDaG33k
Contributor Author

I currently use this piece of code for creating a vector, based on this pseudo-code.
I do, however, have no real clue how feature hashing works or how to implement it yet, but I'm working my way through this guide.
Feel free to help out on this!

// Define some variables
const crypto = require('crypto');
let vectorDimensions = 5;
let vocabulary = ['the', 'quick', 'brown', 'fox'];

// Create a vector
let vector = new Array(vectorDimensions).fill(0);

// Loop over each word in our list
await asyncForEach(vocabulary, async (feature) => {
  // Reduce the sha512 hex digest to an integer bucket index
  // (this stands in for the undefined fhash helper; parseInt on a slice of the digest works)
  let h = parseInt(crypto.createHash('sha512').update(feature).digest('hex').slice(0, 12), 16);
  let idx = h % vectorDimensions;
  vector[idx] += 1;
});

Note that asyncForEach is a function I wrote myself because I'm a PHP dev and like to keep things synchronous (sorry if this pisses you off <3).
