Description
What is wrong?
Hidden layers of differing sizes cause the RNN net to throw an exception.
Where does it happen?
Internally.
How do we replicate the issue?
This doesn't work (note the differing hidden layer sizes, [5, 4, 3]):
const brain = require('./dist/index').default;

const inputSize = 2;

// Build a small set of random input rows to train on.
const data = [];
for (let i = 0; i < 5; i++) {
  const e = (new Array(inputSize).fill(0)).map(x => Math.random());
  data.push(e);
}

const networkOptions = {
  learningRate: 0.001,
  decayRate: 0.75,
  inputSize: inputSize,
  hiddenLayers: [5, 4, 3], // differing sizes trigger the exception
  outputSize: inputSize
};

const net = new brain.recurrent.LSTMTimeStep(networkOptions);
console.log('starting');
net.train([data], { log: true, logPeriod: 1 });
console.log('all done');
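For comparison, the same script appears to train without throwing when every hidden layer has the same size. Below is a minimal sketch of that variation; the uniform [10, 10] sizes and the uniformOptions / uniformNet names are illustrative only, and everything else reuses brain, inputSize, and data from the repro above.

// Same setup, but with uniform hidden layer sizes (assumed workaround, not a fix).
const uniformOptions = {
  learningRate: 0.001,
  decayRate: 0.75,
  inputSize: inputSize,
  hiddenLayers: [10, 10], // uniform sizes
  outputSize: inputSize
};

const uniformNet = new brain.recurrent.LSTMTimeStep(uniformOptions);
uniformNet.train([data], { log: true, logPeriod: 1 }); // trains without the exception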
How important is this (1-5)?
5
Expected behavior (i.e. solution)
Other Comments
