Trying to dynamically change weighted connections between training examples #40576
Labels: comp:keras, stat:awaiting tensorflower, type:feature
Describe the feature and the current behavior/state.
I would like to be able to change the routing of the weighted connections between training examples. I have developed the theory for a type of neural network that takes as input a random vector of ints between 0 and 10 and maps it to an image. It then takes these ints and, after adding a multiple of 10 as an offset for each dimension, uses the result as the indices in tf.gather with the default weight matrix, which is of shape (10, 200). The per-dimension offsets are needed because the first hidden layer will have 200 neurons, and we want each possible number in each input dimension to come with its own weighted connections. So between examples we will be loading different weighted connections into the layer's weight matrix, dependent on the input.
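The offset-and-gather step above might be sketched as follows. This is a minimal NumPy illustration of the indexing logic only: the dimension count `D` and the example input are assumptions, and the fancy-indexing `table[idx]` plays the role of `tf.gather(table, idx)`.

```python
import numpy as np

D, V, H = 5, 10, 200   # assumed: 5 input dims, values 0..9, 200 hidden neurons

# One row of H weighted connections per (dimension, value) pair: shape (D*V, H).
table = np.random.randn(D * V, H)

x = np.array([3, 7, 0, 3, 9])     # example input vector of ints
offsets = np.arange(D) * V        # per-dimension offsets: 0, 10, 20, ...
idx = x + offsets                 # each dimension indexes its own block of rows
rows = table[idx]                 # NumPy analogue of tf.gather(table, idx): (D, H)
hidden = rows.sum(axis=0)         # (H,) combined first-hidden-layer activation
```

Because each dimension adds a different offset, a 3 in dimension 0 and a 3 in dimension 3 select rows from different blocks of the table.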
If two dimensions happen to have the same number, they will both load the same connections in their respective dimensions. On top of this, to make each entry in the dimension unique, we employ a circulant routing: the first neuron enters its entries in root position, the next one rotates them by one, the one after that rotates them by two, and so on. So a 2 in the nth position will have a different effect on the next layer than a 2 in the (n+1)th position.
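The circulant routing could be sketched like this (a toy NumPy sketch; the width `H` and the number of positions are assumptions made for illustration):

```python
import numpy as np

H = 8                                   # toy connection-vector width (assumed)
base = np.arange(H, dtype=float)        # one bank of weighted connections

# Position 0 uses the bank in root position, position 1 rotates it by one,
# position 2 by two, etc., so the same value in different positions has a
# different effect on the next layer.
rotated = np.stack([np.roll(base, n) for n in range(4)])
```

Each row of `rotated` is the same bank of connections, cyclically shifted by its position index.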
Additionally, once the input layer is arranged dynamically, we mirror the form within the other layers that inherit from the input layer. So if there is a 3 in the nth position of the first layer, the next layer will have the "3" weighted connections in its nth position. Note that these are not the same weighted connections the first layer had, but a corresponding vector of weighted connections for the next layer. (We could not use the input into that layer to order its connections, because tf.gather requires indices that are ints.)
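One way to express the mirroring is to reuse the same integer indices against a second table that holds the next layer's corresponding connection banks. This is a sketch under assumed shapes; `table1` and `table2` are hypothetical per-layer tables, not part of any existing Keras API.

```python
import numpy as np

D, V = 4, 10
H1, H2 = 6, 3                         # assumed layer widths for illustration

table1 = np.random.randn(D * V, H1)       # connection banks for the first layer
table2 = np.random.randn(D * V, H2 * H1)  # corresponding banks for the next layer

x = np.array([3, 1, 3, 9])
idx = x + np.arange(D) * V            # the same int indices are reused per layer
w1 = table1[idx]                      # (D, H1) first-layer connections
w2 = table2[idx]                      # (D, H2*H1): not the same weights, but the
                                      # corresponding banks for the next layer
```

The deeper layer's weights are selected by the original integer input, so every layer's arrangement mirrors the input layer's.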
I would also like to somehow do this for convolutional layers too, if the logic permits.
Here is a link to a Colab notebook with code that attempts to do all of this.
Will this change the current api? How?
I am not sure whether it will, because I have been trying to implement it without success.
Who will benefit with this feature?
This is a new generative algorithm that would benefit the AI community in general.
Any Other info.
https://colab.research.google.com/drive/1mWAQH1jJMFJ0NTKrqLxzet47aUXm9wWz?usp=sharing