🐞 Issue: Implement Tensor Flattening Operation
Description:
To support MLP (Multi-Layer Perceptron) models for image classification tasks like MNIST, we need a flattening operation that reshapes a multi-dimensional tensor (e.g., 1×28×28) into a 1D tensor (e.g., 784). This is a necessary preprocessing step before feeding data into fully connected (dense) layers.
Expected Functionality:
- Input tensor shape: (batchSize, channels, height, width)
- Output tensor shape: (batchSize, channels * height * width)
- Should support batch processing
- Integrate cleanly into existing model building API
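The shape transformation above can be sketched as follows. This is a minimal, illustrative reshape, assuming a hypothetical `Tensor` class holding a contiguous `FloatArray` plus a shape; the names (`Tensor`, `flatten`) are assumptions, not an existing API:

```kotlin
// Hypothetical Tensor: contiguous row-major data plus a shape.
data class Tensor(val data: FloatArray, val shape: IntArray)

// Collapses all dimensions after the batch dimension into one:
// (batch, c, h, w) -> (batch, c * h * w). Because the data is stored
// contiguously, only the shape metadata changes; no copy is needed.
fun flatten(input: Tensor): Tensor {
    require(input.shape.isNotEmpty()) { "Tensor must have at least one dimension" }
    val batch = input.shape[0]
    val flatSize = input.shape.drop(1).fold(1) { acc, d -> acc * d }
    return Tensor(input.data, intArrayOf(batch, flatSize))
}
```

For a (1, 1, 28, 28) input, `flatSize` evaluates to 1 * 28 * 28 = 784, giving an output shape of (1, 784).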
Use Case:
model {
    flatten() // Converts input shape [1, 28, 28] → [784]
    dense(units = 128, activation = ReLU)
    dense(units = 10, activation = Softmax)
}
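One way the `flatten()` call could hook into a Kotlin model-building DSL is sketched below. The builder records layers as they are declared; all class and function names here are assumptions for illustration, not the project's actual API:

```kotlin
// Illustrative model-builder DSL. ModelBuilder, Layer, and the model()
// entry point are hypothetical names, not an existing API.
sealed class Layer {
    object Flatten : Layer()
    data class Dense(val units: Int, val activation: String) : Layer()
}

class ModelBuilder {
    val layers = mutableListOf<Layer>()
    fun flatten() { layers.add(Layer.Flatten) }
    fun dense(units: Int, activation: String) { layers.add(Layer.Dense(units, activation)) }
}

// Type-safe builder: the lambda runs with ModelBuilder as its receiver,
// so flatten() and dense(...) resolve against the builder instance.
fun model(block: ModelBuilder.() -> Unit): ModelBuilder = ModelBuilder().apply(block)
```

A flatten layer carries no trainable parameters, so it only needs to transform shapes during the forward pass.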
References:
- PyTorch:
nn.Flatten()
- TensorFlow:
tf.keras.layers.Flatten()
Priority: Medium
Difficulty: Easy
Labels: feature, tensor, core-api