Swift for TensorFlow Deep Learning Library

Get a taste of protocol-oriented differentiable programming.

This repository hosts Swift for TensorFlow's deep learning library, available both as a part of Swift for TensorFlow toolchains and as a Swift package.

Usage

This library is automatically integrated into Swift for TensorFlow toolchains. You do not need to add it as a Swift Package Manager dependency.
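
If you do want to depend on it directly as a Swift package, a manifest along the following lines should work. This is only a sketch: the repository URL, branch, and product name ("DeepLearning", matching the Sources/DeepLearning module) are assumptions, so check this repository's Package.swift before relying on them.

// swift-tools-version:4.2
// Hypothetical manifest for consuming this library as a Swift package.
// The URL, branch, and product name are placeholders; none of this is
// needed when using a Swift for TensorFlow toolchain.
import PackageDescription

let package = Package(
    name: "MyModel",
    dependencies: [
        .package(url: "https://github.com/tensorflow/swift-apis.git", .branch("master")),
    ],
    targets: [
        .target(name: "MyModel", dependencies: ["DeepLearning"]),
    ]
)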

Use Google Colaboratory

Open an empty Colaboratory notebook now to try out Swift, TensorFlow, differentiable programming, and deep learning.

For detailed usage and troubleshooting, see Usage on the Swift for TensorFlow project homepage.

Define a model

Simply import TensorFlow to get the full power of TensorFlow.

import TensorFlow

let hiddenSize: Int = 10

/// A simple three-layer classifier: 4 input features → hiddenSize → hiddenSize → 3 output classes.
struct Model: Layer {
    var layer1 = Dense<Float>(inputSize: 4, outputSize: hiddenSize, activation: relu)
    var layer2 = Dense<Float>(inputSize: hiddenSize, outputSize: hiddenSize, activation: relu)
    var layer3 = Dense<Float>(inputSize: hiddenSize, outputSize: 3, activation: identity)

    /// Returns the logits produced by threading `input` through the three dense layers.
    @differentiable
    func applied(to input: Tensor<Float>, in context: Context) -> Tensor<Float> {
        return input.sequenced(in: context, through: layer1, layer2, layer3)
    }
}
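
As a quick sanity check, you can instantiate the model and apply it to a dummy batch to confirm the output shape. The snippet below is illustrative only and assumes Tensor's ones initializer and an inference learning phase on Context.

// Hypothetical smoke test: a batch of 2 examples with 4 features each
// should yield logits of shape [2, 3].
let previewModel = Model()
let previewContext = Context(learningPhase: .inference)
let previewInput = Tensor<Float>(ones: [2, 4])
let previewLogits = previewModel.applied(to: previewInput, in: previewContext)
print(previewLogits.shape)  // Expected: [2, 3]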

Initialize a model and an optimizer

let optimizer = SGD<Model, Float>(learningRate: 0.02)
var classifier = Model()
let context = Context(learningPhase: .training)
let x: Tensor<Float> = ...
let y: Tensor<Int32> = ...
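
For example, since the model takes 4 input features and produces 3 class logits, x should have shape [batchSize, 4] and y should hold one class index (0, 1, or 2) per example. The values below are made-up placeholder data standing in for the ... above.

// Hypothetical toy data: 3 examples with 4 features each, plus their class labels.
let x: Tensor<Float> = [[6.4, 2.8, 5.6, 2.2],
                        [5.0, 2.3, 3.3, 1.0],
                        [4.9, 3.1, 1.5, 0.1]]
let y: Tensor<Int32> = [2, 1, 0]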

Run a training loop

One way to define a training epoch is to use the Differentiable.gradient(in:) method.

for _ in 0..<1000 {
    let 𝛁model = classifier.gradient { classifier -> Tensor<Float> in
        let ŷ = classifier.applied(to: x, in: context)
        let loss = softmaxCrossEntropy(logits: ŷ, labels: y)
        print("Loss: \(loss)")
        return loss
    }
    optimizer.update(&classifier.allDifferentiableVariables, along: 𝛁model)
}

Another way is to use methods on Differentiable or Layer that produce a backpropagation function, which lets you compose your derivative computations with great flexibility.

for _ in 0..<1000 {
    let (ŷ, backprop) = classifier.appliedForBackpropagation(to: x, in: context)
    let (loss, 𝛁ŷ) = ŷ.valueWithGradient { ŷ in softmaxCrossEntropy(logits: ŷ, labels: y) }
    print("Model output: \(ŷ), Loss: \(loss)")
    let 𝛁model = backprop(𝛁ŷ)
    optimizer.update(&classifier.allDifferentiableVariables, along: 𝛁model)
}
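
After training, you can switch the context to the inference phase and read off each example's predicted class with an argmax over the logits. This evaluation snippet is a sketch that reuses the toy x and y above and assumes an argmax(squeezingAxis:) method on Tensor.

// Hypothetical evaluation: run the trained classifier in the inference phase
// and compare predicted class indices against the labels.
let evalContext = Context(learningPhase: .inference)
let testLogits = classifier.applied(to: x, in: evalContext)
let predictedClasses = testLogits.argmax(squeezingAxis: 1)
print("Predicted: \(predictedClasses), actual: \(y)")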

For more tutorials and models, go to tensorflow/swift-tutorials and tensorflow/swift-models.

Development

Requirements

A Swift for TensorFlow toolchain is required to build and test this library.

Building and testing

$ swift build
$ swift test

Bugs

Please report bugs and feature requests using GitHub issues in this repository.

Community

Discussion about Swift for TensorFlow happens on the swift@tensorflow.org mailing list.

Contributing

We welcome contributions: please read the Contributor Guide to get started. It's always a good idea to discuss your plans on the mailing list before making any major submissions.

Code of Conduct

In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.

The Swift for TensorFlow community is guided by our Code of Conduct, which we encourage everybody to read before participating.
