Commit f5d34f4

astorfi committed Oct 31, 2017
Merge commit with 2 parents: 2829d65 + 0d0930d
Showing 5 changed files with 73 additions and 72 deletions.
6 changes: 3 additions & 3 deletions .travis.yml
@@ -9,8 +9,11 @@ python:
 install:
 - pip install -r requirements.txt
 - pip install coveralls
+- pip install flake8

 script:
+# stop the build if there are Python syntax errors or undefined names
+- flake8 . --count --select=E901,E999,F821,F822,F823 --show-source --statistics

 - coverage run --omit=*.virtualenvs*,*virtualenv* codes/0-welcome/code/0-welcome.py test
 - coverage run --omit=*.virtualenvs*,*virtualenv* codes/1-basics/basic_math_operations/code/basic_math_operation.py test
@@ -20,9 +23,6 @@ script:
 - coverage run --omit=*.virtualenvs*,*virtualenv* codes/2-basics_in_machine_learning/logistic_regression/code/logistic_regression.py test
 - coverage run --omit=*.virtualenvs*,*virtualenv* codes/2-basics_in_machine_learning/multiclass_svm/code/multiclass_svm.py test

-
-
-
 after_success:
   coveralls
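
The new flake8 step is deliberately narrow. The --select=E901,E999,F821,F822,F823 list restricts it to outright failures: E901/E999 flag syntax and compile errors, while F821, F822, and F823 flag undefined names, undefined names in __all__, and local variables referenced before assignment. Ordinary style warnings therefore cannot break the build.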


Large diffs are not rendered by default.

@@ -1,5 +1,4 @@
 import numpy as np
-import matplotlib.pyplot as plt
 import tensorflow as tf
 import xlrd
 import matplotlib.pyplot as plt
@@ -16,8 +15,7 @@
 #######################
 ## Defining flags #####
 #######################
-tf.app.flags.DEFINE_integer(
-    'num_epochs', 5, 'The number of epochs for training the model. Default=50')
+tf.app.flags.DEFINE_integer('num_epochs', 50, 'The number of epochs for training the model. Default=50')
 # Store all elements in FLAG structure!
 FLAGS = tf.app.flags.FLAGS

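This change also brings the flag's actual default in line with its help string, which already read 'Default=50'.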
@@ -58,7 +56,7 @@ def loss(X, Y):

     # Making the prediction.
     Y_predicted = inference(X)
-    return tf.squared_difference(Y, Y_predicted)
+    return tf.reduce_sum(tf.squared_difference(Y, Y_predicted))/(2*data.shape[0])


 # The training function.
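
The rewritten loss returns the half-mean-squared-error over the whole dataset instead of an element-wise squared difference, so it reduces to a single scalar. A minimal NumPy sketch of the same quantity, using illustrative names (y, y_hat) rather than the repo's own:

import numpy as np

def half_mse(y, y_hat):
    # Sum of squared residuals over all N samples, divided by 2N:
    # the same value as tf.reduce_sum(tf.squared_difference(Y, Y_predicted)) / (2*N).
    return np.sum((y - y_hat) ** 2) / (2.0 * len(y))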
@@ -81,11 +79,8 @@ def train(loss):

 # Step 8: train the model
 for epoch_num in range(FLAGS.num_epochs): # run 100 epochs
-    for x, y in data:
-        train_op = train(train_loss)
-
-        # Session runs train_op to minimize loss
-        loss_value,_ = sess.run([train_loss,train_op], feed_dict={X: x, Y: y})
+    loss_value, _ = sess.run([train_loss,train_op],
+                             feed_dict={X: data[:,0], Y: data[:,1]})

     # Displaying the loss per epoch.
     print('epoch %d, loss=%f' %(epoch_num+1, loss_value))
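
Training now performs a single full-batch gradient step per epoch, feeding the entire x and y columns of data at once, rather than one step per individual sample. It also stops calling train(train_loss) inside the loop, which had been re-creating the training op on every sample. Note the '# run 100 epochs' comment is stale either way: the loop runs FLAGS.num_epochs epochs, now 50.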
@@ -221,7 +221,7 @@
 test_summary_writer = tf.summary.FileWriter(test_summary_dir)
 test_summary_writer.add_graph(sess.graph)

-# If fie-tuning flag in 'True' the model will be restored.
+# If fine-tuning flag in 'True' the model will be restored.
 if FLAGS.fine_tuning:
     saver.restore(sess, os.path.join(FLAGS.checkpoint_root, checkpoint_prefix))
     print("Model restored for fine-tuning...")
24 changes: 1 addition & 23 deletions docs/tutorials/0-welcome/README.rst
@@ -80,27 +80,5 @@ The ``session``, which is the environment for running the operations, is executed
     sess.close()

 The ``tf.summary.FileWriter`` is defined to write the summaries into ``event files``. The ``sess.run()`` command must be used to evaluate any ``Tensor``; otherwise the operation won't be executed. Finally, ``writer.close()`` closes the summary writer. (A minimal sketch of this pattern follows the diff.)

---------
-Results
---------
-
-The results for running in the terminal is as bellow:
-
-.. code:: shell
-
-    a = 5.0
-    b = 10.0
-    a + b = 15.0
-    a/b = 0.5
-
-If we run the Tensorboard using ``tensorboard --logdir="absolute/path/to/log_dir"`` we get the following when visualiaing the ``Graph``:
-
-.. figure:: https://github.com/astorfi/TensorFlow-World/blob/master/docs/_img/0-welcome/graph-run.png
-   :scale: 30 %
-   :align: center
-
-   **Figure 1:** The TensorFlow Graph.
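For reference, here is the pattern that surviving paragraph describes (build the ops, evaluate them inside ``sess.run()``, let a ``FileWriter`` serialize the graph) as a minimal TensorFlow 1.x sketch; the constants mirror the tutorial's example and the ``log_dir`` path is illustrative:

.. code:: python

    import tensorflow as tf

    a = tf.constant(5.0, name='a')
    b = tf.constant(10.0, name='b')
    c = tf.add(a, b, name='add')

    with tf.Session() as sess:
        # The FileWriter serializes the graph into event files for TensorBoard.
        writer = tf.summary.FileWriter('log_dir', sess.graph)
        print(sess.run(c))  # ops only execute inside sess.run()
        writer.close()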
