Turkish translations for eager notebooks. #604

Merged
merged 6 commits on Jun 10, 2019
Changes from 1 commit

custom training walkthrough translation

ilkaynazli committed May 19, 2019
commit 7ded27c0559933eda1c9b769fb46bead4b931a61
@@ -41,7 +41,7 @@
"id": "xh8WkEwWpnm7"
},
"source": [
"# Automatic differentiation and gradient tape"
"# Otomatik degisim ve egim banti"
]
},
{
@@ -71,7 +71,7 @@
"id": "vDJ4XzMqodTy"
},
"source": [
"In the previous tutorial we introduced `Tensor`s and operations on them. In this tutorial we will cover [automatic differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation), a key technique for optimizing machine learning models."
"Bir onceki egitim kitapciginda 'Tensor'lari ve onlar ustunde kullanabileceginiz operasyonlari tanittik. Bu kitapcikta ise makine ogrenmesi modellerinin eniyilenmesinde onemli bir teknik olan [otomatik degisimi](https://en.wikipedia.org/wiki/Automatic_differentiation) ogrenecegiz."
]
},
{
@@ -81,7 +81,7 @@
"id": "GQJysDM__Qb0"
},
"source": [
"## Setup\n"
"## Kurulum\n"
]
},
{
@@ -108,11 +108,11 @@
"id": "1CLWJl0QliB0"
},
"source": [
"## Gradient tapes\n",
"## Egim bantlari\n",
"\n",
"TensorFlow provides the [tf.GradientTape](https://www.tensorflow.org/api_docs/python/tf/GradientTape) API for automatic differentiation - computing the gradient of a computation with respect to its input variables. Tensorflow \"records\" all operations executed inside the context of a `tf.GradientTape` onto a \"tape\". Tensorflow then uses that tape and the gradients associated with each recorded operation to compute the gradients of a \"recorded\" computation using [reverse mode differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation).\n",
"TensorFlow'un [tf.GradientTape](https://www.tensorflow.org/api_docs/python/tf/GradientTape) API'si otomatik degisim yani girdi degiskenlerine bagli olarak hesaplanan egimin hesaplanisini hali hazirda bize saglar. Tensorflow `tf.GradientTape` kapsaminda yapilan butun operasyonlari bir \"tape(bant)\"e \"kaydeder\". Tensorflow daha sonra \"kaydedilmis\" egimleri, bu bant ve her bir kayitla iliskili egim verilerini [ters mod degisimi](https://en.wikipedia.org/wiki/Automatic_differentiation) kullanarak hesaplar.\n",
"\n",
"For example:"
"Ornegin:"
]
},
{
@@ -132,7 +132,7 @@
" y = tf.reduce_sum(x)\n",
" z = tf.multiply(y, y)\n",
"\n",
"# Derivative of z with respect to the original input tensor x\n",
"# Orjinal girdi tensoru x'e gore z'nin turevi\n",
"dz_dx = t.gradient(z, x)\n",
"for i in [0, 1]:\n",
" for j in [0, 1]:\n",
@@ -146,7 +146,7 @@
"id": "N4VlqKFzzGaC"
},
"source": [
"You can also request gradients of the output with respect to intermediate values computed during a \"recorded\" `tf.GradientTape` context."
"Ayrica \"kaydedilmis\" 'tf.GradientTape' kapsaminda hesaplanan ara degerlere gore ciktilari egimini de isteyebilirsiniz."
]
},
{
@@ -166,8 +166,7 @@
" y = tf.reduce_sum(x)\n",
" z = tf.multiply(y, y)\n",
"\n",
"# Use the tape to compute the derivative of z with respect to the\n",
"# intermediate value y.\n",
"# Banti kullanarak ara deger y'ye gore z'nin turevini hesaplayabiliriz.\n",
"dz_dy = t.gradient(z, y)\n",
"assert dz_dy.numpy() == 8.0"
]
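A quick sanity check on the number in this cell: with the same assumed setup (`x` a 2x2 tensor of ones watched by the tape), y = tf.reduce_sum(x) = 4 and z = y*y, so dz/dy = 2*y = 8.0, which is exactly what the assert verifies.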
@@ -179,7 +178,7 @@
"id": "ISkXuY7YzIcS"
},
"source": [
"By default, the resources held by a GradientTape are released as soon as GradientTape.gradient() method is called. To compute multiple gradients over the same computation, create a `persistent` gradient tape. This allows multiple calls to the `gradient()` method. as resources are released when the tape object is garbage collected. For example:"
"GradientTape.gradient() yontemini cagirdimizda GradientTape tarafindan tutulan kaynaklar serbest birakilir. Ayni degerleri kullanarak birden fazla egim hesaplamak istiyorsaniz 'persistent(kalici)' egim banti olusturmalisiniz. Bu sayede bant nesnesi cop toplayicisi tarafindan toplanip kaynaklar serbest birakildikca 'gradient()' yontemini bircok kere cagirmamiza izin verir. Ornegin:"
]
},
{
@@ -199,7 +198,7 @@
" z = y * y\n",
"dz_dx = t.gradient(z, x) # 108.0 (4*x^3 at x = 3)\n",
"dy_dx = t.gradient(y, x) # 6.0\n",
"del t # Drop the reference to the tape"
"del t # Referansi banta indirgeyelim"
]
},
{
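Again only the tail of the cell appears in the hunk. A minimal runnable sketch, assuming the scalar input `x = tf.constant(3.0)` from the notebook's context:

import tensorflow as tf

x = tf.constant(3.0)
with tf.GradientTape(persistent=True) as t:  # persistent=True allows repeated gradient() calls
    t.watch(x)
    y = x * x  # y = x^2
    z = y * y  # z = x^4
dz_dx = t.gradient(z, x)  # 108.0 (4*x^3 at x = 3)
dy_dx = t.gradient(y, x)  # 6.0   (2*x at x = 3)
del t  # drop the reference so the tape's resources can be released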
@@ -209,9 +208,9 @@
"id": "6kADybtQzYj4"
},
"source": [
"### Recording control flow\n",
"### Kontrol akimini kaydedelim\n",
"\n",
"Because tapes record operations as they are executed, Python control flow (using `if`s and `while`s for example) is naturally handled:"
"Bantlar operasyonlar yurutuldukce kaydettigi icin, Python kontrol akimlari (`if`ler ve `while`lar gibi) dogal olarak islenir:"
]
},
{
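The control-flow code cell itself is unchanged by this commit, so it does not appear in the diff. For reference, a sketch of the kind of example the text describes (the helper names `f` and `grad` and the concrete input are illustrative assumptions, not part of this commit):

import tensorflow as tf

def f(x, y):
    output = 1.0
    for i in range(y):        # a plain Python loop, recorded as it executes
        if i > 1 and i < 5:   # a plain Python conditional, handled naturally
            output = tf.multiply(output, x)
    return output

def grad(x, y):
    with tf.GradientTape() as t:
        t.watch(x)
        out = f(x, y)
    return t.gradient(out, x)

x = tf.convert_to_tensor(2.0)
# The loop multiplies x in for i = 2, 3, 4, so f(x, 6) = x^3 and df/dx = 3*x^2 = 12.
assert grad(x, 6).numpy() == 12.0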
@@ -251,9 +250,9 @@
"id": "DK05KXrAAld3"
},
"source": [
"### Higher-order gradients\n",
"### Yuksek-sirali egimler\n",
"\n",
"Operations inside of the `GradientTape` context manager are recorded for automatic differentiation. If gradients are computed in that context, then the gradient computation is recorded as well. As a result, the exact same API works for higher-order gradients as well. For example:"
"`GradientTape` kapsam yoneticisindeki operasyonlar otomatik degisim icin kaydedilir. Eger egimler bu kapsamda hesaplandiysa onlar da ayni sekilde kaydedilir. Sonuc olarak, ayni API'yi kullanarak yuksek-sirali egimleri hesaplayabiliriz. Ornegin:"
]
},
{
@@ -266,13 +265,13 @@
},
"outputs": [],
"source": [
"x = tf.Variable(1.0) # Create a Tensorflow variable initialized to 1.0\n",
"x = tf.Variable(1.0) # 1.0 degerine ilklenmis bir Tensorflow degiskeni olusturalim\n",
"\n",
"with tf.GradientTape() as t:\n",
" with tf.GradientTape() as t2:\n",
" y = x * x * x\n",
" # Compute the gradient inside the 't' context manager\n",
" # which means the gradient computation is differentiable as well.\n",
" # 't' kapsam yoneticisi icerisinde egimi hesaplayalim\n",
" # ki bu egim hesaplanmasinin turevlenebilir oldugu anlamina gelir.\n",
" dy_dx = t2.gradient(y, x)\n",
"d2y_dx2 = t.gradient(dy_dx, x)\n",
"\n",
@@ -287,9 +286,9 @@
"id": "4U1KKzUpNl58"
},
"source": [
"## Next Steps\n",
"## Bir sonraki adimlar\n",
"\n",
"In this tutorial we covered gradient computation in TensorFlow. With that we have enough of the primitives required to build and train neural networks."
"Bu kitapcikta egim hesaplanmasinin TensorFlow'da nasil yapildigini gorduk. Simdi sinir agimizi olusturmak ve egitmek icin gerekli ilkellerin hepsine sahibiz."
]
}
],