
JA: add site/ja/tutorials/text/text_generation.ipynb #1190

Conversation

masa-ita (Contributor) commented Nov 15, 2019

No description provided.

masa-ita added 9 commits Oct 24, 2019
…ub.com/masa-ita/tf-docs into site_ja_tutorials_text_text_generation
tfdocsbot (Collaborator) commented Nov 15, 2019

Preview and run these notebook edits with Google Colab:

Notebook diffs available on ReviewNB.com.
tfdocsbot (Collaborator) commented Nov 15, 2019

Reviewers added, please take a look.
@ohtaman, @sfujiwara, @AseiSugiyama, @yukinagae, @nuka137, @chie8842, @kiszk

When your review is finished, approve the pull request or include "LGTM" in your comment.

@googlebot googlebot added the cla: yes label Nov 15, 2019
"id": "BwpJ5IffzRG6"
},
"source": [
"このチュートリアルでは、文字ベースの RNN を使ってテキストを生成する方法を示します。ここでは、Andrej Karpathy の [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) からのシェイクスピア作品のデータセットを使います。このデータからの文字列(\"Shakespear\")を入力にして、文字列中の次の文字(\"e\")を予測するモデルを訓練します。このモデルを繰り返し呼び出すことで、より長い文字列を生成することができます。\n",

yukinagae (Contributor) commented Nov 18, 2019

[suggestion] If I were translating the sentence Given a sequence of characters from this data ("Shakespear"), the version below seems better to me:

このデータからの文字列(\"Shakespear\")を入力にして -> このデータ(\"Shakespear\")からの文字列を入力にして

masa-ita (Author) commented Nov 19, 2019

I think the string "Shakespear" is the first part of "Shakespeare": it is fed to the model so that it predicts the final "e".

yukinagae (Contributor) commented Nov 19, 2019

Oh, I got your point now! You're right :)
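To make the point the thread settles on concrete: the tutorial builds each training pair by shifting a chunk of text one character, so "Shakespear" (the input) lines up with the trailing "e" (the final target character). A minimal plain-Python sketch of that split, mirroring the tutorial's `split_input_target` helper (no TensorFlow needed):

```python
def split_input_target(chunk):
    # The input drops the last character and the target drops the first,
    # so position i of the input predicts position i of the target.
    return chunk[:-1], chunk[1:]

input_text, target_text = split_input_target("Shakespeare")
print(input_text)   # Shakespear
print(target_text)  # hakespeare
```

Each input position therefore has a next-character label, which is what lets the model be called repeatedly at inference time to grow a longer string.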

"were she such a case as fills m\n",
"</pre>\n",
"\n",
"いくつかは文法にあったものがある一方で、ほとんどは意味をなしていません。このモデルは、単語の意味を学習していませんが、次のことを考えてみてください。\n",

yukinagae (Contributor) commented Nov 18, 2019

This paragraph is nice :)

masa-ita (Author) commented Nov 19, 2019

Thanks!

"colab": {}
},
"source": [
"# 読み込んだのち、Python 2 との互換性のためにデコード\n",

yukinagae (Contributor) commented Nov 18, 2019

This translation is pretty good!

  • Original: Read, then decode for py2 compat. <- unnecessary abbreviation...

masa-ita (Author) commented Nov 19, 2019

Thank you.
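For readers outside the notebook: the cell being discussed reads the dataset file as raw bytes and decodes it explicitly, which yields a unicode string on both Python 2 and Python 3. A self-contained sketch of the same pattern (using a tiny stand-in file; the tutorial itself downloads shakespeare.txt with tf.keras.utils.get_file):

```python
import os
import tempfile

# A tiny stand-in file so the snippet is self-contained.
with tempfile.NamedTemporaryFile(delete=False, suffix=".txt") as f:
    f.write("First Citizen:\nBefore we proceed any further, hear me speak.\n".encode("utf-8"))
    path = f.name

# Read the raw bytes, then decode explicitly -- this is what
# "decode for py2 compat" buys: unicode text on Python 2 and 3 alike.
text = open(path, "rb").read().decode(encoding="utf-8")
print(text[:13])  # First Citizen

os.remove(path)
```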

"id": "bbmsf23Bymwe"
},
"source": [
"### 推論タスク"

yukinagae (Contributor) commented Nov 18, 2019

[weak suggestion] 推論タスク -> 予測タスク

masa-ita (Author) commented Nov 19, 2019

Thank you. Fixed.

"BATCH_SIZE = 64\n",
"\n",
"# データセットをシャッフルするためのバッファサイズ\n",
"# (TD data は可能性として無限長のシーケンスでも使えるように設計されています。\n",

yukinagae (Contributor) commented Nov 18, 2019

[typo] TD -> TF

masa-ita (Author) commented Nov 19, 2019

Oops. Fixed.
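The comment being corrected explains why tf.data shuffles with a fixed-size buffer rather than loading the whole stream: the pipeline is designed to work even on potentially infinite sequences. A plain-Python sketch of the buffered-shuffle idea (an illustration of the concept only, not tf.data's actual implementation):

```python
import random

def buffered_shuffle(stream, buffer_size, seed=0):
    """Yield the stream's items in approximately random order using a
    fixed-size buffer, so even an unbounded stream needs bounded memory."""
    rng = random.Random(seed)
    buf = []
    for item in stream:
        buf.append(item)
        if len(buf) >= buffer_size:
            # Emit a random element from the buffer, making room for the next.
            yield buf.pop(rng.randrange(len(buf)))
    while buf:  # Drain whatever remains when the stream ends.
        yield buf.pop(rng.randrange(len(buf)))

shuffled = list(buffered_shuffle(range(10), buffer_size=4))
print(sorted(shuffled) == list(range(10)))  # True: same items, new order
```

A larger buffer gives a more thorough shuffle at the cost of memory, which is why the tutorial names BUFFER_SIZE explicitly next to BATCH_SIZE.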

"id": "IxdOA-rgyGvs"
},
"source": [
"訓練時間を適切に保つために、10エポックを使用してモデルを訓練します。"

yukinagae (Contributor) commented Nov 18, 2019

[suggestion] missing one sentence?

In Colab, set the runtime to GPU for faster training.

masa-ita (Author) commented Nov 19, 2019

Thanks. Added "Google Colab を使用する場合には、訓練を高速化するためにランタイムを GPU に設定します。".

yukinagae (Contributor) left a comment

LGTM! Great work 👍

lamberta (Member) left a comment

Thanks.
We'd like to avoid multiple versions of the same images. Please remove the images from this PR (the site falls back to the en/ versions).

For Colab, maybe we can just link directly to the site, like: https://www.tensorflow.org/tutorials/text/images/text_generation_sampling.png
and
https://www.tensorflow.org/tutorials/text/images/text_generation_training.png

masa-ita added 2 commits Nov 21, 2019
TensorFlow-Docs-Copybara pushed a commit that referenced this pull request Nov 28, 2019
@TensorFlow-Docs-Copybara TensorFlow-Docs-Copybara merged commit 0961aa6 into tensorflow:master Nov 28, 2019
2 checks passed
cla/google All necessary CLAs are signed
import/copybara Change imported to the internal review system
@masa-ita masa-ita deleted the masa-ita:site_ja_tutorials_text_text_generation branch Nov 28, 2019