
[ES] Spanish translation week 12, section 2 #570

Merged: 19 commits merged into Atcold:master on Sep 19, 2020

Conversation

GastonMazzei (Contributor):

Let's keep an eye on the associated videos:
- "beam search" → "búsqueda por haces" (https://www.youtube.com/watch?v=6D4EWKJgNn0&t=2732s)
- "traducción inversa" (https://www.youtube.com/watch?v=6D4EWKJgNn0&t=3811s)
- "entrenamiento previo para NLP" (https://www.youtube.com/watch?v=6D4EWKJgNn0&t=4963s)

GastonMazzei and others added 11 commits September 17, 2020 11:39 (co-authored by Willebaldo Gómez <willebaldo.gomez@gmail.com>):
- infinitive to gerund (×5)
- upper case
@Atcold (Owner) left a comment:

You need to edit docs/_config.yml as well.

Review comments on docs/es/week12/12-2.md: two resolved, one outdated.
@GastonMazzei (Contributor, Author):

You need to edit docs/_config.yml as well.

OK, just did, even though "12.md" doesn't exist yet!

If merging causes conflicts, let me know and I'll change "/week12/12.md" to "../week12".

[Screenshot from 2020-09-18 01-52-11]

@@ -295,6 +295,9 @@ es:
sections:
- path: es/week11/11-1.md
- path: es/week11/11-2.md
- path: es/week12/12.md
Contributor:

My PR that's almost ready to get merged includes this, if you're concerned: https://github.com/Atcold/pytorch-Deep-Learning/pull/555/files

Contributor (Author):

OK, so I will add mine using the same header, fine? I.e., this will be present:

- path: es/week12/12.md
  sections:
    - path: es/week12/12-2.md

Contributor (Author):

My question is whether it won't generate a duplicate; sorry for the trouble, I'm just learning GitHub.

@zxul767 (Contributor), Sep 18, 2020:

I think it shouldn't create a merge conflict because it's exactly the same, but even if it does, resolving it should be trivial.

Contributor (Author):

And how should it be resolved then?

(A) by me adding yours: - path: es/week12/12-3.md
(B) by me removing all except: - path: es/week12/12-2.md

Am I being clear with my question? I.e., what do you want me to do about it?

Contributor:

Typically, when your PR happens to be merged first, option (A) would apply. If my PR were to be merged first, then you'd just need to be sure that nothing got duplicated.
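For illustration, if option (A) applied after both PRs merge, the `es:` block in docs/_config.yml would presumably end up like the fragment below (the 12-3.md path is an assumption based on PR #555, not something confirmed in this thread):

```yaml
- path: es/week12/12.md
  sections:
    - path: es/week12/12-2.md
    - path: es/week12/12-3.md
```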

Contributor (Author):

Oh, OK, I'll leave it as is and keep an eye on what happens next. Thanks!

@zxul767 (Contributor) left a comment:

I learned a lot by reviewing this translation; NLP is such an interesting topic!

I left several minor comments, but overall the translation looks great!


<!-- The algorithm selects the best scoring hypothesis.-->

El algoritmo selecciona la mejor hipótesis de puntuación.
Contributor:

May I suggest "...selecciona la hipótesis con mejor puntuación."?

Contributor (Author):

Haha, yes it is (an interesting topic). Great bug-detection skills, and great suggestions! All were implemented.


<!-- The beam tree continues until it reaches the end of sentence token. Upon outputting the end of sentence token, the hypothesis is finished.-->

El árbol de haces continúa hasta que llega al final del token de oración. Al generar el final del token de oración, la hipótesis está terminada.
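The two passages above describe beam search: keep expanding hypotheses, finish a hypothesis when it emits the end-of-sentence token, and finally select the best-scoring one. A minimal sketch of that loop follows; the `next_token_scores` toy model is a hypothetical stand-in for a real language model, not the lecture's implementation.

```python
import math

EOS = "<eos>"

def next_token_scores(prefix):
    # Toy language model: fixed log-probabilities regardless of prefix.
    # A real model would condition on the prefix.
    return {"a": math.log(0.5), "b": math.log(0.3), EOS: math.log(0.2)}

def beam_search(beam_size, max_len=5):
    # Each hypothesis is (token list, cumulative log-probability).
    beams = [([], 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            for tok, logp in next_token_scores(tokens).items():
                hyp = (tokens + [tok], score + logp)
                # A hypothesis is finished once it emits the EOS token.
                (finished if tok == EOS else candidates).append(hyp)
        # Keep only the beam_size best-scoring open hypotheses.
        beams = sorted(candidates, key=lambda h: h[1], reverse=True)[:beam_size]
        if not beams:
            break
    # The algorithm selects the best-scoring hypothesis.
    return max(finished + beams, key=lambda h: h[1])

tokens, logp = beam_search(beam_size=3)
print(tokens, logp)
```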
Contributor:

May I suggest "...hasta que llega al token de final de oración."? In this case, "end of sentence" is acting as an adjective for "token"

Contributor (Author):

Changed! Thanks.


<!-- Why (in NMT) do very large beam sizes often results in empty translations?-->

¿Por qué (en NMT) los tamaños de haces muy grandes a menudo dan como resultado traslaciones vacías?
Contributor:

I believe the more appropriate translation of "translations" in this context is "traducciones".

docs/_config.yml Outdated
@@ -422,6 +429,9 @@ tr:
ja:
title: '深層学習'
chapters:
- path: ja/week03/03.md
Contributor:

Was this needed for the site to compile locally?

<!-- A pure sampling technique where you truncate the distribution to the $k$ best and then renormalise and sample from the distribution.-->


Una técnica de muestreo pura en la que se trunca la distribución al $ k $ best y luego se vuelve a normalizar y se muestra la distribución.
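The quoted passage describes top-$k$ sampling: truncate the distribution to the $k$ best tokens, renormalise, then sample. A minimal sketch under a toy distribution (not the lecture's code):

```python
import random

def top_k_sample(probs, k, rng=random):
    # Keep the k highest-probability tokens (truncate the distribution).
    top = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:k]
    # Renormalise so the truncated distribution sums to 1.
    total = sum(p for _, p in top)
    tokens = [t for t, _ in top]
    weights = [p / total for _, p in top]
    # Sample from the renormalised distribution.
    return rng.choices(tokens, weights=weights, k=1)[0]

probs = {"the": 0.4, "a": 0.3, "dog": 0.2, "xyzzy": 0.1}
print(top_k_sample(probs, k=2))  # only "the" or "a" can be drawn
```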
Contributor:

I believe "to the K best" here refers to "the K best samples", so the translation "a los K mejores" seems more appropriate.


- ¡Muchos objetivos diferentes de preentrenamiento funcionan bien!

- Crucial para modelar interacciones profundas y bidireccionales entre palabras
Contributor:

May I suggest replacing "Crucial para modelar..." with "Es muy importante modelar..."?


### Algunas preguntas abiertas en NLP

- ¿Cómo deberíamos integrar el conocimiento mundial?
Contributor:

In the previous PR, I suggested using "conocimiento del mundo real" instead of "conocimiento mundial", as that's what I've seen most commonly used in texts for LATAM, but I'm not sure if the same applies in other Spanish-speaking regions of the world (particularly Spain).


<!--- Models can learn a lot about language by predicting words in unlabelled text. This turns out to be a great unsupervised learning objective. Fine tuning for specific tasks is then easy-->

- Los modelos pueden aprender mucho sobre el lenguaje al predecir palabras en texto sin etiquetar. Esto resulta ser un gran objetivo de aprendizaje sin supervisión. La puesta a punto para tareas específicas es fácil
Contributor:

Suggested change:
Before: - Los modelos pueden aprender mucho sobre el lenguaje al predecir palabras en texto sin etiquetar. Esto resulta ser un gran objetivo de aprendizaje sin supervisión. La puesta a punto para tareas específicas es fácil
After: - Los modelos pueden aprender mucho sobre el lenguaje al predecir palabras en texto sin etiquetar. Esto resulta ser un gran objetivo de aprendizaje sin supervisión. La puesta a punto para tareas específicas se vuelve entonces fácil


<!--Interestingly, the lecturer (Mike Lewis, Research Scientist, FAIR) is working on a concept called ‘Grounded Language’. The aim of that field of research is to build conversational agents that are able to chit-chat or negotiate. Chit-chatting and negotiating are abstract tasks with unclear objectives as compared to text classification or text summarization.-->

Curiosamente, el profesor (Mike Lewis, investigador científico, FAIR) está trabajando en un concepto llamado "Lenguaje fundamentado". El objetivo de ese campo de investigación es construir agentes conversacionales que sean capaces de charlar o negociar. Charlar y negociar son tareas abstractas con objetivos poco claros en comparación con la clasificación o el resumen del texto.
Contributor:

Suggested change:
Before: Curiosamente, el profesor (Mike Lewis, investigador científico, FAIR) está trabajando en un concepto llamado "Lenguaje fundamentado". El objetivo de ese campo de investigación es construir agentes conversacionales que sean capaces de charlar o negociar. Charlar y negociar son tareas abstractas con objetivos poco claros en comparación con la clasificación o el resumen del texto.
After: Curiosamente, el profesor (Mike Lewis, investigador científico, FAIR) está trabajando en un concepto llamado "Lenguaje Fundamentado". El objetivo de ese campo de investigación es construir agentes conversacionales que sean capaces de charlar o negociar. Charlar y negociar son tareas abstractas con objetivos poco claros en comparación con la clasificación o el resumen del texto.


<!--‘World Knowledge’ is an abstract concept. We can test models, at the very basic level, for their world knowledge by asking them simple questions about the concepts we are interested in. Models like BERT, RoBERTa and T5 have billions of parameters. Considering these models are trained on a huge corpus of informational text like Wikipedia, they would have memorized facts using their parameters and would be able to answer our questions. Additionally, we can also think of conducting the same knowledge test before and after fine-tuning a model on some task. This would give us a sense of how much information the model has ‘forgotten’.-->

El "conocimiento mundial" es un concepto abstracto. Podemos probar modelos, en un nivel muy básico, para su conocimiento del mundo haciéndoles preguntas simples sobre los conceptos que nos interesan. Modelos como BERT, RoBERTa y T5 tienen miles de millones de parámetros. Teniendo en cuenta que estos modelos están entrenados en un enorme corpus de texto informativo como Wikipedia, habrían memorizado hechos usando sus parámetros y podrían responder nuestras preguntas. Además, también podemos pensar en realizar la misma prueba de conocimientos antes y después de ajustar un modelo en alguna tarea. Esto nos daría una idea de cuánta información ha "olvidado" el modelo.
Contributor:

May I suggest rephrasing "Podemos probar modelos... para su conocimiento del mundo..." as "Podemos probar el conocimiento del mundo que tienen los modelos (en un nivel muy básico)..."?

Contributor (Author):

Great suggestions overall, and great bug-detecting! Implemented all of them, thanks!

@Atcold (Owner) left a comment:

Sweet, thanks!

@Atcold Atcold merged commit 573f7d5 into Atcold:master Sep 19, 2020
@Atcold (Owner) commented Sep 23, 2020:

Hmm, I got a notification from the PR.
Anything that I should pay attention to?

@GastonMazzei (Contributor, Author):

Hmm, I got a notification from the PR.
Anything that I should pay attention to?

You DID merge it, and no changes have happened since, so I'm as lost as you are.
Anything more I can do to help?

@Atcold (Owner) commented Sep 26, 2020:

Hmm…

Yes, there are plenty of chapters to review! Check out the wiki!
