Fix typos #11895

Merged
merged 1 commit into from Jul 31, 2017
2 changes: 1 addition & 1 deletion tensorflow/contrib/seq2seq/python/ops/attention_wrapper.py
@@ -299,7 +299,7 @@ def _luong_score(query, keys, scale):
# [batch_size, 1, depth] . [batch_size, depth, max_time]
# resulting in an output shape of:
# [batch_time, 1, max_time].
- # we then squeee out the center singleton dimension.
+ # we then squeeze out the center singleton dimension.
score = math_ops.matmul(query, keys, transpose_b=True)
score = array_ops.squeeze(score, [1])

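For context on the comment fixed above, here is a minimal NumPy sketch of the shape arithmetic it describes; the concrete dimension values are illustrative assumptions, not taken from the PR. A batched matmul of a [batch_size, 1, depth] query against the transpose of [batch_size, max_time, depth] keys produces [batch_size, 1, max_time], and the middle singleton dimension is then squeezed out.

```python
# Minimal sketch, assuming query has shape [batch_size, 1, depth] and
# keys has shape [batch_size, max_time, depth], as in the comment above.
import numpy as np

batch_size, depth, max_time = 4, 8, 10
query = np.random.rand(batch_size, 1, depth)
keys = np.random.rand(batch_size, max_time, depth)

# Analogue of math_ops.matmul(query, keys, transpose_b=True).
score = np.matmul(query, np.transpose(keys, (0, 2, 1)))  # [batch_size, 1, max_time]
# Analogue of array_ops.squeeze(score, [1]).
score = np.squeeze(score, axis=1)                        # [batch_size, max_time]
assert score.shape == (batch_size, max_time)
```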
24 changes: 12 additions & 12 deletions tensorflow/python/debug/lib/debug_data.py
@@ -1397,7 +1397,7 @@ def node_attributes(self, node_name, device_name=None):
Args:
node_name: Name of the node in question.
device_name: (`str`) name of the device. If there is only one device or if
- node_name exists on only one device, this argumnet is optional.
+ node_name exists on only one device, this argument is optional.

Returns:
Attributes of the node.
@@ -1419,7 +1419,7 @@ def node_inputs(self, node_name, is_control=False, device_name=None):
is_control: (`bool`) Whether control inputs, rather than non-control
inputs, are to be returned.
device_name: (`str`) name of the device. If there is only one device or if
- node_name exists on only one device, this argumnet is optional.
+ node_name exists on only one device, this argument is optional.

Returns:
(`list` of `str`) inputs to the node, as a list of node names.
@@ -1455,7 +1455,7 @@ def transitive_inputs(self,
the source (e.g., A in this case). So the reverse direction of the ref
edge reflects the direction of information flow.
device_name: (`str`) name of the device. If there is only one device or if
- node_name exists on only one device, this argumnet is optional.
+ node_name exists on only one device, this argument is optional.

Returns:
(`list` of `str`) all transitive inputs to the node, as a list of node
@@ -1524,7 +1524,7 @@ def find_some_path(self,
the source (e.g., A in this case). So the reverse direction of the ref
edge reflects the direction of information flow.
device_name: (`str`) name of the device. If there is only one device or if
- node_name exists on only one device, this argumnet is optional.
+ node_name exists on only one device, this argument is optional.

Returns:
A path from the src_node_name to dst_node_name, as a `list` of `str`, if
@@ -1581,7 +1581,7 @@ def node_recipients(self, node_name, is_control=False, device_name=None):
is_control: (`bool`) whether control outputs, rather than non-control
outputs, are to be returned.
device_name: (`str`) name of the device. If there is only one device or if
- node_name exists on only one device, this argumnet is optional.
+ node_name exists on only one device, this argument is optional.

Returns:
(`list` of `str`) all inputs to the node, as a list of node names.
@@ -1675,7 +1675,7 @@ def node_op_type(self, node_name, device_name=None):
Args:
node_name: (`str`) name of the node.
device_name: (`str`) name of the device. If there is only one device or if
- node_name exists on only one device, this argumnet is optional.
+ node_name exists on only one device, this argument is optional.

Returns:
(`str`) op type of the node.
@@ -1698,7 +1698,7 @@ def debug_watch_keys(self, node_name, device_name=None):
Args:
node_name: (`str`) name of the node.
device_name: (`str`) name of the device. If there is only one device or if
- node_name exists on only one device, this argumnet is optional.
+ node_name exists on only one device, this argument is optional.

Returns:
(`list` of `str`) all debug tensor watch keys. Returns an empty list if
@@ -1732,7 +1732,7 @@ def watch_key_to_data(self, debug_watch_key, device_name=None):
Args:
debug_watch_key: (`str`) debug watch key.
device_name: (`str`) name of the device. If there is only one device or if
- the specified debug_watch_key exists on only one device, this argumnet
+ the specified debug_watch_key exists on only one device, this argument
is optional.

Returns:
@@ -1813,7 +1813,7 @@ def get_tensor_file_paths(self,
output_slot: (`int`) output slot index of tensor.
debug_op: (`str`) name of the debug op.
device_name: (`str`) name of the device. If there is only one device or if
- the specified debug_watch_key exists on only one device, this argumnet
+ the specified debug_watch_key exists on only one device, this argument
is optional.

Returns:
@@ -1846,7 +1846,7 @@ def get_tensors(self, node_name, output_slot, debug_op, device_name=None):
output_slot: (`int`) output slot index of tensor.
debug_op: (`str`) name of the debug op.
device_name: (`str`) name of the device. If there is only one device or if
- the specified debug_watch_key exists on only one device, this argumnet
+ the specified debug_watch_key exists on only one device, this argument
is optional.

Returns:
@@ -1884,7 +1884,7 @@ def get_rel_timestamps(self,
output_slot: (`int`) output slot index of tensor.
debug_op: (`str`) name of the debug op.
device_name: (`str`) name of the device. If there is only one device or if
- the specified debug_watch_key exists on only one device, this argumnet
+ the specified debug_watch_key exists on only one device, this argument
is optional.

Returns:
@@ -1918,7 +1918,7 @@ def get_dump_sizes_bytes(self,
output_slot: (`int`) output slot index of tensor.
debug_op: (`str`) name of the debug op.
device_name: (`str`) name of the device. If there is only one device or if
- the specified debug_watch_key exists on only one device, this argumnet
+ the specified debug_watch_key exists on only one device, this argument
is optional.

Returns:
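The docstrings touched above all belong to the tfdbg `DebugDumpDir` API in `debug_data.py`. A minimal usage sketch follows, assuming a TF 1.x debug dump directory already exists at the hypothetical path `/tmp/tfdbg_dump` and contains a node named `hidden/MatMul` watched by the `DebugIdentity` debug op; these names are illustrative, not taken from the PR.

```python
from tensorflow.python.debug.lib import debug_data

# Hypothetical dump root produced by an earlier tfdbg-instrumented run.
dump = debug_data.DebugDumpDir("/tmp/tfdbg_dump")

# device_name can be omitted when the node exists on only one device,
# which is exactly the case the corrected docstrings describe.
print(dump.node_op_type("hidden/MatMul"))
print(dump.node_inputs("hidden/MatMul"))
print(dump.debug_watch_keys("hidden/MatMul"))

# Fetch the dumped tensor values for output slot 0 of the watched node.
tensors = dump.get_tensors("hidden/MatMul", 0, "DebugIdentity")
```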
2 changes: 1 addition & 1 deletion tensorflow/stream_executor/lib/demangle.cc
@@ -41,7 +41,7 @@ string Demangle(const char *mangled) {
#if HAS_CXA_DEMANGLE
result = abi::__cxa_demangle(mangled, nullptr, nullptr, &status);
#endif
- if (status == 0 && result != nullptr) { // Demangling succeeeded.
+ if (status == 0 && result != nullptr) { // Demangling succeeded.
demangled.append(result);
free(result);
}
2 changes: 1 addition & 1 deletion tensorflow/tools/api/lib/python_object_to_proto_visitor.py
@@ -13,7 +13,7 @@
# limitations under the License.
#
# ==============================================================================
"""A visitor class that generates protobufs for each pyton object."""
"""A visitor class that generates protobufs for each python object."""

from __future__ import absolute_import
from __future__ import division