This repository was archived by the owner on Jul 7, 2023. It is now read-only.

Commit 69e40fb

Merge pull request #198 from martinpopel/fix-encs ("Fix encs")
2 parents: cd222d3 + 8839cf9

File tree: 2 files changed (+3, -3 lines)


README.md

Lines changed: 2 additions & 2 deletions

@@ -89,7 +89,7 @@ t2t-datagen \
   --problem=$PROBLEM

 # Train
-# * If you run out of memory, add --hparams='batch_size=2048' or even 1024.
+# * If you run out of memory, add --hparams='batch_size=1024'.
 t2t-trainer \
   --data_dir=$DATA_DIR \
   --problems=$PROBLEM \
@@ -166,7 +166,7 @@ python -c "from tensor2tensor.models.transformer import Transformer"
 with `Modality` objects, which are specified per-feature in the dataset/task
 specification.
 * Support for multi-GPU machines and synchronous (1 master, many workers) and
-  asynchrounous (independent workers synchronizing through a parameter server)
+  asynchronous (independent workers synchronizing through a parameter server)
   [distributed training](https://github.com/tensorflow/tensor2tensor/tree/master/docs/distributed_training.md).
 * Easily swap amongst datasets and models by command-line flag with the data
   generation script `t2t-datagen` and the training script `t2t-trainer`.
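The README change above reduces the suggested `--hparams` override from `batch_size=2048` to `batch_size=1024` for memory-constrained runs. The flag takes a comma-separated `key=value` string; the sketch below is a minimal, hypothetical parser illustrating that format (it is not tensor2tensor's actual hparams-parsing code).

```python
def parse_hparams_override(spec):
    """Parse a comma-separated key=value override string, e.g. the
    argument to --hparams. Illustrative sketch only, not the real
    tensor2tensor parser."""
    overrides = {}
    for pair in spec.split(","):
        if not pair.strip():
            continue  # tolerate trailing commas
        key, _, value = pair.partition("=")
        value = value.strip()
        # Treat integer-looking values as ints; keep everything else a string.
        overrides[key.strip()] = int(value) if value.lstrip("-").isdigit() else value
    return overrides

print(parse_hparams_override("batch_size=1024"))
# {'batch_size': 1024}
```

Halving the batch size roughly halves activation memory per step, which is why it is the first knob to turn on an out-of-memory error.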

tensor2tensor/data_generators/wmt.py

Lines changed: 1 addition & 1 deletion

@@ -574,7 +574,7 @@ class WMTEnCsTokens32k(WMTProblem):
   """Problem spec for WMT English-Czech translation."""

   @property
-  def target_vocab_size(self):
+  def targeted_vocab_size(self):
     return 2**15  # 32768

   @property
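The wmt.py fix is a one-word rename: the property must be spelled `targeted_vocab_size` so that it actually overrides what the framework reads; a property under any other name is defined but never consulted, and the problem silently falls back to a default. The sketch below demonstrates that failure mode with hypothetical class names (`ProblemBase`, `EnCsFixed`, `EnCsBroken` are illustrative stand-ins, not T2T's real classes).

```python
class ProblemBase:
    """Minimal stand-in for a problem base class (hypothetical)."""

    @property
    def targeted_vocab_size(self):
        return 2**13  # default the framework falls back to (8192)

    def build_vocab(self):
        # Framework code reads `targeted_vocab_size` by this exact name.
        return self.targeted_vocab_size


class EnCsFixed(ProblemBase):
    @property
    def targeted_vocab_size(self):  # name matches what build_vocab reads
        return 2**15  # 32768


class EnCsBroken(ProblemBase):
    @property
    def target_vocab_size(self):  # misspelled override: never consulted
        return 2**15


print(EnCsFixed().build_vocab())   # 32768
print(EnCsBroken().build_vocab())  # 8192, the default, not 32768
```

This is why the typo was worth a PR: Python raises no error for an override that misses its target name, so the bug only shows up as a wrong vocabulary size at data-generation time.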

0 commit comments
