Commit 17322ce9 authored by Martyna Wiącek's avatar Martyna Wiącek

fix embedding_dim in default_model

parent 27ff4b0a
2 merge requests: !49 Multiword fix transformer encoder, !47 Fixed multiword prediction + bug that made the code write empty predictions
This commit is part of merge request !47.
@@ -160,7 +160,7 @@ def default_model(pretrained_transformer_name: str, vocabulary: Vocabulary, use_
     activations=[GELUActivation(), GELUActivation(), GELUActivation(), LinearActivation()],
     char_vocab_namespace="token_characters",
     dilation=[1, 2, 4, 1],
-    embedding_dim=256,
+    embedding_dim=300,
     filters=[256, 256, 256],
     input_projection_layer=Linear(
         activation=TanhActivation(),
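The change only touches a config value, but it matters because the embedding dimension must agree with the input size of the first convolution in the character encoder. A minimal PyTorch sketch of that constraint, using hypothetical sizes (this is not the project's actual classes, just an illustration of why a wrong embedding_dim fails at runtime):

```python
import torch
from torch import nn

# Hypothetical sketch: character embeddings feeding a stack of dilated 1-D
# convolutions. The first conv's in_channels must equal embedding_dim, so the
# value in the model config has to match the embedding table that is built.
embedding_dim = 300
filters = 256
embedding = nn.Embedding(num_embeddings=1000, embedding_dim=embedding_dim)
convs = nn.ModuleList([
    # padding=d keeps the sequence length constant for kernel_size=3
    nn.Conv1d(in_channels=embedding_dim if i == 0 else filters,
              out_channels=filters, kernel_size=3, dilation=d, padding=d)
    for i, d in enumerate([1, 2, 4])
])

chars = torch.randint(0, 1000, (2, 10))   # (batch, num_chars)
x = embedding(chars).transpose(1, 2)      # (batch, embedding_dim, num_chars)
for conv in convs:
    x = torch.relu(conv(x))
print(x.shape)                            # torch.Size([2, 256, 10])
```

If embedding_dim in the config were 256 while the embedding table was actually 300-dimensional (or vice versa), the first `Conv1d` would raise a shape-mismatch error on its input channels, which is the kind of inconsistency this commit fixes.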