Commit 8fbff648 authored by Mateusz Klimaszewski

Set default projection dim in transformer embedder.

parent 72a94d4b
Merge requests: !13 Refactor merge develop to master, !12 Refactor
@@ -107,7 +107,7 @@ class TransformersWordEmbedder(token_embedders.PretrainedTransformerMismatchedEmbedder):
     def __init__(self,
                  model_name: str,
-                 projection_dim: int,
+                 projection_dim: int = 0,
                  projection_activation: Optional[allen_nn.Activation] = lambda x: x,
                  projection_dropout_rate: Optional[float] = 0.0,
                  freeze_transformer: bool = True,
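For context, a default of 0 is commonly read as "no projection": the embedder adds a Linear projection layer only when projection_dim is positive, and otherwise passes the transformer embeddings through unchanged. A minimal sketch of that pattern, assuming this convention (the class and names below are illustrative, not the project's actual implementation):

import torch

class ProjectionSketch(torch.nn.Module):
    # Hypothetical helper showing how a projection_dim default of 0
    # is typically interpreted; not the code from this repository.
    def __init__(self, embedding_dim: int, projection_dim: int = 0):
        super().__init__()
        if projection_dim > 0:
            # Project contextual embeddings to the requested size.
            self.projection = torch.nn.Linear(embedding_dim, projection_dim)
            self.output_dim = projection_dim
        else:
            # Default (projection_dim == 0): identity, keep the
            # transformer's native embedding dimension.
            self.projection = torch.nn.Identity()
            self.output_dim = embedding_dim

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        return self.projection(embeddings)

Under this reading, making projection_dim optional means callers who do not want a projection can simply omit the argument instead of being forced to pick a dimension.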