Commit 8fbff648 authored by Mateusz Klimaszewski

Set default projection dim in transformer embedder.

parent 72a94d4b
2 merge requests: !13 "Refactor merge develop to master", !12 "Refactor"
@@ -107,7 +107,7 @@ class TransformersWordEmbedder(token_embedders.PretrainedTransformerMismatchedEm...
     def __init__(self,
                  model_name: str,
-                 projection_dim: int,
+                 projection_dim: int = 0,
                  projection_activation: Optional[allen_nn.Activation] = lambda x: x,
                  projection_dropout_rate: Optional[float] = 0.0,
                  freeze_transformer: bool = True,
...
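The diff makes `projection_dim` optional by giving it a default of `0`. A minimal sketch of the convention this enables, where `0` means "skip the projection layer and pass the transformer output through unchanged" (class and attribute names here are hypothetical, not the actual COMBO/AllenNLP implementation):

```python
class TransformersWordEmbedderSketch:
    """Hypothetical sketch of the changed signature: projection_dim now
    defaults to 0, so callers no longer have to pass it explicitly."""

    def __init__(self, model_name: str, projection_dim: int = 0,
                 hidden_dim: int = 768):
        self.model_name = model_name
        # Assumed convention: projection_dim == 0 disables the projection,
        # so the output dimension falls back to the transformer's hidden size.
        self.use_projection = projection_dim > 0
        self.output_dim = projection_dim if self.use_projection else hidden_dim

# Before the change this call would fail with a missing-argument error;
# now the embedder works out of the box without a projection layer.
emb = TransformersWordEmbedderSketch("bert-base-cased")
```

With the default in place, configurations that want a projection still opt in by passing e.g. `projection_dim=256`, while the common no-projection case needs no extra argument.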