Syntactic Tools / combo

Commit 8fbff648
Authored 4 years ago by Mateusz Klimaszewski

Set default projection dim in transformer embedder.

parent 72a94d4b
Part of 2 merge requests: !13 "Refactor merge develop to master" and !12 "Refactor"
Showing 1 changed file: combo/models/embeddings.py (+1, −1)
@@ -107,7 +107,7 @@ class TransformersWordEmbedder(token_embedders.PretrainedTransformerMismatchedEm
     def __init__(self,
                  model_name: str,
-                 projection_dim: int,
+                 projection_dim: int = 0,
                  projection_activation: Optional[allen_nn.Activation] = lambda x: x,
                  projection_dropout_rate: Optional[float] = 0.0,
                  freeze_transformer: bool = True,
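
The one-line change gives projection_dim a default of 0, so callers no longer have to pass it explicitly. A minimal usage sketch, assuming the combo package is installed; the model name below is illustrative, and the reading that a value of 0 skips the projection layer is an assumption, not something this diff confirms:

# A minimal sketch, assuming combo is installed. The model name is
# illustrative; "projection_dim == 0 disables projection" is an assumption.
from combo.models.embeddings import TransformersWordEmbedder

# Before this commit, projection_dim was a required argument:
#     TransformersWordEmbedder(model_name="bert-base-cased", projection_dim=0)
# With the new default it can be omitted:
embedder = TransformersWordEmbedder(model_name="bert-base-cased")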