Syntactic Tools / combo · Merge request !36

Release 1.0.4

Merged · Mateusz Klimaszewski requested to merge candidate_release_1.0.4 into develop · 3 years ago

Overview (0) · Commits (28) · Pipelines (1) · Changes (21)
Merge request reports

Viewing commit 5545d968 (2 files, +4 −1)

5545d968 · Enable weighted average of LM embeddings. · Mateusz Klimaszewski authored 3 years ago
combo/models/embeddings.py (+2 −0)

```diff
@@ -111,10 +111,12 @@ class TransformersWordEmbedder(token_embedders.PretrainedTransformerMismatchedEmbedder):
                  projection_activation: Optional[allen_nn.Activation] = lambda x: x,
                  projection_dropout_rate: Optional[float] = 0.0,
                  freeze_transformer: bool = True,
+                 last_layer_only: bool = True,
                  tokenizer_kwargs: Optional[Dict[str, Any]] = None,
                  transformer_kwargs: Optional[Dict[str, Any]] = None):
         super().__init__(model_name,
                          train_parameters=not freeze_transformer,
+                         last_layer_only=last_layer_only,
                          tokenizer_kwargs=tokenizer_kwargs,
                          transformer_kwargs=transformer_kwargs)
         if projection_dim: