Syntactic Tools / combo · Merge request !47
Fixed multiword prediction + bug that made the code write empty predictions
Merged · Martyna Wiącek requested to merge multiword_fix into main · 1 year ago
Commits: 14 · Changes: 13
Commit 17322ce9 · fix embedding_dim in default_model
Martyna Wiącek authored 1 year ago
combo/default_model.py (+1 −1)
@@ -160,7 +160,7 @@ def default_model(pretrained_transformer_name: str, vocabulary: Vocabulary, use_
         activations=[GELUActivation(), GELUActivation(), GELUActivation(), LinearActivation()],
         char_vocab_namespace="token_characters",
         dilation=[1, 2, 4, 1],
-        embedding_dim=256,
+        embedding_dim=300,
         filters=[256, 256, 256],
         input_projection_layer=Linear(
             activation=TanhActivation(),
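For context on what the changed keyword argument controls: in configurations like this one, embedding_dim typically sets the width of the character embeddings that feed a dilated 1-D CNN encoder, so it has to agree with the input channels of the first convolution. The snippet below is a minimal, hypothetical PyTorch sketch of that shape constraint, reusing the filters and dilation values visible in the hunk; it is an illustration under those assumptions, not COMBO's actual implementation.

# Hypothetical sketch (not COMBO code): why embedding_dim must match the
# first convolution's input channels in a dilated char-CNN encoder.
import torch
import torch.nn as nn

embedding_dim = 300            # value set by this commit (was 256)
filters = [256, 256, 256]      # from the hunk above (possibly truncated there)
dilation = [1, 2, 4]

layers = []
in_channels = embedding_dim    # first conv consumes the raw char embeddings
for out_channels, d in zip(filters, dilation):
    # padding=d keeps the character-sequence length unchanged for kernel_size=3
    layers.append(nn.Conv1d(in_channels, out_channels, kernel_size=3,
                            dilation=d, padding=d))
    layers.append(nn.GELU())
    in_channels = out_channels
char_encoder = nn.Sequential(*layers)

# (batch, num_chars, embedding_dim) -> Conv1d expects channels first
chars = torch.randn(2, 12, embedding_dim)
out = char_encoder(chars.transpose(1, 2))
print(out.shape)               # torch.Size([2, 256, 12])

If the declared embedding_dim disagreed with the width of the vectors actually produced by the embedder, the first convolution would fail with a shape mismatch on the first forward pass; the commit title suggests an inconsistency of roughly this kind, though the diff alone does not show the other side of it.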