Syntactic Tools / combo · Commits

Commit 761b4a25
authored 5 years ago by Mateusz Klimaszewski
Fix masked cross entropy.
parent 08bf1a1e
Showing 2 changed files with 1 addition and 3 deletions:

combo/models/lemma.py   0 additions, 1 deletion
combo/models/utils.py   1 addition, 2 deletions
combo/models/lemma.py   +0 −1

@@ -80,7 +80,6 @@ class LemmatizerModel(base.Predictor):
     @classmethod
     def from_vocab(cls,
                    vocab: data.Vocabulary,
                    char_vocab_namespace: str,
                    lemma_vocab_namespace: str,
...
combo/models/utils.py   +1 −2

@@ -3,6 +3,5 @@ import torch.nn.functional as F
 def masked_cross_entropy(pred: torch.Tensor, true: torch.Tensor, mask: torch.BoolTensor) -> torch.Tensor:
-    mask = mask.float().unsqueeze(-1)
-    pred = pred + (mask + 1e-45).log()
+    pred = pred + (mask.float().unsqueeze(-1) + 1e-45).log()
     return F.cross_entropy(pred, true, reduction='none') * mask
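
For reference, here is a minimal, self-contained sketch of the fixed masked_cross_entropy together with an illustrative call. The function body mirrors the new version above; the tensor shapes in the example (a flattened batch of N token positions over C classes) are assumptions for illustration, not taken from the repository.

import torch
import torch.nn.functional as F


def masked_cross_entropy(pred: torch.Tensor,
                         true: torch.Tensor,
                         mask: torch.BoolTensor) -> torch.Tensor:
    # log(mask + eps) is ~0 for kept positions and a large negative number
    # for masked ones, pushing their logits towards -inf; the original
    # boolean mask then zeroes the per-position losses.
    pred = pred + (mask.float().unsqueeze(-1) + 1e-45).log()
    return F.cross_entropy(pred, true, reduction='none') * mask


# Illustrative shapes (assumed): N = 8 flattened positions, C = 5 classes.
pred = torch.randn(8, 5)                                         # (N, C) logits
true = torch.randint(0, 5, (8,))                                 # (N,) gold class indices
mask = torch.tensor([1, 1, 1, 0, 1, 0, 1, 1], dtype=torch.bool)  # (N,)
loss = masked_cross_entropy(pred, true, mask)
print(loss.shape)  # torch.Size([8]): one loss per position, zeroed where mask is False

Under these assumed shapes, the pre-commit version misbehaved: it reassigned mask to a float tensor of shape (N, 1), so the final element-wise product broadcast the (N,) loss vector against (N, 1) and produced an (N, N) matrix instead of one loss per position. Keeping the original boolean mask untouched, as the commit does, preserves the expected output shape.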