fix bug unimo fp16 infer error #4166
Conversation
Thanks for your contribution!
Codecov Report
@@            Coverage Diff             @@
##           develop     #4166    +/-   ##
==========================================
  Coverage    33.00%    33.00%
==========================================
  Files          400       400
  Lines        56414     56414
==========================================
  Hits         18618     18618
  Misses       37796     37796
# token_type_ids
paddle.static.InputSpec(shape=[None, None], dtype="int64"),
paddle.static.InputSpec(shape=[None, None], dtype="int32"),
One concern here: numpy's default integer type on Windows is int64, so could this change affect deployment on Windows?
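(For context: one way to keep the feed side independent of the platform's default numpy integer width is to cast the input arrays explicitly to the dtypes declared in the exported InputSpec. The snippet below is an illustrative sketch, not part of this PR; the variable names and values are assumptions.)

import numpy as np

# Hypothetical tokenized batch (plain Python lists); names and values are illustrative.
raw_input_ids = [[1, 5, 9, 2]]
raw_token_type_ids = [[0, 0, 0, 0]]
raw_attention_mask = [[1.0, 1.0, 1.0, 1.0]]

# Cast explicitly so the arrays always match the exported InputSpec dtypes,
# regardless of the OS-dependent default numpy integer width.
input_ids = np.asarray(raw_input_ids, dtype="int64")
token_type_ids = np.asarray(raw_token_type_ids, dtype="int64")
attention_mask = np.asarray(raw_attention_mask, dtype="float16")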
The updated change only converts the precision of the attention mask; the other inputs remain int64.
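(A minimal sketch of what the export signature might look like after this fix, assuming the model is exported with input_ids, token_type_ids, position_ids and attention_mask: only the attention mask is declared as float16 for fp16 inference, while the id tensors keep int64. The names, ordering and mask shape below are illustrative assumptions, not the exact diff.)

import paddle

# Illustrative InputSpec list for a paddle.jit.to_static / paddle.jit.save export.
input_spec = [
    paddle.static.InputSpec(shape=[None, None], dtype="int64", name="input_ids"),
    paddle.static.InputSpec(shape=[None, None], dtype="int64", name="token_type_ids"),
    paddle.static.InputSpec(shape=[None, None], dtype="int64", name="position_ids"),
    # Only the mask is switched to float16 so it matches the fp16 weights at inference time.
    paddle.static.InputSpec(shape=[None, 1, None, None], dtype="float16", name="attention_mask"),
]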
LGTM
PR types
Bug fixes
PR changes
Models
Description
Fix the error raised when running unimo-text inference with fp16 #4106