[Bug fixes] Fix ring attention #8740
Commits in this pull request: 4cd6dbc, 10cccd9, 812d4cc
Check warnings in paddlenlp/transformers/ring_flash_attention.py at lines:
L58-L59, L61-L62, L82-L83, L87-L88, L91, L126, L148, L150-L151, L153, L159-L160, L174-L175, L190, L198, L226-L229, L253, L259, L273, L287, L326, L329, L338
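For context, ring attention shards the keys and values across devices and passes the shards around a ring; each rank attends to whichever K/V block it currently holds and merges the partial outputs with log-sum-exp rescaling. Below is a minimal single-process NumPy sketch of that pattern. It simulates the ring in-process, omits causal masking, and uses illustrative names rather than the actual API in ring_flash_attention.py.

```python
# Minimal single-process sketch of the ring-attention pattern.
# Pure NumPy, no Paddle and no real communication: each "rank" holds
# one K/V shard, shards rotate around a simulated ring, and partial
# outputs are merged with online-softmax (log-sum-exp) rescaling.
import numpy as np

def local_attn(q, k, v):
    """Attention of q against one K/V block; returns output and row LSE."""
    s = q @ k.T / np.sqrt(q.shape[-1])       # [sq, sk] scores
    m = s.max(axis=-1, keepdims=True)        # row max for numerical stability
    p = np.exp(s - m)
    denom = p.sum(axis=-1, keepdims=True)
    lse = m + np.log(denom)                  # per-row log-sum-exp
    return (p / denom) @ v, lse

def merge(o, lse, o_new, lse_new):
    """Combine two partial attention outputs via their LSEs."""
    lse_max = np.maximum(lse, lse_new)
    w, w_new = np.exp(lse - lse_max), np.exp(lse_new - lse_max)
    o_out = (w * o + w_new * o_new) / (w + w_new)
    return o_out, lse_max + np.log(w + w_new)

def ring_attention(q_shards, k_shards, v_shards):
    """Each rank attends to every K/V shard as it travels the ring."""
    world = len(q_shards)
    outs, lses = [None] * world, [None] * world
    k_cur, v_cur = list(k_shards), list(v_shards)
    for _ in range(world):
        for r in range(world):
            o_new, lse_new = local_attn(q_shards[r], k_cur[r], v_cur[r])
            if outs[r] is None:
                outs[r], lses[r] = o_new, lse_new
            else:
                outs[r], lses[r] = merge(outs[r], lses[r], o_new, lse_new)
        # Simulated ring step: every rank forwards its K/V block to the next.
        k_cur = k_cur[-1:] + k_cur[:-1]
        v_cur = v_cur[-1:] + v_cur[:-1]
    return outs

# Sanity check against full (unsharded) attention.
rng = np.random.default_rng(0)
world, sq, d = 4, 8, 16
q = rng.standard_normal((world * sq, d))
k = rng.standard_normal((world * sq, d))
v = rng.standard_normal((world * sq, d))
ref, _ = local_attn(q, k, v)
out = np.concatenate(
    ring_attention(np.split(q, world), np.split(k, world), np.split(v, world))
)
assert np.allclose(out, ref, atol=1e-6)
```

Merging through the LSE keeps the result identical to unsharded attention regardless of the order in which K/V blocks arrive at a rank, which is what makes the ring rotation safe.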