Description
OS
Windows
GPU Library
CUDA 12.x
Python version
3.11
Pytorch version
2.8.0
Model
google/gemma-3-27b-it
Describe the bug
When I try to quantize Gemma 3 myself, I get a blue screen crash with a memory error; the PC automatically restarts on Windows 10.
The crash happens right at the end, when it tries to save the 3rd shard, which is odd.
I thought it might be because of torch 2.8.0, but I tried quantizing an older model and it worked without issues.
Reproduction steps
Just using convert_exl2.py to create an EXL2 quant of Gemma 3 27B, roughly as sketched below. I could provide the measurements.json file if that helps.
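For reference, a minimal sketch of the kind of invocation used (the paths, bitrate, and measurement filename below are placeholders, not taken from the report; argument names assume exllamav2's standard conversion options):

```
# Hypothetical reproduction command; paths and -b value are placeholders.
python convert_exl2.py \
    -i /models/gemma-3-27b-it \
    -o /tmp/exl2-work \
    -cf /models/gemma-3-27b-it-exl2 \
    -b 4.0 \
    -m measurements.json
```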
Expected behavior
Shards are saved normally and the conversion completes without crashing.
Logs
N/A
Additional context
No response
Acknowledgements
- I have looked for similar issues before submitting this one.
- I understand that the developers have lives and my issue will be answered when possible.
- I understand the developers of this program are human, and I will ask my questions politely.