Commit 4c64e31

committed (2 parents: 1c0bc17 + 0301190)

File tree: 1 file changed (+0, -12 lines)


docs/source/en/model_doc/altclip.md

Lines changed: 0 additions & 12 deletions
@@ -17,8 +17,6 @@ rendered properly in your Markdown viewer.
 <div style="float: right;">
 <div class="flex flex-wrap space-x-1">
 <img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-DE3412?style=flat&logo=pytorch&logoColor=white">
-<img alt="Transformers" src="https://img.shields.io/badge/Transformers-6B5B95?style=flat&logo=transformers&logoColor=white">
-</div>
 </div>
 
 # AltCLIP
@@ -34,11 +32,6 @@ The examples below demonstrates how to calculate similarity scores between an im
 
 <hfoptions id="usage">
 
-<hfoption id="Pipeline">
-
-`pipeline()` isn’t available because altCLIP currently exposes only its contrastive head (image × text similarity) and no dedicated *visual-question-answering* or *captioning* head.
-Use the AutoModel path shown below instead.
-
 </hfoption>
 
 <hfoption id="AutoModel">
@@ -76,9 +69,7 @@ AltCLIP does **not** require `transformers-cli` at inference time, but the tool
 
 </hfoptions>
 
----
 
-## Quantization
 
 Quantization reduces the memory burden of large models by representing the weights in a lower precision. Refer to the [Quantization](https://huggingface.co/docs/transformers/main/en/quantization/overview) overview for more available quantization backends.
 
@@ -113,12 +104,9 @@ for label, prob in zip(labels, probs[0]):
     print(f"{label}: {prob.item():.4f}")
 ```
 
-On a typical machine the INT-8 checkpoint occupies **≈ ½ the RAM** of the full-precision model with negligible accuracy drop.
 
-> ℹ️ Embedding layers can be quantized with *float-qparams weight-only* configs once PyTorch exposes them via the public API.
 
 
----
 
 ## Attention visualisation
 