@@ -8,6 +8,8 @@ Return the loss corresponding to mean absolute error:
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: mae
+
 julia> y_model = [1.1, 1.9, 3.1];
 
 julia> mae(y_model, 1:3)
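As an aside for reviewers: with the default mean aggregation, `mae` is just the mean of elementwise absolute differences. A minimal hand-rolled sketch (not Flux's implementation, which also takes an `agg` keyword):

```julia
# Hand-rolled mean absolute error; agrees with Flux's mae(ŷ, y)
# under the default mean aggregation.
mae_ref(ŷ, y) = sum(abs.(ŷ .- y)) / length(y)

mae_ref([1.1, 1.9, 3.1], 1:3)  # ≈ 0.1, matching the doctest above
```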
@@ -31,6 +33,8 @@ See also: [`mae`](@ref), [`msle`](@ref), [`crossentropy`](@ref).
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: mse
+
 julia> y_model = [1.1, 1.9, 3.1];
 
 julia> y_true = 1:3;
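The same shape of reference sketch for `mse`, squaring instead of taking absolute values (again an illustration, not the library code):

```julia
# Hand-rolled mean squared error; abs2(x) is |x|^2.
mse_ref(ŷ, y) = sum(abs2.(ŷ .- y)) / length(y)

mse_ref([1.1, 1.9, 3.1], 1:3)  # ≈ 0.01
```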
@@ -57,6 +61,8 @@ Penalizes an under-estimation more than an over-estimation.
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: msle
+
 julia> msle(Float32[1.1, 2.2, 3.3], 1:3)
 0.009084041f0
 
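`msle` compares the logs of predictions and targets, which is why an under-estimate costs more than an over-estimate of the same magnitude. A sketch assuming the default mean aggregation (the `ϵ` guard keeps `log` away from zero, as in the docstring):

```julia
# Hand-rolled mean squared logarithmic error.
msle_ref(ŷ, y; ϵ = eps(eltype(ŷ))) =
    sum(abs2.(log.((ŷ .+ ϵ) ./ (y .+ ϵ)))) / length(y)

msle_ref(Float32[1.1, 2.2, 3.3], 1:3)  # ≈ 0.009084f0, as in the doctest
```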
@@ -113,6 +119,8 @@ of label smoothing to binary distributions encoded in a single number.
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: label_smoothing, crossentropy
+
 julia> y = Flux.onehotbatch([1, 1, 1, 0, 1, 0], 0:1)
 2×6 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
  ⋅  ⋅  ⋅  1  ⋅  1
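For context on what the smoothing does: each hard 0/1 target is pulled toward the uniform distribution over the classes. A sketch with my own helper name (Flux's `label_smoothing` additionally takes a `dims` keyword to locate the class axis):

```julia
# Smooth one-hot targets toward uniform over K classes.
smooth(y, α, K) = y .* (1 - α) .+ α / K

smooth([1, 0], 0.2, 2)  # [0.9, 0.1]: hard labels become soft ones
```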
@@ -179,6 +187,8 @@ See also: [`logitcrossentropy`](@ref), [`binarycrossentropy`](@ref), [`logitbina
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: label_smoothing, crossentropy
+
 julia> y_label = Flux.onehotbatch([0, 1, 2, 1, 0], 0:2)
 3×5 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
  1  ⋅  ⋅  ⋅  1
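The doctest above feeds smoothed one-hot columns into `crossentropy`. As a reminder of the quantity being computed, here is a sketch assuming probabilities along dim 1 and a mean over batch columns (Flux's defaults):

```julia
# Hand-rolled crossentropy: -Σ y log ŷ per column, averaged over columns.
xent_ref(ŷ, y) = -sum(y .* log.(ŷ)) / size(y, 2)

ŷ = [0.9 0.2; 0.1 0.8]  # 2 classes × 2 samples, columns sum to 1
y = [1 0; 0 1]          # one-hot targets
xent_ref(ŷ, y)          # ≈ 0.164, i.e. -(log(0.9) + log(0.8)) / 2
```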
@@ -232,6 +242,8 @@ See also: [`binarycrossentropy`](@ref), [`logitbinarycrossentropy`](@ref), [`lab
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: crossentropy, logitcrossentropy
+
 julia> y_label = onehotbatch(collect("abcabaa"), 'a':'c')
 3×7 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
  1  ⋅  ⋅  1  ⋅  1  1
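`logitcrossentropy` takes raw logits and is mathematically `crossentropy(softmax(ŷ), y)`, fused for numerical stability. A naive sketch of that identity, with my own helper names (real implementations also subtract the per-column maximum before exponentiating):

```julia
# log∘softmax along dim 1, then the usual crossentropy reduction.
logsoftmax_ref(x) = x .- log.(sum(exp.(x); dims = 1))
logitxent_ref(logits, y) = -sum(y .* logsoftmax_ref(logits)) / size(y, 2)

logitxent_ref([2.0 0.0; 0.0 2.0], [1 0; 0 1])  # ≈ 0.127
```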
@@ -273,7 +285,10 @@ computing the loss.
 See also: [`crossentropy`](@ref), [`logitcrossentropy`](@ref).
 
 # Examples
+
 ```jldoctest
+julia> using Flux.Losses: binarycrossentropy, crossentropy
+
 julia> y_bin = Bool[1,0,1]
 3-element Vector{Bool}:
  1
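For the binary case, the per-element quantity is `-y log(ŷ) - (1 - y) log(1 - ŷ)`, with the `ϵ` clamp the docstring describes. A sketch assuming mean aggregation:

```julia
# Hand-rolled binary cross-entropy for probabilities in (0, 1).
bce_ref(ŷ, y; ϵ = eps(float(eltype(ŷ)))) =
    sum(@. -y * log(ŷ + ϵ) - (1 - y) * log(1 - ŷ + ϵ)) / length(y)

bce_ref([0.9, 0.1, 0.9], Bool[1, 0, 1])  # ≈ 0.105: confident, correct predictions
```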
@@ -314,7 +329,10 @@ Mathematically equivalent to
 See also: [`crossentropy`](@ref), [`logitcrossentropy`](@ref).
 
 # Examples
+
 ```jldoctest
+julia> using Flux.Losses: binarycrossentropy, logitbinarycrossentropy
+
 julia> y_bin = Bool[1,0,1];
 
 julia> y_model = Float32[2, -1, pi]
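The identity exploited here is `logitbinarycrossentropy(ŷ, y) == binarycrossentropy(σ(ŷ), y)`, rearranged so no sigmoid output is ever passed to `log`. A sketch with a numerically safe log-sigmoid (helper names are mine):

```julia
# logσ(x) = min(0, x) - log1p(exp(-|x|)), stable for large |x|.
logσ_ref(x) = min(x, zero(x)) - log1p(exp(-abs(x)))

# Per-element logit BCE is (1 - y) * x - logσ(x); average over elements.
logitbce_ref(logits, y) = sum(@. (1 - y) * logits - logσ_ref(logits)) / length(y)

logitbce_ref(Float32[2, -1, π], Bool[1, 0, 1])  # ≈ 0.1608f0
```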
@@ -348,6 +366,8 @@ from the other. It is always non-negative, and zero only when both the distribut
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: kldivergence
+
 julia> p1 = [1 0; 0 1]
 2×2 Matrix{Int64}:
  1  0
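A sketch of the underlying formula, `KL(p‖q) = Σ pᵢ log(pᵢ / qᵢ)`, with the usual convention `0 · log 0 = 0` (Flux additionally averages over columns when given matrices, which this single-vector sketch skips):

```julia
# KL divergence of one distribution p from q, both as vectors.
kl_ref(p, q) = sum(pᵢ == 0 ? 0.0 : pᵢ * log(pᵢ / qᵢ)
                   for (pᵢ, qᵢ) in zip(p, q))

kl_ref([0.5, 0.5], [0.9, 0.1])  # ≈ 0.511; zero only when p == q
```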
@@ -467,6 +487,8 @@ For `γ == 0`, the loss is mathematically equivalent to [`binarycrossentropy`](@
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: binary_focal_loss
+
 julia> y = [0 1 0
             1 0 1]
 2×3 Matrix{Int64}:
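The point of the `(1 - pₜ)^γ` factor is to shrink the contribution of well-classified examples; at `γ == 0` the factor is 1 and plain binary cross-entropy is recovered, as the docstring states. A per-element sketch (my own helper, not the library function):

```julia
# Focal term for one prediction: pₜ is the probability of the true class.
function focal_term(p, y; γ = 2)
    pₜ = y == 1 ? p : 1 - p
    return -(1 - pₜ)^γ * log(pₜ)
end

focal_term(0.9, 1)         # ≈ 0.00105: easy example, heavily down-weighted
focal_term(0.9, 1; γ = 0)  # ≈ 0.105: the plain cross-entropy term
```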
@@ -509,6 +531,8 @@ For `γ == 0`, the loss is mathematically equivalent to [`crossentropy`](@ref).
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: focal_loss
+
 julia> y = [1 0 0 0 1
             0 1 0 1 0
             0 0 1 0 0]
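The multi-class version is crossentropy with the same per-term modulation; setting `γ = 0` collapses the sketch below to the `xent_ref` shown earlier:

```julia
# Hand-rolled multi-class focal loss over one-hot columns.
focal_ref(ŷ, y; γ = 2) = -sum(@. y * (1 - ŷ)^γ * log(ŷ)) / size(y, 2)

ŷ = [0.9 0.2; 0.1 0.8]
y = [1 0; 0 1]
focal_ref(ŷ, y)  # ≈ 0.005 vs. xent_ref's ≈ 0.164: easy batch, small loss
```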