
Commit fb2d834

improve docs
1 parent 590a209 commit fb2d834

4 files changed: 27 additions & 3 deletions


.github/workflows/ci.yml

Lines changed: 1 addition & 0 deletions
@@ -70,6 +70,7 @@ jobs:
 using Documenter
 using Documenter: doctest
 DocMeta.setdocmeta!(Flux, :DocTestSetup, :(using Flux); recursive=true)
+DocMeta.setdocmeta!(Flux.Losses, :DocTestFilters, :(r"[0-9\.]+f0"); recursive=true)
 doctest(Flux)'
 - run: julia --project=docs docs/make.jl
   env:

docs/make.jl

Lines changed: 0 additions & 1 deletion
@@ -1,7 +1,6 @@
 using Documenter, Flux, NNlib, Functors, MLUtils

 DocMeta.setdocmeta!(Flux, :DocTestSetup, :(using Flux); recursive = true)
-DocMeta.setdocmeta!(Flux.Losses, :DocTestSetup, :(using Flux.Losses); recursive = true)

 # In the Losses module, doctests which differ in the printed Float32 values won't fail
 DocMeta.setdocmeta!(Flux.Losses, :DocTestFilters, :(r"[0-9\.]+f0"); recursive = true)
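
For context, the `:DocTestFilters` regex added here (and in the CI workflow above) strips printed `Float32` literals such as `0.009084041f0` from doctest output before comparison, so the Losses doctests don't fail when those values print slightly differently. A minimal sketch of running the doctests locally with the same setup, assuming Documenter and Flux are in the active environment:

```julia
# Sketch only: mirrors the setup from docs/make.jl and ci.yml in this commit.
using Documenter
using Documenter: doctest
using Flux

# Make `using Flux` implicit in every doctest block.
DocMeta.setdocmeta!(Flux, :DocTestSetup, :(using Flux); recursive = true)

# Ignore printed Float32 literals (e.g. `0.009084041f0`) when comparing doctest output.
DocMeta.setdocmeta!(Flux.Losses, :DocTestFilters, :(r"[0-9\.]+f0"); recursive = true)

doctest(Flux)
```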

docs/src/models/losses.md

Lines changed: 2 additions & 2 deletions
@@ -19,8 +19,8 @@ loss(ŷ, y)

 They are commonly passed as arrays of size `num_target_features x num_examples_in_batch`.

-Most loss functions in Flux have an optional argument `agg`, denoting the type of aggregation performed over the
-batch:
+Most losses in Flux have an optional argument `agg` accepting a function to be used as
+as a final aggregation:

 ```julia
 loss(ŷ, y) # defaults to `mean`
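
To illustrate the reworded `agg` description, here is a small hedged example of passing an aggregation function to a loss; the particular aggregations shown (`sum`, `identity`) are just illustrative choices, not part of this commit:

```julia
using Statistics: mean
using Flux.Losses: mse

ŷ = Float32[1.1, 1.9, 3.1]
y = Float32[1, 2, 3]

mse(ŷ, y)                  # defaults to `agg = mean` over the batch
mse(ŷ, y, agg = sum)       # total squared error instead of the mean
mse(ŷ, y, agg = identity)  # no aggregation: returns the elementwise squared errors
```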

src/losses/functions.jl

Lines changed: 24 additions & 0 deletions
@@ -8,6 +8,8 @@ Return the loss corresponding to mean absolute error:
 # Examples

 ```jldoctest
+julia> using Flux.Losses: mae
+
 julia> y_model = [1.1, 1.9, 3.1];

 julia> mae(y_model, 1:3)
@@ -31,6 +33,8 @@ See also: [`mae`](@ref), [`msle`](@ref), [`crossentropy`](@ref).
 # Examples

 ```jldoctest
+julia> using Flux.Losses: mse
+
 julia> y_model = [1.1, 1.9, 3.1];

 julia> y_true = 1:3;
@@ -57,6 +61,8 @@ Penalizes an under-estimation more than an over-estimatation.
 # Examples

 ```jldoctest
+julia> using Flux.Losses: msle
+
 julia> msle(Float32[1.1, 2.2, 3.3], 1:3)
 0.009084041f0

@@ -113,6 +119,8 @@ of label smoothing to binary distributions encoded in a single number.
 # Examples

 ```jldoctest
+julia> using Flux.Losses: label_smoothing, crossentropy
+
 julia> y = Flux.onehotbatch([1, 1, 1, 0, 1, 0], 0:1)
 2×6 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
 ⋅ ⋅ ⋅ 1 ⋅ 1
@@ -179,6 +187,8 @@ See also: [`logitcrossentropy`](@ref), [`binarycrossentropy`](@ref), [`logitbina
 # Examples

 ```jldoctest
+julia> using Flux.Losses: label_smoothing, crossentropy
+
 julia> y_label = Flux.onehotbatch([0, 1, 2, 1, 0], 0:2)
 3×5 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
 1 ⋅ ⋅ ⋅ 1
@@ -232,6 +242,8 @@ See also: [`binarycrossentropy`](@ref), [`logitbinarycrossentropy`](@ref), [`lab
 # Examples

 ```jldoctest
+julia> using Flux.Losses: crossentropy, logitcrossentropy
+
 julia> y_label = onehotbatch(collect("abcabaa"), 'a':'c')
 3×7 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
 1 ⋅ ⋅ 1 ⋅ 1 1
@@ -273,7 +285,10 @@ computing the loss.
 See also: [`crossentropy`](@ref), [`logitcrossentropy`](@ref).

 # Examples
+
 ```jldoctest
+julia> using Flux.Losses: binarycrossentropy, crossentropy
+
 julia> y_bin = Bool[1,0,1]
 3-element Vector{Bool}:
 1
@@ -314,7 +329,10 @@ Mathematically equivalent to
 See also: [`crossentropy`](@ref), [`logitcrossentropy`](@ref).

 # Examples
+
 ```jldoctest
+julia> using Flux.Losses: binarycrossentropy, logitbinarycrossentropy
+
 julia> y_bin = Bool[1,0,1];

 julia> y_model = Float32[2, -1, pi]
@@ -348,6 +366,8 @@ from the other. It is always non-negative, and zero only when both the distribut
 # Examples

 ```jldoctest
+julia> using Flux.Losses: kldivergence
+
 julia> p1 = [1 0; 0 1]
 2×2 Matrix{Int64}:
 1 0
@@ -467,6 +487,8 @@ For `γ == 0`, the loss is mathematically equivalent to [`binarycrossentropy`](@
 # Examples

 ```jldoctest
+julia> using Flux.Losses: binary_focal_loss
+
 julia> y = [0 1 0
 1 0 1]
 2×3 Matrix{Int64}:
@@ -509,6 +531,8 @@ For `γ == 0`, the loss is mathematically equivalent to [`crossentropy`](@ref).
 # Examples

 ```jldoctest
+julia> using Flux.Losses: focal_loss
+
 julia> y = [1 0 0 0 1
 0 1 0 1 0
 0 0 1 0 0]
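
Since this commit removes the blanket `using Flux.Losses` DocTestSetup from docs/make.jl, each doctest now imports the losses it uses explicitly, which is what the added `julia> using Flux.Losses: ...` lines do. A quick hedged sketch of what one of the updated examples evaluates to at the REPL (the exact printed digits may vary):

```julia
using Flux.Losses: mae

y_model = [1.1, 1.9, 3.1]

mae(y_model, 1:3)  # mean absolute error against the targets 1, 2, 3 — approximately 0.1
```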
