From 4691825fcfc8d4e6673d3b0cfffbd44b652f967b Mon Sep 17 00:00:00 2001
From: "Documenter.jl"
Date: Mon, 12 Feb 2024 03:19:41 +0000
Subject: [PATCH] build based on 83a840a

---
 dev/index.html        | 18 +++++++++---------
 dev/search/index.html |  2 +-
 dev/search_index.js   |  2 +-
 3 files changed, 11 insertions(+), 11 deletions(-)

diff --git a/dev/index.html b/dev/index.html
index a2a80f4..a550e51 100644
--- a/dev/index.html
+++ b/dev/index.html
@@ -3,19 +3,19 @@
 a ⊙ b

For arrays a and b, perform elementwise multiplication. a and b must have identical axes.

⊙ can be passed as an operator to higher-order functions.

Examples

julia> a = [2, 3]; b = [5, 7];
 
 julia> a ⊙ b
-2-element Array{Int64,1}:
+2-element Vector{Int64}:
  10
  21
 
 julia> a ⊙ [5]
-ERROR: DimensionMismatch("Axes of `A` and `B` must match, got (Base.OneTo(2),) and (Base.OneTo(1),)")
-[...]
+ERROR: DimensionMismatch: Axes of `A` and `B` must match, got (Base.OneTo(2),) and (Base.OneTo(1),)
+[...]
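
For illustration, a minimal sketch of passing ⊙ to a higher-order function, reusing the same a and b:

julia> reduce(⊙, [a, b])
2-element Vector{Int64}:
 10
 21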

See also hadamard!(y, a, b).

source
TensorCore.hadamard! — Function
hadamard!(dest, A, B)

Similar to hadamard(A, B) (which can also be written A ⊙ B), but stores its results in the pre-allocated array dest.
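
For illustration, a minimal sketch reusing a and b from the hadamard example above; dest is a freshly allocated array with matching axes:

julia> dest = similar(a);

julia> hadamard!(dest, a, b);

julia> dest
2-element Vector{Int64}:
 10
 21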

source
TensorCore.tensor — Function
tensor(A, B)
 A ⊗ B

Compute the tensor product of A and B. If C = A ⊗ B, then C[i1, ..., im, j1, ..., jn] = A[i1, ..., im] * B[j1, ..., jn].

For vectors v and w, the Kronecker product is related to the tensor product by kron(v,w) == vec(w ⊗ v) or w ⊗ v == reshape(kron(v,w), (length(w), length(v))).

Examples

julia> a = [2, 3]; b = [5, 7, 11];
 
 julia> a ⊗ b
-2×3 Array{Int64,2}:
+2×3 Matrix{Int64}:
  10  14  22
+ 15  21  33
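
The kron relation stated above can be checked directly for these a and b; a quick sketch (kron lives in the LinearAlgebra standard library):

julia> using LinearAlgebra

julia> kron(a, b) == vec(b ⊗ a)
true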

See also tensor!(Y,A,B).

source
TensorCore.tensor! — Function
tensor!(dest, A, B)

Similar to tensor(A, B) (which can also be written A ⊗ B), but stores its results in the pre-allocated array dest.
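
For illustration, a minimal sketch reusing a and b from the tensor example above; dest is pre-allocated here with the combined axes of a and b:

julia> dest = zeros(Int, 2, 3);

julia> tensor!(dest, a, b);

julia> dest == a ⊗ b
true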

source
TensorCore.boxdot — Function
boxdot(A,B) = A ⊡ B    # \boxdot

Generalised matrix multiplication: Contracts the last dimension of A with the first dimension of B, for any ndims(A) & ndims(B). If both are vectors, then it returns a scalar == sum(A .* B).

Examples

julia> A = rand(3,4,5); B = rand(5,6,7);
 
 julia> size(A ⊡ B)
 (3, 4, 6, 7)
@@ -27,13 +27,13 @@
 DimensionMismatch("neighbouring axes of `A` and `B` must match, got Base.OneTo(7) and Base.OneTo(3)")

This is the same behaviour as Mathematica's function Dot[A, B]. It is not identical to Python's numpy.dot(A, B), which contracts with the second-last dimension of B instead of the first, but both keep all the other dimensions. Unlike Julia's LinearAlgebra.dot, it does not conjugate A, so these two agree only for real-valued vectors.
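
A sketch of that conjugation difference with a small complex vector z (introduced here only for illustration; dot is from the LinearAlgebra standard library):

julia> using LinearAlgebra

julia> z = [1 + 2im, 3im];

julia> z ⊡ z
-12 + 4im

julia> dot(z, z)
14 + 0im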

When interacting with Adjoint vectors, this always obeys (x ⊡ y)' == y' ⊡ x', and hence may sometimes return another Adjoint vector. (And similarly for Transpose.)

julia> M = rand(5,5); v = rand(5);
 
 julia> typeof(v ⊡ M')
-Array{Float64,1}
+Vector{Float64} (alias for Array{Float64, 1})
 
 julia> typeof(M ⊡ v')  # adjoint of the previous line
-Adjoint{Float64,Array{Float64,1}}
+LinearAlgebra.Adjoint{Float64, Vector{Float64}}
 
 julia> typeof(v' ⊡ M')  # same as *, and equal to adjoint(M ⊡ v)
-Adjoint{Float64,Array{Float64,1}}
+LinearAlgebra.Adjoint{Float64, Vector{Float64}}
 
 julia> typeof(v' ⊡ v)
+Float64

See also boxdot!(Y,A,B), which is to ⊡ as mul! is to *.

source
TensorCore.boxdot! — Function
boxdot!(Y, A, B, α=1, β=0)

In-place version of boxdot, i.e. Y .= (A ⊡ B) .* α .+ Y .* β. Like 5-argument mul!, the use of α, β here requires Julia 1.3 or later.
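
For illustration, a minimal sketch reusing A and B from the boxdot examples above, with Y pre-allocated to size(A ⊡ B):

julia> Y = zeros(3, 4, 6, 7);

julia> boxdot!(Y, A, B);

julia> Y ≈ A ⊡ B
true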

source
diff --git a/dev/search/index.html b/dev/search/index.html
index 9f07941..6695cd5 100644
--- a/dev/search/index.html
+++ b/dev/search/index.html
@@ -1,2 +1,2 @@
-Search · TensorCore.jl
+Search · TensorCore.jl

 Loading search...

      diff --git a/dev/search_index.js b/dev/search_index.js index f69ea0b..75e1a00 100644 --- a/dev/search_index.js +++ b/dev/search_index.js @@ -1,3 +1,3 @@ var documenterSearchIndex = {"docs": -[{"location":"#TensorCore.jl","page":"Home","title":"TensorCore.jl","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"This package is intended as a lightweight foundation for tensor operations across the Julia ecosystem. Currently it exports three operations, hadamard, tensor and boxdot, and corresponding unicode operators ⊙, ⊗ and ⊡.","category":"page"},{"location":"#API","page":"Home","title":"API","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"","category":"page"},{"location":"","page":"Home","title":"Home","text":"hadamard\nhadamard!\ntensor\ntensor!\nboxdot\nboxdot!","category":"page"},{"location":"#TensorCore.hadamard","page":"Home","title":"TensorCore.hadamard","text":"hadamard(a, b)\na ⊙ b\n\nFor arrays a and b, perform elementwise multiplication. a and b must have identical axes.\n\n⊙ can be passed as an operator to higher-order functions.\n\nExamples\n\njulia> a = [2, 3]; b = [5, 7];\n\njulia> a ⊙ b\n2-element Array{Int64,1}:\n 10\n 21\n\njulia> a ⊙ [5]\nERROR: DimensionMismatch(\"Axes of `A` and `B` must match, got (Base.OneTo(2),) and (Base.OneTo(1),)\")\n[...]\n\nSee also hadamard!(y, a, b).\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.hadamard!","page":"Home","title":"TensorCore.hadamard!","text":"hadamard!(dest, A, B)\n\nSimilar to hadamard(A, B) (which can also be written A ⊙ B), but stores its results in the pre-allocated array dest.\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.tensor","page":"Home","title":"TensorCore.tensor","text":"tensor(A, B)\nA ⊗ B\n\nCompute the tensor product of A and B. If C = A ⊗ B, then C[i1, ..., im, j1, ..., jn] = A[i1, ... im] * B[j1, ..., jn].\n\nFor vectors v and w, the Kronecker product is related to the tensor product by kron(v,w) == vec(w ⊗ v) or w ⊗ v == reshape(kron(v,w), (length(w), length(v))).\n\nExamples\n\njulia> a = [2, 3]; b = [5, 7, 11];\n\njulia> a ⊗ b\n2×3 Array{Int64,2}:\n 10 14 22\n 15 21 33\n\nSee also tensor!(Y,A,B).\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.tensor!","page":"Home","title":"TensorCore.tensor!","text":"tensor!(dest, A, B)\n\nSimilar to tensor(A, B) (which can also be written A ⊗ B), but stores its results in the pre-allocated array dest.\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.boxdot","page":"Home","title":"TensorCore.boxdot","text":"boxdot(A,B) = A ⊡ B # \\boxdot\n\nGeneralised matrix multiplication: Contracts the last dimension of A with the first dimension of B, for any ndims(A) & ndims(B). If both are vectors, then it returns a scalar == sum(A .* B).\n\nExamples\n\njulia> A = rand(3,4,5); B = rand(5,6,7);\n\njulia> size(A ⊡ B)\n(3, 4, 6, 7)\n\njulia> typeof(rand(5) ⊡ rand(5))\nFloat64\n\njulia> try B ⊡ A catch err println(err) end\nDimensionMismatch(\"neighbouring axes of `A` and `B` must match, got Base.OneTo(7) and Base.OneTo(3)\")\n\nThis is the same behaviour as Mathematica's function Dot[A, B]. It is not identicaly to Python's numpy.dot(A, B), which contracts with the second-last dimension of B instead of the first, but both keep all the other dimensions. 
Unlike Julia's LinearAlgebra.dot, it does not conjugate A, so these two agree only for real-valued vectors.\n\nWhen interacting with Adjoint vectors, this always obeys (x ⊡ y)' == y' ⊡ x', and hence may sometimes return another Adjoint vector. (And similarly for Transpose.)\n\njulia> M = rand(5,5); v = rand(5);\n\njulia> typeof(v ⊡ M')\nArray{Float64,1}\n\njulia> typeof(M ⊡ v') # adjoint of the previous line\nAdjoint{Float64,Array{Float64,1}}\n\njulia> typeof(v' ⊡ M') # same as *, and equal to adjoint(M ⊡ v)\nAdjoint{Float64,Array{Float64,1}}\n\njulia> typeof(v' ⊡ v)\nFloat64\n\nSee also boxdot!(Y,A,B), which is to ⊡ as mul! is to *.\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.boxdot!","page":"Home","title":"TensorCore.boxdot!","text":"boxdot!(Y, A, B, α=1, β=0)\n\nIn-place version of boxdot, i.e. Y .= (A ⊡ B) .* β .+ Y .* α. Like 5-argument mul!, the use of α, β here requires Julia 1.3 or later.\n\n\n\n\n\n","category":"function"}] +[{"location":"#TensorCore.jl","page":"Home","title":"TensorCore.jl","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"This package is intended as a lightweight foundation for tensor operations across the Julia ecosystem. Currently it exports three operations, hadamard, tensor and boxdot, and corresponding unicode operators ⊙, ⊗ and ⊡.","category":"page"},{"location":"#API","page":"Home","title":"API","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"","category":"page"},{"location":"","page":"Home","title":"Home","text":"hadamard\nhadamard!\ntensor\ntensor!\nboxdot\nboxdot!","category":"page"},{"location":"#TensorCore.hadamard","page":"Home","title":"TensorCore.hadamard","text":"hadamard(a, b)\na ⊙ b\n\nFor arrays a and b, perform elementwise multiplication. a and b must have identical axes.\n\n⊙ can be passed as an operator to higher-order functions.\n\nExamples\n\njulia> a = [2, 3]; b = [5, 7];\n\njulia> a ⊙ b\n2-element Vector{Int64}:\n 10\n 21\n\njulia> a ⊙ [5]\nERROR: DimensionMismatch: Axes of `A` and `B` must match, got (Base.OneTo(2),) and (Base.OneTo(1),)\n[...]\n\nSee also hadamard!(y, a, b).\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.hadamard!","page":"Home","title":"TensorCore.hadamard!","text":"hadamard!(dest, A, B)\n\nSimilar to hadamard(A, B) (which can also be written A ⊙ B), but stores its results in the pre-allocated array dest.\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.tensor","page":"Home","title":"TensorCore.tensor","text":"tensor(A, B)\nA ⊗ B\n\nCompute the tensor product of A and B. If C = A ⊗ B, then C[i1, ..., im, j1, ..., jn] = A[i1, ... im] * B[j1, ..., jn].\n\nFor vectors v and w, the Kronecker product is related to the tensor product by kron(v,w) == vec(w ⊗ v) or w ⊗ v == reshape(kron(v,w), (length(w), length(v))).\n\nExamples\n\njulia> a = [2, 3]; b = [5, 7, 11];\n\njulia> a ⊗ b\n2×3 Matrix{Int64}:\n 10 14 22\n 15 21 33\n\nSee also tensor!(Y,A,B).\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.tensor!","page":"Home","title":"TensorCore.tensor!","text":"tensor!(dest, A, B)\n\nSimilar to tensor(A, B) (which can also be written A ⊗ B), but stores its results in the pre-allocated array dest.\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.boxdot","page":"Home","title":"TensorCore.boxdot","text":"boxdot(A,B) = A ⊡ B # \\boxdot\n\nGeneralised matrix multiplication: Contracts the last dimension of A with the first dimension of B, for any ndims(A) & ndims(B). 
If both are vectors, then it returns a scalar == sum(A .* B).\n\nExamples\n\njulia> A = rand(3,4,5); B = rand(5,6,7);\n\njulia> size(A ⊡ B)\n(3, 4, 6, 7)\n\njulia> typeof(rand(5) ⊡ rand(5))\nFloat64\n\njulia> try B ⊡ A catch err println(err) end\nDimensionMismatch(\"neighbouring axes of `A` and `B` must match, got Base.OneTo(7) and Base.OneTo(3)\")\n\nThis is the same behaviour as Mathematica's function Dot[A, B]. It is not identicaly to Python's numpy.dot(A, B), which contracts with the second-last dimension of B instead of the first, but both keep all the other dimensions. Unlike Julia's LinearAlgebra.dot, it does not conjugate A, so these two agree only for real-valued vectors.\n\nWhen interacting with Adjoint vectors, this always obeys (x ⊡ y)' == y' ⊡ x', and hence may sometimes return another Adjoint vector. (And similarly for Transpose.)\n\njulia> M = rand(5,5); v = rand(5);\n\njulia> typeof(v ⊡ M')\nVector{Float64} (alias for Array{Float64, 1})\n\njulia> typeof(M ⊡ v') # adjoint of the previous line\nLinearAlgebra.Adjoint{Float64, Vector{Float64}}\n\njulia> typeof(v' ⊡ M') # same as *, and equal to adjoint(M ⊡ v)\nLinearAlgebra.Adjoint{Float64, Vector{Float64}}\n\njulia> typeof(v' ⊡ v)\nFloat64\n\nSee also boxdot!(Y,A,B), which is to ⊡ as mul! is to *.\n\n\n\n\n\n","category":"function"},{"location":"#TensorCore.boxdot!","page":"Home","title":"TensorCore.boxdot!","text":"boxdot!(Y, A, B, α=1, β=0)\n\nIn-place version of boxdot, i.e. Y .= (A ⊡ B) .* β .+ Y .* α. Like 5-argument mul!, the use of α, β here requires Julia 1.3 or later.\n\n\n\n\n\n","category":"function"}] }