TensorCore.hadamard — Function

a ⊙ b
For arrays a and b, perform elementwise multiplication. a and b must have identical axes.

⊙ can be passed as an operator to higher-order functions.
Examples
julia> a = [2, 3]; b = [5, 7];
julia> a ⊙ b
2-element Vector{Int64}:
10
21
julia> a ⊙ [5]
ERROR: DimensionMismatch: Axes of `A` and `B` must match, got (Base.OneTo(2),) and (Base.OneTo(1),)
[...]
See also hadamard!(y, a, b).
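Because ⊙ is an ordinary binary operator, it can be handed to higher-order functions such as map. A minimal sketch, defining a local ⊙ with the documented behaviour so it runs without TensorCore installed:

```julia
# Local definition of ⊙ with the documented behaviour (elementwise product,
# axes must match); in practice TensorCore's own ⊙ would be used instead.
function ⊙(a, b)
    axes(a) == axes(b) || throw(DimensionMismatch("axes of `a` and `b` must match"))
    return a .* b
end

as = [[1, 2], [3, 4]]
bs = [[5, 6], [7, 8]]

# ⊙ passed as an operator to a higher-order function:
map(⊙, as, bs)   # [[5, 12], [21, 32]]
```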
TensorCore.hadamard! — Function

hadamard!(dest, A, B)

Similar to hadamard(A, B) (which can also be written A ⊙ B), but stores its results in the pre-allocated array dest.
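A sketch of the pre-allocated pattern this describes; hadamard_into! is a hypothetical local stand-in matching the docstring, so the snippet runs without TensorCore:

```julia
# Hypothetical stand-in for hadamard!: store the elementwise product in a
# pre-allocated dest, allocating no new array.
function hadamard_into!(dest, a, b)
    axes(dest) == axes(a) == axes(b) || throw(DimensionMismatch("axes must match"))
    dest .= a .* b
    return dest
end

a = [2, 3]; b = [5, 7]
dest = similar(a)            # pre-allocated output buffer
hadamard_into!(dest, a, b)   # dest == [10, 21]
```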
TensorCore.tensor — Function

tensor(A, B)
A ⊗ B

Compute the tensor product of A and B. If C = A ⊗ B, then C[i1, ..., im, j1, ..., jn] = A[i1, ..., im] * B[j1, ..., jn].
For vectors v and w, the Kronecker product is related to the tensor product by kron(v,w) == vec(w ⊗ v), or w ⊗ v == reshape(kron(v,w), (length(w), length(v))).
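That relation can be checked with Base alone. Here tensorprod is a hypothetical local name for the outer product that ⊗ computes on vectors, so no package is needed:

```julia
using LinearAlgebra  # for kron

# Hypothetical stand-in for ⊗ on vectors: the outer product,
# tensorprod(v, w)[i, j] == v[i] * w[j].
tensorprod(v, w) = v .* permutedims(w)

v = [2, 3]; w = [5, 7, 11]

# The documented relations between tensor and Kronecker products:
kron(v, w) == vec(tensorprod(w, v))                               # true
tensorprod(w, v) == reshape(kron(v, w), (length(w), length(v)))   # true
```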
Examples
julia> a = [2, 3]; b = [5, 7, 11];
julia> a ⊗ b
2×3 Matrix{Int64}:
10 14 22
15 21 33
See also tensor!(Y,A,B).
TensorCore.tensor! — Function

tensor!(dest, A, B)

Similar to tensor(A, B) (which can also be written A ⊗ B), but stores its results in the pre-allocated array dest.
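A sketch of the in-place idea for the vector case; tensorprod! is a hypothetical local definition mirroring the docstring, not the package's implementation:

```julia
# Hypothetical stand-in for tensor! on vectors: write the outer-product
# values into a pre-allocated dest, allocating nothing.
function tensorprod!(dest, a, b)
    axes(dest) == (axes(a, 1), axes(b, 1)) || throw(DimensionMismatch("dest has wrong axes"))
    for (j, bj) in pairs(b), (i, ai) in pairs(a)
        dest[i, j] = ai * bj
    end
    return dest
end

a = [2, 3]; b = [5, 7, 11]
dest = Matrix{Int}(undef, length(a), length(b))
tensorprod!(dest, a, b)   # dest == [10 14 22; 15 21 33]
```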
TensorCore.boxdot — Function

boxdot(A,B) = A ⊡ B  # \boxdot

Generalised matrix multiplication: contracts the last dimension of A with the first dimension of B, for any ndims(A) and ndims(B). If both are vectors, it returns a scalar == sum(A .* B).
Examples
julia> A = rand(3,4,5); B = rand(5,6,7);
julia> size(A ⊡ B)
(3, 4, 6, 7)
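The contraction described above can be sketched with Base alone as one reshape–matmul–reshape; boxdot_sketch is a hypothetical name, not the package's implementation:

```julia
# Base-only sketch of the contraction A ⊡ B: fold the leading dims of A and
# the trailing dims of B into one matrix multiply, then restore the shape.
function boxdot_sketch(A::AbstractArray, B::AbstractArray)
    k = size(A)[end]
    k == size(B, 1) || throw(DimensionMismatch("inner dimensions must match"))
    C = reshape(A, :, k) * reshape(B, k, :)
    return reshape(C, size(A)[1:end-1]..., size(B)[2:end]...)
end

A = rand(3, 4, 5); B = rand(5, 6, 7);
size(boxdot_sketch(A, B))   # (3, 4, 6, 7)
```

(For two vectors the package returns a plain scalar; this sketch is aimed at the array–array case, where it agrees with ordinary matrix multiplication when both inputs are matrices.)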
DimensionMismatch("neighbouring axes of `A` and `B` must match, got Base.OneTo(7) and Base.OneTo(3)")
This is the same behaviour as Mathematica's function Dot[A, B]. It is not identical to Python's numpy.dot(A, B), which contracts with the second-last dimension of B instead of the first, although both keep all the other dimensions. Unlike Julia's LinearAlgebra.dot, it does not conjugate A, so these two agree only for real-valued vectors.
When interacting with Adjoint vectors, this always obeys (x ⊡ y)' == y' ⊡ x', and hence may sometimes return another Adjoint vector. (And similarly for Transpose.)
julia> M = rand(5,5); v = rand(5);
julia> typeof(v ⊡ M')
Vector{Float64} (alias for Array{Float64, 1})
julia> typeof(M ⊡ v') # adjoint of the previous line
LinearAlgebra.Adjoint{Float64, Vector{Float64}}
julia> typeof(v' ⊡ M') # same as *, and equal to adjoint(M ⊡ v)
LinearAlgebra.Adjoint{Float64, Vector{Float64}}
julia> typeof(v' ⊡ v)
Float64
See also boxdot!(Y,A,B), which is to ⊡ as mul! is to *.
TensorCore.boxdot! — Function

boxdot!(Y, A, B, α=1, β=0)

In-place version of boxdot, i.e. Y .= (A ⊡ B) .* α .+ Y .* β. Like 5-argument mul!, the use of α, β here requires Julia 1.3 or later.
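For reference, this is the same α, β accumulation convention as LinearAlgebra's 5-argument mul!, shown here with Base-stdlib code only (no TensorCore needed):

```julia
using LinearAlgebra

# mul!(Y, A, B, α, β) updates Y in place to A*B .* α .+ Y .* β,
# the convention that boxdot! extends to ⊡.
A = [1.0 0.0; 0.0 1.0]    # identity, so A*B == B
B = [2.0 3.0; 4.0 5.0]
Y = ones(2, 2)
mul!(Y, A, B, 2.0, 1.0)   # Y == 2*(A*B) + 1*Y_old == [5.0 7.0; 9.0 11.0]
```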
This document was generated with Documenter.jl on Saturday 26 December 2020. Using Julia version 1.5.3.