joshuagrochow:
No one defines a #matrix as "a thing that transforms like a matrix". Why define tensors that way?
Array = numbers in a (possibly high-dim) grid
Matrix = array representation of a linear map* in a chosen basis
Tensor = array representation of a multilinear map in a chosen basis
(* or linear endomorphism, or bilinear function, but we'll get there.)

Vectors = 1-tensors, but not all 1-index arrays are vectors
Matrices = 2-tensors, but not all 2-index arrays are matrices

Similarly, not all k-index arrays are tensors. Some examples:
Christoffel symbols don't form a tensor because they aren't (multi)linear in all of their arguments.
Most "tensors" in #MachineLearning #AI aren't tensors because they aren't multilinear; they are just multi-dim arrays of numbers. To say an array is (or represents) a tensor is to endow it with additional multilinear structure, just as endowing a 2-index array with linear structure is what makes it a matrix.
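Multilinearity is a property you can actually test. A minimal sketch (my own illustration, not from the thread): a 2-index array T represents a bilinear map via contraction, f(u, v) = Σᵢⱼ T[i,j]·u[i]·v[j], and that map is automatically linear in each slot; a typical deep-learning op like ReLU fails the same check, so the arrays flowing through it carry no multilinear structure.

```python
import numpy as np

# A 2-index array interpreted as a bilinear map via contraction.
T = np.arange(9.0).reshape(3, 3)
u = np.array([1.0, -2.0, 3.0])
v = np.array([0.5, 1.0, -1.0])
w = np.array([-1.0, 1.0, 2.0])
a = 2.5

# f(x, y) = sum_ij T[i,j] x[i] y[j] -- linear in each argument by construction.
f = lambda x, y: np.einsum("ij,i,j->", T, x, y)
print(np.isclose(f(a * u + w, v), a * f(u, v) + f(w, v)))  # True: linear in slot 1

# ReLU, a typical "tensor" op in ML, is not linear in its argument.
relu = lambda x: np.maximum(x, 0.0)
print(np.allclose(relu(a * u + w), a * relu(u) + relu(w)))  # False: not linear
```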
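The footnote above (linear map vs endomorphism vs bilinear function) can also be seen concretely. A hedged sketch of my own: the same 2x2 array transforms differently under a change of basis depending on which kind of tensor it represents, so the array alone doesn't determine the tensor; the transformation law does.

```python
import numpy as np

# The same array A, read as two different tensors.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
P = np.array([[1.0, 1.0], [0.0, 1.0]])  # arbitrary invertible change of basis

# As a linear endomorphism, A transforms as A' = P^{-1} A P.
endo = np.linalg.inv(P) @ A @ P

# As a bilinear form, A transforms as A' = P^T A P.
bilin = P.T @ A @ P

print(np.allclose(endo, bilin))  # False: same array, different tensors
```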
(1/4)