joshuagrochow, @joshuagrochow@mathstodon.xyz

No one defines a #matrix as "a thing that transforms like a matrix". Why define tensors that way?

Array=numbers in a (possibly high-dim) grid
Matrix=array representation of a linear map* in a chosen basis
Tensor=array representation of a multilinear map in a chosen basis

(* or linear endomorphism, or bilinear function, but we'll get there.)

Vectors=1-tensors, but not all 1-index arrays are vectors
Matrices=2-tensors, but not all 2-index arrays are matrices

Similarly, not all k-index arrays are tensors. Some examples:

Christoffel symbols aren't a tensor because they aren't (multi)linear in all of their arguments.

Most "tensors" in #MachineLearning #AI aren't tensors b/c they aren't multilinear - they are just multi-dim arrays of numbers. To say an array is (or represents) a tensor is to endow it with additional multilinear structure, just as calling an array a matrix endows it with linear structure.

(1/4)

#tensors #matrix #algebra

joshuagrochow, @joshuagrochow@mathstodon.xyz

Oops.

I just found a great chapter in the Handbook of Linear Algebra by Lek-Heng Lim: https://www.stat.uchicago.edu/~lekheng/work/hla.pdf

where he says "tensor" is the generic term for multilinear things, and "hypermatrix" should be what I called "tensor" in this thread. Maybe I agree w/ that!

mrdk, @mrdk@mathstodon.xyz

@joshuagrochow “No one defines a #matrix as ‘a thing that transforms like a matrix’” — maybe because in order to define a transformation, you already need to have a matrix? 🤔

joshuagrochow, @joshuagrochow@mathstodon.xyz

@mrdk I don't think so (you can define linear transformation etc. without choosing a basis, but a matrix involves a choice of basis), but I see your point.

joshuagrochow, @joshuagrochow@mathstodon.xyz

Now, how do we get the whole "a tensor is a thing that transforms like a tensor"? Well, let's start with matrices. How a matrix changes under change of basis tells you what kind of multilinear thing the matrix is representing, and the same is true of tensors. Examples:

If a matrix M represents a linear map L:V→W, and we change basis in V by an invertible matrix A in GL(V) and in W by an invertible B in GL(W), then M changes to B M A^{-1} (where I'm writing my inputs as column vectors on the right).
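A minimal NumPy sketch of this rule (my convention here, not from the thread: A and B send old coordinates to new coordinates, so x_new = A x_old). Random matrices are invertible with probability 1, so I don't check invertibility explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)

# M represents L: V -> W in the old bases (dim V = 3, dim W = 2).
M = rng.standard_normal((2, 3))
A = rng.standard_normal((3, 3))  # change of basis on V (old coords -> new coords)
B = rng.standard_normal((2, 2))  # change of basis on W

# The transformed representation of the same linear map L:
M_new = B @ M @ np.linalg.inv(A)

# One vector, expressed in old and new V-coordinates:
x_old = rng.standard_normal(3)
x_new = A @ x_old

# Applying L then converting to new W-coordinates agrees with
# converting first and applying M_new:
assert np.allclose(M_new @ x_new, B @ (M @ x_old))
```

The point: M and M_new are different arrays, but they represent the same linear map L, just in different bases.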

In contrast, if a matrix M represents a linear endomorphism L:V→V, then when we change basis in V by an invertible matrix A in GL(V), M becomes AMA^{-1}.
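One quick consequence of the AMA^{-1} (similarity) rule, sketched in NumPy: basis-independent data of the endomorphism — trace, determinant, eigenvalues — is unchanged, which is exactly why those quantities are properties of L and not of the chosen basis:

```python
import numpy as np

rng = np.random.default_rng(1)

M = rng.standard_normal((3, 3))  # represents an endomorphism L: V -> V
A = rng.standard_normal((3, 3))  # invertible change of basis on V

# Similarity transformation: the new representation of the same L.
M_new = A @ M @ np.linalg.inv(A)

# Trace and determinant (and the whole characteristic polynomial,
# hence the eigenvalues) are invariant under similarity:
assert np.isclose(np.trace(M_new), np.trace(M))
assert np.isclose(np.linalg.det(M_new), np.linalg.det(M))
```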

If a matrix M represents a bilinear map V⊗V→F (by (x,y)→x^t M y), then under change of basis A^{-1}, M becomes A^t M A.
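And a sketch of the congruence rule A^t M A (again my convention: x_old = A x_new, i.e. the change of basis is A^{-1}). The check is that the scalar x^t M y doesn't depend on the basis:

```python
import numpy as np

rng = np.random.default_rng(2)

M = rng.standard_normal((3, 3))  # represents the bilinear form (x, y) -> x^T M y
A = rng.standard_normal((3, 3))  # x_old = A @ x_new, i.e. change of basis A^{-1}

# Congruence transformation: the new representation of the same form.
M_new = A.T @ M @ A

x_new = rng.standard_normal(3)
y_new = rng.standard_normal(3)
x_old, y_old = A @ x_new, A @ y_new

# The value of the bilinear form is basis-independent:
assert np.isclose(x_new @ M_new @ y_new, x_old @ M @ y_old)
```

Note this is a different transformation law than AMA^{-1} above, even though both are square matrices — which is the whole point: the transformation law tells you which kind of object the array represents.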

(2/4)

#tensors #matrix #algebra

joshuagrochow, @joshuagrochow@mathstodon.xyz

How the matrix transforms is "equivalent data" to "what kind of multilinear thing the matrix represents."

(3/4)

#tensors #matrix #algebra

joshuagrochow, @joshuagrochow@mathstodon.xyz

e.g. if I tell you I have a matrix M and under change of basis it transforms as A^t M A, then I know it's representing a bilinear map of the form V⊗V→F. etc.

Similarly, if I tell you what kind of multilinear "thing" a tensor T is representing, then that tells you how it transforms under change of basis, and vice versa. For 3-tensors, there are several natural possibilities (up to permuting indices):

U⊗V⊗W→F
U⊗U⊗V→F
U⊗U⊗U→F (trilinear map)
U⊗V→W (bilinear map)
U⊗U→V (bilinear map)
U⊗V→U (linear action of V on U)
U⊗U→U (algebra, not nec. associative)
U→V⊗W
U→U⊗V (coaction)
U→U⊗U (coalgebra, not nec. coassociative)
F→U⊗V⊗W
F→U⊗U⊗V
F→U⊗U⊗U
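To make one of these concrete (a NumPy sketch; names and dimensions are mine): a 3-array T of shape (dim U, dim V, dim W) can be read as the bilinear map U⊗V→W by contracting the first two indices against the inputs. The multilinear structure is what you're asserting when you call the array a tensor:

```python
import numpy as np

rng = np.random.default_rng(3)

dimU, dimV, dimW = 2, 3, 4
# A 3-array, read as a bilinear map U x V -> W:
T = rng.standard_normal((dimU, dimV, dimW))

def bilinear(u, v):
    # w_k = sum_{i,j} u_i v_j T[i,j,k]
    return np.einsum('i,j,ijk->k', u, v, T)

u, u2 = rng.standard_normal(dimU), rng.standard_normal(dimU)
v = rng.standard_normal(dimV)

# Bilinearity: linear in each argument separately.
a, b = 2.0, -1.5
assert np.allclose(bilinear(a * u + b * u2, v),
                   a * bilinear(u, v) + b * bilinear(u2, v))
```

The same array T could equally be read as U⊗V⊗W*→F or U→V⊗W from the list above; choosing the reading fixes how T must transform under change of basis in each factor.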

(4/4)

#tensors #matrix #algebra
