# TensorFlux.jl

Differential geometry with mathematical notation in Julia

## Metric

The metric is a special type of (0, 2)-tensor: it encodes the concepts of distance and angle, and provides a mapping between vectors and covectors. Its entries are formed by the dot products of the basis vectors.
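Concretely, in a given basis the entries are g_ij = e_i ⋅ e_j. As a quick cross-check of that rule outside TensorFlux, the same construction in NumPy (names here are illustrative):

```python
import numpy as np

# The standard basis of Euclidean 2-space
e = [np.array([1, 0]), np.array([0, 1])]

# Metric entries are pairwise dot products: g_ij = e_i . e_j
g = np.array([[np.dot(a, b) for b in e] for a in e])
print(g)  # the identity matrix, i.e. the Kronecker delta
```

With an orthonormal basis the result is always the identity; a skewed basis such as ([1, 0], [-1, 1]) produces off-diagonal entries.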

To create the metric, you first need a basis. Then you can use the `metric()` function. For the standard basis and the standard inner product in Euclidean space, the entries of the metric equal the Kronecker delta.

```julia
julia> std_euclidean = Basis([
           Tensor([1, 0]),
           Tensor([0, 1])
       ])
julia> g = metric(std_euclidean)
(0, 2)-Tensor:
[1 0; 0 1]
    (:co, :co)
```

For the standard coordinate basis on a sphere, the metric varies over the surface, degenerating at the poles. We can find the metric by using Symbolics.jl.

```julia
julia> using Symbolics
julia> @variables θ φ
julia> std_sphere = Basis([
           Tensor([1, 0]),
           Tensor([0, sin(θ)])
       ])
julia> g = metric(std_sphere)
(0, 2)-Tensor:
Real[1 0; 0 sin(θ)^2]
    (:co, :co)
```

You can also pass in your own inner product. A useful one for physics is the Minkowski inner product. TensorFlux.jl includes `minkowski()`, so the Minkowski metric is given by

```julia
julia> std_spacetime = Basis([
           Tensor([1, 0, 0, 0]),
           Tensor([0, 1, 0, 0]),
           Tensor([0, 0, 1, 0]),
           Tensor([0, 0, 0, 1])
       ])
julia> η = metric(std_spacetime, minkowski)
(0, 2)-Tensor:
[-1 0 0 0; 0 1 0 0; 0 0 1 0; 0 0 0 1]
    (:co, :co)
```
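The η shown above is just the standard basis paired with the Minkowski inner product ⟨x, y⟩ = −x⁰y⁰ + x¹y¹ + x²y² + x³y³. A NumPy sketch of that pairing, independent of TensorFlux (the `minkowski` function here is a local illustration, not the package's):

```python
import numpy as np

def minkowski(x, y):
    # Signature (-, +, +, +)
    return -x[0]*y[0] + x[1]*y[1] + x[2]*y[2] + x[3]*y[3]

# Standard basis vectors of 4-dimensional spacetime
e = [np.eye(4)[i] for i in range(4)]

# Metric entries from the custom inner product
eta = np.array([[minkowski(a, b) for b in e] for a in e])
print(eta)  # diag(-1, 1, 1, 1)
```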

The metric also provides a map between vectors and covectors through a process called index raising and lowering. In standard Euclidean space with the standard inner product, the metric is the identity, so a vector and its corresponding covector have the same components, but this is not true in general.

Index lowering maps vectors to covectors by contracting with the metric. Index raising maps covectors back to vectors by contracting with the inverse metric, which can be found using `inv()`. As expected, the two operations are inverses of each other.

```julia
julia> e′ = Basis([
           Tensor([1, 0]),
           Tensor([-1, 1]),
       ])
julia> g = metric(e′) # Metric in the e′ basis
julia> G = inv(g) # Inverse metric
julia> v = Tensor([1, 3]) # Components in the e′ basis
julia> ω = (g[:i, :j] * v[:i]).tensor # Map v to a covector
(0, 1)-Tensor:
[-2, 5]
    (:co,)
julia> u = (G[:k, :l] * ω[:k]).tensor # Map the covector back to a vector
(1, 0)-Tensor:
Num[1, 3]
    (:contra,)
```

Contraction with leftover indices returns an `IndexedTensor` to be used in subsequent contractions, so `.tensor` returns the underlying `Tensor`.
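As a sanity check, the lowering and raising above is plain matrix arithmetic, reproducible in NumPy independently of TensorFlux:

```python
import numpy as np

# The e′ basis and its metric g_ij = e_i . e_j
e1, e2 = np.array([1, 0]), np.array([-1, 1])
g = np.array([[e1 @ e1, e1 @ e2],
              [e2 @ e1, e2 @ e2]])   # [[1, -1], [-1, 2]]
G = np.linalg.inv(g)                 # inverse metric [[2, 1], [1, 1]]

v = np.array([1, 3])                 # vector components in the e′ basis
omega = g @ v                        # lower the index: [-2, 5]
u = G @ omega                        # raise it back: [1, 3]
print(omega, u)
```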

To get the components in the standard basis, you can contract with the basis.

```julia
julia> v[:i] * e′[:i]
(1, 0)-Tensor:
[-2, 3]
    (:contra,)
```

Note that these are also the components of ω in the standard basis, because the standard-basis metric is the Kronecker delta.

## Derivatives

The `PartialDerivative` struct allows for contraction with tensors. To initialize it, you pass it coordinates. Contracting an (m, n)-tensor with `∂` then returns an (m, n + 1)-tensor, where index i of the new slot corresponds to differentiation with respect to the i-th coordinate of `∂`.

```julia
julia> @variables u v
julia> ∂ = PartialDerivative((u, v))
julia> X = Tensor([u * v, v^2 - 1])
julia> ∂[:i] * X[:j]
(1, 1)-Tensor:
Num[v u; 0 2v]
    (:contra, :co)
    (:j,), (:i,)
```
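The result above is just the Jacobian of X, with entries ∂X^j/∂x^i. A SymPy cross-check, independent of TensorFlux:

```python
import sympy as sp

u, v = sp.symbols('u v')
X = sp.Matrix([u * v, v**2 - 1])

# Jacobian: rows are components of X, columns are coordinates (u, v)
J = X.jacobian([u, v])
print(J)  # Matrix([[v, u], [0, 2*v]])
```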

Partial differentiation differentiates only the tensor components. For a non-constant basis, use the covariant derivative described in the next section.

## Connections

The covariant derivative, also called the connection, differentiates tensors with non-constant basis vectors. One important example is the Levi-Civita connection, the unique torsion-free and metric-compatible connection on a manifold.

The `CovariantDerivative` struct takes in connection coefficients and a partial derivative. For the Levi-Civita connection, `christoffel()` returns the connection coefficients (the Christoffel symbols).

```julia
julia> @variables θ φ
julia> e = Basis([
           Tensor([1, 0]),
           Tensor([0, sin(θ)])
       ])
julia> ∂ = PartialDerivative((θ, φ))
julia> Γ = christoffel((θ, φ), e)
julia> ∇ = CovariantDerivative(Γ, ∂)
julia> X = Tensor([sin(θ), 0])
julia> ∇[:i] * X[:j]
(1, 1)-Tensor:
Num[cos(θ) 0; 0 cos(θ)]
    (:contra, :co)
    (:j,), (:i,)
```
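As an independent check of this result, a SymPy sketch that builds the Christoffel symbols from the sphere metric diag(1, sin(θ)²) and applies ∇_i X^j = ∂_i X^j + Γ^j_{ik} X^k:

```python
import sympy as sp

theta, phi = sp.symbols('theta phi')
coords = [theta, phi]
n = 2
g = sp.Matrix([[1, 0], [0, sp.sin(theta)**2]])  # sphere metric from the example
g_inv = g.inv()

# Christoffel symbols: Γ^k_{ij} = ½ g^{kl}(∂_i g_{jl} + ∂_j g_{il} − ∂_l g_{ij})
Gamma = [[[sp.simplify(sum(g_inv[k, l] * (sp.diff(g[j, l], coords[i])
                                          + sp.diff(g[i, l], coords[j])
                                          - sp.diff(g[i, j], coords[l]))
                           for l in range(n)) / 2)
           for j in range(n)]
          for i in range(n)]
         for k in range(n)]

X = [sp.sin(theta), 0]  # the vector field from the example

# Covariant derivative: nabla[i][j] = ∂_i X^j + Γ^j_{ik} X^k
nabla = [[sp.simplify(sp.diff(X[j], coords[i])
                      + sum(Gamma[j][i][k] * X[k] for k in range(n)))
          for j in range(n)]
         for i in range(n)]
print(nabla)  # expect [[cos(theta), 0], [0, cos(theta)]]
```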

## Curvature

There are a few measures of curvature. The most explicit is the Riemann curvature tensor, given by `riemann()`. It is a (1, 3)-tensor that measures how parallel transport rotates a vector. In its unsimplified form it can be a mess, so TensorFlux adapts `Symbolics.simplify` to act on each tensor component.

```julia
julia> @variables θ φ
julia> e = Basis([
           Tensor([1, 0]),
           Tensor([0, sin(θ)])
       ])
julia> R = riemann((θ, φ), e)
julia> simplify(R)
(1, 3)-Tensor:
Num[0.0 0.0; 0.0 -1.0;;; 0.0 sin(θ)^2; 0 0;;;; 0.0 0; 1.0 0;;; -(sin(θ)^2) 0; 0 0]
    (:contra, :co, :co, :co)
```

Taking a trace of the Riemann curvature tensor yields the Ricci curvature tensor, a (0, 2)-tensor. It is returned by `ricci()`.

```julia
julia> Ric = ricci((θ, φ), e)
julia> simplify(Ric)
(0, 2)-Tensor:
Num[1.0 0.0; 0.0 sin(θ)^2]
    (:co, :co)
```

Raising an index of the Ricci curvature tensor with the inverse metric and taking the trace again yields the Ricci scalar, returned by `ricci_scalar()`.

```julia
julia> Ric_scalar = ricci_scalar((θ, φ), e)
julia> simplify(Ric_scalar)
2.0
```
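All of these sphere results can be cross-checked from first principles with SymPy, via R^ρ_{σμν} = ∂_μ Γ^ρ_{νσ} − ∂_ν Γ^ρ_{μσ} + Γ^ρ_{μλ}Γ^λ_{νσ} − Γ^ρ_{νλ}Γ^λ_{μσ} and its traces (independent of TensorFlux):

```python
import sympy as sp

theta, phi = sp.symbols('theta phi')
coords = [theta, phi]
n = 2
g = sp.Matrix([[1, 0], [0, sp.sin(theta)**2]])  # sphere metric
g_inv = g.inv()

# Christoffel symbols of the Levi-Civita connection
Gamma = [[[sum(g_inv[k, l] * (sp.diff(g[j, l], coords[i])
                              + sp.diff(g[i, l], coords[j])
                              - sp.diff(g[i, j], coords[l]))
               for l in range(n)) / 2
           for j in range(n)]
          for i in range(n)]
         for k in range(n)]

def riem(r, s, m, v):
    """Riemann component R^r_{s m v}."""
    expr = (sp.diff(Gamma[r][v][s], coords[m]) - sp.diff(Gamma[r][m][s], coords[v])
            + sum(Gamma[r][m][l] * Gamma[l][v][s] - Gamma[r][v][l] * Gamma[l][m][s]
                  for l in range(n)))
    return sp.simplify(expr)

# Ricci tensor: trace over the first and third slots
Ric = sp.Matrix(n, n, lambda s, v: sp.simplify(sum(riem(m, s, m, v) for m in range(n))))
# Ricci scalar: trace with the inverse metric
R_scalar = sp.simplify(sum(g_inv[s, v] * Ric[s, v] for s in range(n) for v in range(n)))
print(Ric)       # expect diag(1, sin(theta)**2)
print(R_scalar)  # expect 2
```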

A combination of the Ricci tensor and the Ricci scalar makes up the Einstein tensor. The Einstein tensor is special in that it has zero divergence everywhere, where the divergence is found by taking the trace with the covariant derivative. Once again, `einstein()` computes this in one step.

```julia
julia> G = einstein((θ, φ), e)
julia> simplify(G)
(0, 2)-Tensor:
Num[0.0 0.0; 0.0 0]
    (:co, :co)
```
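Plugging the sphere's Ricci tensor and Ricci scalar into the defining formula G = Ric − ½Rg makes the vanishing explicit; a minimal SymPy check:

```python
import sympy as sp

theta = sp.symbols('theta')
g = sp.Matrix([[1, 0], [0, sp.sin(theta)**2]])    # sphere metric
Ric = sp.Matrix([[1, 0], [0, sp.sin(theta)**2]])  # Ricci tensor from the example above
R = 2                                             # Ricci scalar from the example above

# Einstein tensor: G = Ric - (1/2) R g
G = Ric - sp.Rational(1, 2) * R * g
print(G)  # the zero matrix
```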

On a sphere it turns out not to be interesting: in two dimensions the Einstein tensor vanishes identically, since there Ric = ½Rg.

## Lie Bracket

The Lie bracket measures the failure of vector fields to commute. Starting from a point, it measures the difference between flowing along X and then Y, and flowing along Y and then X. For convenience, `lie()` calculates this difference.

```julia
julia> @variables u v
julia> ∂ = PartialDerivative((u, v))
julia> X = Tensor([u, v])
julia> Y = Tensor([u^2, -v + 2])
julia> lie(X, Y, ∂)
(1, 0)-Tensor:
Num[u^2, -2]
    (:contra,)
```
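A SymPy cross-check of this bracket via the coordinate formula [X, Y]^i = X^j ∂_j Y^i − Y^j ∂_j X^i, independent of TensorFlux:

```python
import sympy as sp

u, v = sp.symbols('u v')
coords = [u, v]
X = [u, v]
Y = [u**2, -v + 2]

# [X, Y]^i = X^j ∂_j Y^i - Y^j ∂_j X^i
bracket = [sp.simplify(sum(X[j] * sp.diff(Y[i], coords[j])
                           - Y[j] * sp.diff(X[i], coords[j])
                           for j in range(2)))
           for i in range(2)]
print(bracket)  # [u**2, -2]
```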