
Tensor Core

The Tensor is the fundamental data structure in Gradien. It represents a multi-dimensional array and supports automatic differentiation.

Constructors

.zeros Parallel

Creates a new tensor filled with zeros.

```lua
(shape: {number}, dtype: "f32"|"f64"|"i32"?, requiresGrad: boolean?) -> Tensor
```
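A minimal usage sketch. The require path, the default dtype, and autograd opt-in at construction are assumptions, not confirmed by this reference:

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

-- 2x3 tensor of zeros; dtype defaults are an assumption
local z = Tensor.zeros({2, 3})

-- opt into gradient tracking at construction time
local w = Tensor.zeros({2, 3}, "f32", true)
```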

.ones Parallel

Creates a new tensor filled with ones.

```lua
(shape: {number}, dtype: "f32"|"f64"|"i32"?, requiresGrad: boolean?) -> Tensor
```

.fromArray

Creates a tensor from a flat table.

```lua
(data: {number}, shape: {number}, dtype: "f32"|"f64"?, requiresGrad: boolean?) -> Tensor
```
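A sketch of building a tensor from a flat table. Row-major layout of `data` and the require path are assumptions:

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

-- flat data interpreted as a 2x2 matrix; row-major ordering is an assumption
local t = Tensor.fromArray({1, 2, 3, 4}, {2, 2})
```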

.randn Parallel

Creates a tensor with random numbers from a normal distribution (mean 0, std 1).

```lua
(shape: {number}, dtype: "f32"|"f64"?, requiresGrad: boolean?) -> Tensor
```
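A typical use is initializing trainable parameters. The require path is an assumption:

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

-- 64x32 weight matrix sampled from N(0, 1), tracked for gradients
local W = Tensor.randn({64, 32}, "f32", true)
```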

.empty

Creates an uninitialized tensor (allocated but not zeroed).

```lua
(shape: {number}, dtype: "f32"|"f64"?, requiresGrad: boolean?) -> Tensor
```

Methods

:reshape

Returns a new tensor with the same data but a different shape.

```lua
(self: Tensor, newShape: {number}) -> Tensor
```
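A sketch, assuming the element count must match between the old and new shapes (common reshape semantics; not stated here):

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

local t = Tensor.fromArray({1, 2, 3, 4, 5, 6}, {2, 3})
local r = t:reshape({3, 2}) -- same 6 elements, new shape
```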

:expand Parallel

Returns a new view of the tensor with singleton dimensions expanded to a larger size.

```lua
(self: Tensor, newShape: {number}) -> Tensor
```
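A sketch of expanding a singleton dimension. Per the description this returns a view, so no data is copied; the require path is an assumption:

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

-- a 1x3 row viewed as 4 identical rows, without copying
local row = Tensor.fromArray({1, 2, 3}, {1, 3})
local grid = row:expand({4, 3})
```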

:transpose Parallel

Returns a view of the tensor with the two given dimensions swapped.

```lua
(self: Tensor, dim1: number?, dim2: number?) -> Tensor
```
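A sketch of swapping two dimensions. 1-based dimension indices (conventional in Lua) and the require path are assumptions:

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

local m = Tensor.randn({2, 5})
local mt = m:transpose(1, 2) -- shape becomes {5, 2}; 1-based dims are an assumption
```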

:slice Parallel

Extracts a sub-tensor along the given dimension.

```lua
(self: Tensor, dim: number, startIdx: number, endIdx: number?, step: number?) -> Tensor
```
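A sketch of slicing. 1-based indexing and an inclusive `endIdx` are assumptions, as is the require path:

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

local t = Tensor.randn({10, 4})
-- rows 1..5 along dimension 1; inclusive end and 1-based indexing are assumptions
local head = t:slice(1, 1, 5)
```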

:narrow

Returns a new tensor that is a narrowed version of this tensor along dimension dim, spanning length elements starting at startIdx.

```lua
(self: Tensor, dim: number, startIdx: number, length: number) -> Tensor
```
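A sketch of narrowing, which takes a start index and a length rather than an end index. 1-based indexing and the require path are assumptions:

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

local t = Tensor.randn({10, 4})
-- 3 columns starting at column 2; 1-based indexing is an assumption
local mid = t:narrow(2, 2, 3)
```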

:sum Parallel

Returns the sum of the elements of the tensor. If dim is given, reduces along that dimension only.

```lua
(self: Tensor, dim: number?) -> Tensor
```
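A sketch of both reduction forms. Per-dimension reduction semantics and the require path are assumptions:

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

local t = Tensor.ones({2, 3})
local total = t:sum()    -- sum over all elements
local perDim = t:sum(1)  -- reduce along dimension 1 (assumed semantics)
```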

:contiguous

Returns a tensor, contiguous in memory, containing the same data as this tensor.

```lua
(self: Tensor) -> Tensor
```

:is_contiguous

Returns true if the tensor is contiguous in memory in C (row-major) order.

```lua
(self: Tensor) -> boolean
```

:detach

Returns a new Tensor, detached from the current computation graph. The result will never require gradients.

```lua
(self: Tensor) -> Tensor
```
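A sketch of detaching a tensor from autograd. Whether the detached tensor shares storage with the original is an assumption (common in autograd libraries, but not stated here):

```lua
local Tensor = require("Gradien").Tensor -- hypothetical require path

local x = Tensor.randn({3}, "f32", true)
local y = x:detach() -- shares data (assumed) but is cut from the autograd graph
```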

:noGrad

Disables gradient recording for this specific tensor instance.

```lua
(self: Tensor) -> ()
```