Autograd (Tape) Core

The Tape module records operations and manages the backward pass.

Functions

.backwardFrom

Starts a backward pass from `y`, computing the gradient of `y` with respect to the graph's leaves, seeded with the tensor `grad`.

```lua
(y: Tensor, grad: Tensor) -> ()
```

```lua
Tape.backwardFrom(loss, Tensor.ones(loss._shape))
```

.grad

Calls `f` on the given inputs and returns the gradient of its output with respect to each input, one entry per input. The return type `{Tensor?}` allows `nil` entries for inputs the output does not depend on.

```lua
(f: (...any) -> Tensor, inputs: {Tensor}) -> {Tensor?}
```
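For example, a usage sketch assuming `w` and `x` are existing `Tensor` values and that `sum` and `matmul` exist as tensor methods (assumptions; only the `Tape.grad` signature is documented):

```lua
-- Hypothetical usage: differentiate a scalar-valued function of w.
local grads = Tape.grad(function(w)
    return x:matmul(w):sum() -- scalar output
end, { w })

local dw = grads[1] -- d(output)/d(w), or nil if the output does not depend on w
```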

.noGrad

Disables gradient tracking for the duration of the call to `fn`, so no operations inside it are recorded on the tape. Useful for inference.

```lua
(fn: () -> ()) -> ()
```

```lua
Tape.noGrad(function()
    -- Operations here won't be recorded
    local pred = model:forward(input)
end)
```

.makeNode

Internal function that registers a new operation in the computation graph, linking the output tensor `out` to its `parents` and storing the backward callback `back` that propagates an incoming gradient to them.

```lua
(out: Tensor, parents: {Tensor}, back: (grad: Tensor) -> ()) -> Tensor
```
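As an illustration, an element-wise operation could register itself via `makeNode` roughly as below. This is a sketch under stated assumptions: `Tensor.exp` is not part of the documented API, and the `map`, `mul`, and `Tape.accumulate` helpers are hypothetical names for the forward mapping, element-wise multiply, and gradient accumulation steps.

```lua
-- Hypothetical op definition showing the makeNode contract.
function Tensor.exp(x)
    local out = x:map(math.exp) -- forward pass (assumed element-wise helper)
    -- Record the op: output, its parents, and how to push gradients back.
    return Tape.makeNode(out, { x }, function(grad)
        -- d/dx exp(x) = exp(x), so the parent receives grad * out.
        Tape.accumulate(x, grad:mul(out)) -- assumed accumulation helper
    end)
end
```

The backward callback receives the gradient flowing into `out` and is responsible for distributing it to each parent listed in `parents`.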