# Layers Module

Layers are the building blocks of neural networks. They store learnable parameters (weights and biases).
## NN.Linear

Applies a linear transformation to the incoming data: `y = x * W^T + b`.

```lua
(
	inFeatures: number,
	outFeatures: number,
	initializer: ((fanIn, fanOut) -> number)?
) -> Module
```

```lua
local layer = Gradien.NN.Linear(128, 64)
```
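A typical forward pass might look like the sketch below. Note that `Gradien.Tensor.randn` and the `:forward` method are assumptions about the runtime API; this page only documents the constructor.

```lua
-- Hypothetical usage; tensor construction and :forward are assumed, not documented here.
local layer = Gradien.NN.Linear(128, 64)
local x = Gradien.Tensor.randn({32, 128}) -- batch of 32 samples, 128 features each (assumed constructor)
local y = layer:forward(x)                -- expected output shape: {32, 64}
```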
## NN.Sequential

A container that chains modules together. Data flows through them in the order they are defined.

```lua
(layers: { Module | (Tensor) -> Tensor }) -> Module
```

```lua
local model = Gradien.NN.Sequential({
	Gradien.NN.Linear(10, 20),
	Gradien.NN.Linear(20, 5),
})
```
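Because the signature accepts plain `(Tensor) -> Tensor` functions alongside modules, activations can be interleaved directly. `Gradien.relu` and `:forward` below are illustrative assumptions, not confirmed names.

```lua
-- Sketch only: Gradien.relu and :forward are assumed names.
local model = Gradien.NN.Sequential({
	Gradien.NN.Linear(10, 20),
	function(x) return Gradien.relu(x) end, -- plain function entry, per the signature
	Gradien.NN.Linear(20, 5),
})
local out = model:forward(input) -- runs each entry in order
```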
## NN.Conv2d

Applies a 2D convolution over an input signal composed of several input planes.

```lua
(C_in: number, C_out: number, KH: number, KW: number) -> Module
```

```lua
local conv = Gradien.NN.Conv2d(3, 64, 3, 3) -- 3 input channels, 64 output, 3x3 kernel
```
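The signature exposes neither stride nor padding, so assuming stride 1 and no padding, each spatial dimension shrinks by `K - 1`:

```lua
-- Assuming stride 1 and no padding (neither appears in the signature):
-- {N, C_in, H, W} -> {N, C_out, H - KH + 1, W - KW + 1}
-- e.g. a {1, 3, 32, 32} image through a 3x3 kernel -> {1, 64, 30, 30}
local conv = Gradien.NN.Conv2d(3, 64, 3, 3)
```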
## NN.MaxPool2d

Applies a 2D max pooling over an input signal.

```lua
(KH: number, KW: number, stride: number) -> Module
```

```lua
local pool = Gradien.NN.MaxPool2d(2, 2, 2) -- 2x2 kernel, stride 2
```
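Assuming no padding, the output spatial size follows the usual pooling rule:

```lua
-- Output height = floor((H - KH) / stride) + 1, and likewise for width (no padding assumed).
-- e.g. a {1, 64, 30, 30} input through a 2x2 kernel with stride 2 -> {1, 64, 15, 15}
local pool = Gradien.NN.MaxPool2d(2, 2, 2)
```

The same shape rule applies to `NN.AvgPool2d`; only the reduction (max vs. mean) differs.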
## NN.AvgPool2d

Applies a 2D average pooling over an input signal.

```lua
(KH: number, KW: number, stride: number) -> Module
```

```lua
local pool = Gradien.NN.AvgPool2d(2, 2, 2) -- 2x2 kernel, stride 2
```
## NN.ConvTranspose2d

Applies a 2D transposed convolution operator over an input image composed of several input planes.

```lua
(C_in: number, C_out: number, KH: number, KW: number) -> Module
```

```lua
local convT = Gradien.NN.ConvTranspose2d(64, 3, 3, 3) -- 64 input channels, 3 output, 3x3 kernel
```
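Transposed convolution grows the spatial dimensions rather than shrinking them. Assuming stride 1 and no padding (neither appears in the signature):

```lua
-- {N, C_in, H, W} -> {N, C_out, H + KH - 1, W + KW - 1} (stride 1, no padding assumed)
-- e.g. a {1, 64, 30, 30} input through a 3x3 kernel -> {1, 3, 32, 32},
-- inverting the shape change of the Conv2d example above
local convT = Gradien.NN.ConvTranspose2d(64, 3, 3, 3)
```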
## NN.GroupNorm

Applies Group Normalization over a mini-batch of inputs. Divides channels into groups and normalizes within each group independently.

```lua
(num_groups: number, num_channels: number, eps: number?) -> Module
```

```lua
local gn = Gradien.NN.GroupNorm(8, 64) -- 8 groups, 64 channels
```

**Parameters:**

- `num_groups` (number): Number of groups to divide channels into. Must divide `num_channels` evenly.
- `num_channels` (number): Number of channels expected in the input.
- `eps` (number, optional): Small value added to the variance for numerical stability. Default: `1e-5`.
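The layers above compose naturally via `NN.Sequential`. The sketch below chains them into a small CNN; the shape comments assume stride-1, unpadded convolutions, and `:forward` is an assumed method name.

```lua
-- Small CNN sketch built only from layers documented on this page.
local model = Gradien.NN.Sequential({
	Gradien.NN.Conv2d(3, 64, 3, 3),  -- {N, 3, 32, 32} -> {N, 64, 30, 30}
	Gradien.NN.GroupNorm(8, 64),     -- 8 divides 64 evenly, as required
	Gradien.NN.MaxPool2d(2, 2, 2),   -- {N, 64, 30, 30} -> {N, 64, 15, 15}
})
-- local out = model:forward(images) -- :forward assumed, not documented here
```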