Lux.jl

The Lux.jl package allows for defining neural network architectures for scientific machine learning (SciML) and other deep learning applications. The InformationGeometry Lux extension provides simplified constructors for creating dense neural network components with given input and output sizes. In particular, NormalizedNeuralModel conveniently wraps the resulting neural network in a ModelMap, including options for pre- and post-transforms of the inputs and outputs of the neural net.

InformationGeometry.NeuralNet - Function
NeuralNet(DS::AbstractDataSet, N::Int=2, hidden::Int=1; kwargs...)
NeuralNet(In::Int, Out::Int, N::Int=2, hidden::Int=1; ForcePositiveOutputs::Bool=false, HiddenActivation::Function=tanh, 
                        FinalActivation::Function=(ForcePositiveOutputs ? softplus : identity), gain::Real=1, kwargs...)

N is the number of neurons in the intermediate layers and hidden is the number of hidden layers sandwiched between the input and output layers; the network is returned as a Lux.Chain. If hidden is -1, the dedicated output layer is dropped and a single Lux.Dense layer is returned instead.

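A minimal usage sketch (assuming both InformationGeometry and Lux are loaded so that the extension is active; the sizes 3 and 1 and the evaluation step are illustrative choices following the standard Lux workflow, not part of the docstring above):

```julia
using InformationGeometry, Lux, Random

# Chain mapping 3 inputs to 1 output, with 5 neurons in each of the
# 2 hidden layers; hidden layers use tanh activations by default.
NN = NeuralNet(3, 1, 5, 2)

# Standard Lux workflow: initialize parameters and states, then evaluate.
rng = Random.default_rng()
ps, st = Lux.setup(rng, NN)
y, _ = NN(rand(rng, 3), ps, st)
```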
InformationGeometry.NormalizedNeuralModel - Function
NormalizedNeuralModel(xd::Int, yd::Int, N::Int=2, hidden::Int=1; kwargs...)
NormalizedNeuralModel(DS::AbstractDataSet, N::Int=2, hidden::Int=1; rng=Random.default_rng(), PreTransform::Function=x->(x .- Xmean) ./ Xdiv, 
                PostTransform::Function=y->(Ydiv .* y) .+ Ymean, kwargs...)

Returns a Tuple (M, P, U) where M is a ModelMap wrapping the neural net with the given input and output dimensions, including a normalization of its inputs and outputs; P is a randomly initialized ComponentVector parameter configuration; and U is the Lux.Chain object of the underlying neural net. All other kwargs are forwarded to the NeuralNet method.

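A hedged sketch of the dataset-based method; the data values are made up for illustration, and the final line assumes that ModelMap objects are callable as M(x, P), as elsewhere in InformationGeometry:

```julia
using InformationGeometry, Lux

# Illustrative 1D dataset: x-values, y-values and y-uncertainties.
DS = DataSet([1, 2, 3, 4, 5], [1.0, 2.1, 2.9, 4.2, 5.1], fill(0.2, 5))

# ModelMap M (with input/output normalization baked in), random initial
# ComponentVector P and the underlying Lux.Chain U.
M, P, U = NormalizedNeuralModel(DS, 4, 1)

M(1.0, P)   # evaluate the wrapped neural net at x = 1.0
```

From here, M and P can be used like any other model in InformationGeometry, e.g. to construct a DataModel for fitting against DS.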