Optim Setup

NonArchimedeanMachineLearning.Loss (Type)
Loss{F1,F2}

A batch-oriented loss function structure for optimization.

Wraps both an evaluation function and a gradient function. Both functions should be closures that capture any necessary data (for example training data) and operate on batches.

Fields

  • eval::F1: Function with signature (params) -> values, where params is a collection of parameter polydiscs and values is the corresponding collection of loss values
  • grad::F2: Function with signature (tangents) -> values, where tangents is a collection of tangent vectors and values is the corresponding collection of directional derivatives
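
Example

A minimal sketch of building a Loss from closures, per the field signatures above. The training data, the squared-error form, and the gradient body are illustrative stand-ins (ordinary floats rather than p-adic values), not the package's own recipe:

```julia
# Hypothetical training data captured by the closures.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

# eval: collection of parameter values -> collection of loss values.
eval_fn = params -> [sum((ys .- p .* xs) .^ 2) for p in params]

# grad: collection of tangent vectors -> collection of directional
# derivatives (placeholder body for illustration only).
grad_fn = tangents -> [sum(t) for t in tangents]

loss = Loss(eval_fn, grad_fn)
```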
source
NonArchimedeanMachineLearning.OptimSetup (Type)
OptimSetup{S,T,N,U,V,L,O}

Complete optimization setup containing loss, parameters, optimizer, and state.

Mutable structure that captures everything needed for optimization. The loss function should have data baked in as a closure.

Fields

  • loss::L: Loss function (closure over data) with a batch evaluation method and a batch directional-derivative method
  • param::ValuationPolydisc{S,T,N}: Current parameter values (mutable during optimization)
  • optimiser::O: Optimizer function (loss, param, state, context) -> (new_param, new_state, converged)
  • state::U: Optimization state (e.g., previous steps, momentum, etc.)
  • context::V: Optimizer settings (e.g., learning rate, degree, etc.)
  • converged::Bool: Whether the optimizer has converged

Type Parameters

  • S: Coefficient type (typically p-adic numbers)
  • T: Radius/valuation type
  • N: Dimension of parameter space
  • U: State type
  • V: Context type
  • L: Concrete loss type
  • O: Concrete optimizer callable type
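
Example

A toy optimizer conforming to the documented contract (loss, param, state, context) -> (new_param, new_state, converged). The refinement logic below is a placeholder for illustration, not the package's actual method; here state is assumed to be an integer step counter:

```julia
# Conforms to the optimiser field's calling convention; the body
# is a stand-in that merely counts steps and never refines params.
function toy_optimiser(loss, param, state, context)
    new_param = param              # placeholder: no actual refinement
    new_state = state + 1          # assumed: state is a step counter
    converged = new_state >= 10    # stop after a fixed number of steps
    return new_param, new_state, converged
end
```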
source
NonArchimedeanMachineLearning.eval_loss (Method)
eval_loss(optim::OptimSetup)

Evaluate the loss function at the current parameter values.

Arguments

  • optim::OptimSetup: The optimization setup

Returns

The scalar value of the loss at the current parameters.

source
NonArchimedeanMachineLearning.has_converged (Method)
has_converged(optim::OptimSetup) -> Bool

Check whether the optimization has converged.

Convergence is detected when the optimizer can no longer refine parameters, typically because the polydisc radius has reached the precision of the p-adic field.

Arguments

  • optim::OptimSetup: The optimization setup

Returns

true if the optimization has converged, false otherwise.

source
NonArchimedeanMachineLearning.optimize! (Method)
optimize!(optim::OptimSetup, max_steps::Int; verbose::Bool=false) -> Int

Run optimization until convergence or max_steps, whichever comes first.

Arguments

  • optim::OptimSetup: The optimization setup
  • max_steps::Int: Maximum number of steps to take
  • verbose::Bool=false: If true, print loss at each step

Returns

The number of steps taken. Check has_converged(optim) to distinguish early convergence from hitting max_steps.

Example

optim = greedy_descent_init(param, loss, 1, (false, 1))
steps = optimize!(optim, 100; verbose=true)
if has_converged(optim)
    println("Converged after $steps steps")
else
    println("Reached max steps ($steps)")
end
source
NonArchimedeanMachineLearning.step! (Method)
step!(optim_setup::OptimSetup)

Perform one optimization step.

Calls the optimizer function to compute new parameters, state, and convergence status, then updates the optimization setup accordingly.

Arguments

  • optim_setup::OptimSetup: The optimization setup

Notes

Mutates the optimization setup by updating both parameters and state.
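
Example

A manual driving loop equivalent to optimize!, useful when custom per-step logic is needed. It assumes optim is an already-initialized OptimSetup (e.g. from greedy_descent_init as in the optimize! example):

```julia
for i in 1:100
    step!(optim)                                  # one optimizer update
    println("step $i: loss = ", eval_loss(optim)) # monitor progress
    has_converged(optim) && break                 # stop once refined out
end
```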

source
NonArchimedeanMachineLearning.update_param! (Method)
update_param!(optim::OptimSetup{S,T,N,U,V,L,O}, param::ValuationPolydisc{S,T,N}) where {S,T,N,U,V,L,O}

Update the parameter values in the optimization setup.

Arguments

  • optim::OptimSetup{S,T,N,U,V,L,O}: The optimization setup
  • param::ValuationPolydisc{S,T,N}: New parameter values

Notes

Mutates the optimization setup in place.

source
NonArchimedeanMachineLearning.update_state! (Method)
update_state!(optim::OptimSetup{S,T,N,U,V,L,O}, state::U) where {S,T,N,U,V,L,O}

Update the optimizer state in the optimization setup.

Arguments

  • optim::OptimSetup{S,T,N,U,V,L,O}: The optimization setup
  • state::U: New state value

Notes

Mutates the optimization setup in place.
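
Example

A sketch of resetting a setup before a fresh run using the two mutating helpers; initial_param and initial_state are hypothetical values of the appropriate ValuationPolydisc and state types:

```julia
update_param!(optim, initial_param)   # restore the starting polydisc
update_state!(optim, initial_state)   # reset the optimizer state
```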

source