Caffe2 - Python API
A deep learning, cross-platform ML framework
optimizer.Optimizer Class Reference
Inheritance diagram for optimizer.Optimizer (known subclasses): optimizer.AdagradOptimizer, optimizer.AdamOptimizer, optimizer.FtrlOptimizer, optimizer.SgdOptimizer

Public Member Functions

def __init__ (self)
 
def __call__ (self, net, param_init_net, param, grad)
 
def get_auxiliary_parameters (self)
 

Static Public Member Functions

def build_lr (net, param_init_net, base_learning_rate, learning_rate_blob="lr", policy="fixed", iter_val=0, **kwargs)
 
def dedup (net, sparse_dedup_aggregator, grad)
 

Detailed Description

Definition at line 14 of file optimizer.py.
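The interface listed above can be sketched in plain Python. This is a hypothetical, self-contained stand-in rather than the Caffe2 source: the real methods add operators to a Caffe2 net rather than computing values directly, and the "step" decay semantics shown for build_lr (multiply by gamma every stepsize iterations) are an assumption for illustration only.

```python
from collections import namedtuple

# Named in the get_auxiliary_parameters docstring; fields per that description.
AuxParams = namedtuple("AuxParams", ["local", "shared"])


class Optimizer:
    """Sketch of the documented interface (not the Caffe2 implementation)."""

    def __call__(self, net, param_init_net, param, grad):
        # Subclasses add the update operators for one (param, grad) pair.
        raise NotImplementedError

    def get_auxiliary_parameters(self):
        # Base class has no auxiliary parameters; subclasses override.
        return AuxParams(local=[], shared=[])

    @staticmethod
    def build_lr(base_learning_rate, policy="fixed", iter_val=0, **kwargs):
        # The real build_lr adds a LearningRate op to the net; here we just
        # compute a scalar. "fixed" keeps the base rate constant; the "step"
        # decay below is an illustrative assumption.
        if policy == "fixed":
            return base_learning_rate
        if policy == "step":
            gamma = kwargs.get("gamma", 0.999)
            stepsize = kwargs.get("stepsize", 1)
            return base_learning_rate * gamma ** (iter_val // stepsize)
        raise ValueError("unknown policy: %s" % policy)
```

With a "fixed" policy the returned rate never changes; with "step", the rate shrinks by gamma once per stepsize iterations.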

Member Function Documentation

◆ get_auxiliary_parameters()

def optimizer.Optimizer.get_auxiliary_parameters (self)
Returns a list of auxiliary parameters.

Returns:
    aux_params: A namedtuple, AuxParams.

    aux_params.local stores a list of blobs. Each blob is a local
    auxiliary parameter. A local auxiliary parameter is a parameter kept
    in parallel with a learning rate parameter. Take Adagrad as an
    example: the local auxiliary parameter is the squared-sum parameter,
    because every learning rate has a squared sum associated with it.

    aux_params.shared also stores a list of blobs. Each blob is a shared
    auxiliary parameter. A shared auxiliary parameter is a parameter
    that is shared across all the learning rate parameters. Take Adam as
    an example: the iteration parameter is a shared parameter, because
    all the learning rates share the same iteration counter.
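Sticking with the docstring's two examples, the local/shared distinction can be shown with a small namedtuple sketch. The field names follow the docstring; the blob names are made up for illustration.

```python
from collections import namedtuple

# Field names per the docstring above; blob names are hypothetical.
AuxParams = namedtuple("AuxParams", ["local", "shared"])

# Adagrad: each parameter's learning rate carries its own squared-sum
# blob, so the squared sums are *local* auxiliary parameters.
adagrad_aux = AuxParams(local=["w_squared_sum", "b_squared_sum"], shared=[])

# Adam: the moment blobs are local to each parameter, but a single
# iteration counter is *shared* across all learning rates.
adam_aux = AuxParams(local=["w_moment1", "w_moment2"], shared=["iteration"])
```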

Definition at line 59 of file optimizer.py.


The documentation for this class was generated from the following file:
optimizer.py