Caffe2 - Python API
A deep learning, cross-platform ML framework
model_helper.ModelHelperBase Class Reference
Inheritance diagram for model_helper.ModelHelperBase (derived class: layer_model_helper.LayerModelHelper)

Public Member Functions

def __init__ (self, name=None, init_params=True, allow_not_known_ops=True, skip_sparse_optim=False, param_model=None)
 
def get_name (self)
 
def add_param (self, param, key=None, shape=None, length=None)
 
def param_info (self, grad_type=None, id=None)
 
def GetParams (self, namescope=None, top_scope=False)
 
def Proto (self)
 
def InitProto (self)
 
def RunAllOnGPU (self, *args, **kwargs)
 
def CreateDB (self, blob_out, db, db_type, **kwargs)
 
def AddGradientOperators (self, *args, **kwargs)
 
def get_param_to_grad (self, params)
 
def GetOptimizationPairs (self, params=None)
 
def GetComputedParams (self, namescope=None)
 
def GetAllParams (self, namescope=None)
 
def TensorProtosDBInput (self, unused_blob_in, blob_out, batch_size, db, db_type, **kwargs)
 
def AddOperator (self, op_type, inputs, parameters, *args, **kwargs)
 
def GetDevices (self)
 
def __getattr__ (self, op_type)
 

Public Attributes

 name
 
 net
 
 param_init_net
 
 param_to_grad
 
 params
 
 computed_params
 
 gradient_ops_added
 
 init_params
 
 allow_not_known_ops
 
 skip_sparse_optim
 
 weights
 
 biases
 
 grad_map
 

Detailed Description

A helper model so we can write models more easily, without having to
manually define parameter initializations and operators separately.
To add support for specific operators, inherit from this class and add
the corresponding methods. Methods that represent operators should
take care of adding their parameters to params.
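
A minimal usage sketch is shown below. It assumes the caffe2.python package
layout documented on this page; the blob names and shapes are illustrative.

import numpy as np
from caffe2.python import workspace
from caffe2.python.model_helper import ModelHelperBase

model = ModelHelperBase(name="example")
# Operators without parameters are dispatched through __getattr__ onto model.net.
model.Relu(["x"], ["y"])

workspace.FeedBlob("x", np.random.randn(4, 3).astype(np.float32))
workspace.RunNetOnce(model.param_init_net)  # parameter initializers (empty here)
workspace.RunNetOnce(model.net)             # the forward operators
print(workspace.FetchBlob("y"))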

Definition at line 54 of file model_helper.py.

Member Function Documentation

◆ __getattr__()

def model_helper.ModelHelperBase.__getattr__ (   self,
  op_type 
)
Catch-all for all other operators, mostly those without params.
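
For instance, given a ModelHelperBase instance model (a sketch; the blob
names are illustrative), a registered operator can be added simply by
calling it as a method:

# Neither Sum nor Softmax is defined on the class; __getattr__ forwards the
# attribute access to the underlying net as an operator call.
model.Sum(["a", "b"], ["a_plus_b"])
model.Softmax(["logits"], ["probs"])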

Definition at line 284 of file model_helper.py.

◆ AddOperator()

def model_helper.ModelHelperBase.AddOperator (   self,
  op_type,
  inputs,
  parameters,
  *args,
  **kwargs 
)
Adds an operator to the model. Use the parameters list
to specify which operator inputs are model parameters to be
optimized.

Example of usage:

model.SparseLengthsSum(
     [embedding, indices, lengths],
     parameters=[embedding],
)

Here, embedding is a parameter to be optimized, while indices
and lengths are not.

Definition at line 253 of file model_helper.py.

◆ get_param_to_grad()

def model_helper.ModelHelperBase.get_param_to_grad (   self,
  params 
)
Given a list of parameters, returns a dict mapping each parameter
to its corresponding gradient.
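
A sketch, assuming gradients have already been added and that parameters
exist under an "fc1" namescope (both illustrative):

model.AddGradientOperators(["loss"])
# Map only the parameters under the "fc1" scope to their gradient blobs.
fc1_param_to_grad = model.get_param_to_grad(model.GetParams("fc1"))
for param, grad in fc1_param_to_grad.items():
    print(param, "->", grad)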

Definition at line 186 of file model_helper.py.

◆ GetComputedParams()

def model_helper.ModelHelperBase.GetComputedParams (   self,
  namescope = None 
)
Returns the computed params in the current namescope. 'Computed params'
are parameters that are not optimized via gradient descent but are instead
computed directly from the data, such as the running mean and variance
of Spatial Batch Normalization.
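
A minimal sketch of how the two parameter lists are typically consumed
(the contents depend on what the model has registered):

trainable_params = model.GetParams()          # updated via gradients
computed_params = model.GetComputedParams()   # filled in directly from data
# Only the trainable parameters take part in optimizer updates; e.g.
# SpatialBN's running mean/var would appear in computed_params.
print("trainable:", trainable_params)
print("computed:", computed_params)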

Definition at line 220 of file model_helper.py.

◆ GetOptimizationPairs()

def model_helper.ModelHelperBase.GetOptimizationPairs (   self,
  params = None 
)
Returns a param => grad map.
If params is not specified, all parameters are considered.
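
A sketch of a hand-written SGD update built from these pairs (the
learning-rate blob and its value are illustrative, and sparse GradientSlice
entries are not handled here):

model.AddGradientOperators(["loss"])
LR = model.param_init_net.ConstantFill([], "LR", shape=[1], value=-0.01)
ONE = model.param_init_net.ConstantFill([], "ONE", shape=[1], value=1.0)
for param, grad in model.GetOptimizationPairs().items():
    # param <- 1.0 * param + LR * grad (LR is negative, so this is descent)
    model.WeightedSum([param, ONE, grad, LR], param)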

Definition at line 202 of file model_helper.py.

◆ GetParams()

def model_helper.ModelHelperBase.GetParams (   self,
  namescope = None,
  top_scope = False 
)
Returns the params in the current namescope.
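
For example (a sketch; the scope name is illustrative):

from caffe2.python import core

# Inside a name scope, GetParams() defaults to that scope...
with core.NameScope("fc1"):
    fc1_params = model.GetParams()
# ...and the same scope can be queried explicitly from outside it.
same_params = model.GetParams(namescope="fc1")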

Definition at line 142 of file model_helper.py.

◆ TensorProtosDBInput()

def model_helper.ModelHelperBase.TensorProtosDBInput (   self,
  unused_blob_in,
  blob_out,
  batch_size,
  db,
  db_type,
  **kwargs 
)
Adds a TensorProtosDBInput operator, which reads batches of batch_size
TensorProtos records from the database db (of type db_type) into blob_out.
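
A sketch of reading pre-serialized training data (the db path, blob names,
and batch size are illustrative):

data, label = model.TensorProtosDBInput(
    [], ["data", "label"], batch_size=64,
    db="/tmp/train.minidb", db_type="minidb",
)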

Definition at line 244 of file model_helper.py.


The documentation for this class was generated from the following file:
model_helper.py