function - defines theano.function

Guide

This module provides function(), commonly accessed as theano.function, the interface for compiling graphs into callable objects.

You’ve already seen example usage in the basic tutorial... something like this:

>>> import theano
>>> x = theano.tensor.dscalar()
>>> f = theano.function([x], 2*x)
>>> f(4)
array(8.0)

The idea here is that we’ve compiled the symbolic graph (2*x) into a function that can be called on a number and will compute twice that number.

The behaviour of function can be controlled in several ways, such as Param, mode, updates, and givens. These are covered in the tutorial examples and tutorial on modes.

Reference

class function.Out

A class for attaching information to function outputs

variable

A variable in an expression graph to use as a compiled-function output

borrow

True indicates that a reference to internal storage may be returned, and that the caller is aware that subsequent function evaluations might overwrite this memory.

__init__(variable, borrow=False)

Initialize attributes from arguments.
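
For example (a minimal sketch, using theano.Out as the top-level alias for this class):

>>> import theano
>>> import theano.tensor as T
>>> x = T.dmatrix('x')
>>> # Ask for the result without a protective copy; the returned buffer may be
>>> # overwritten by a later call to f, so copy it if you need to keep it.
>>> f = theano.function([x], theano.Out(T.exp(x), borrow=True))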

class function.Param

A class for attaching information to function inputs.

variable

A variable in an expression graph to use as a compiled-function parameter

default

The default value to use at call-time (can also be a Container where the function will find a value at call-time.)

name

A string to identify an argument for this parameter in keyword arguments.

mutable

True means the compiled-function is allowed to modify this argument. False means it is not allowed.

strict

If False, a function argument may be copied or cast to match the type required by the parameter variable. If True, a function argument must exactly match the type required by variable.

__init__(self, variable, default=None, name=None, mutable=False, strict=False)

Initialize object attributes.
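
For example (a minimal sketch along the lines of the tutorial examples; the keyword name y_val is just an illustrative choice):

>>> import theano
>>> import theano.tensor as T
>>> x, y = T.dscalars('x', 'y')
>>> f = theano.function([x, theano.Param(y, default=1, name='y_val')], x + y)
>>> f(3)            # y falls back to its default value
array(4.0)
>>> f(3, y_val=2)   # the default is overridden through the keyword name
array(5.0)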

function.function(inputs, outputs, mode=None, updates=None, givens=None, no_default_updates=False, accept_inplace=False, name=None, rebuild_strict=True, allow_input_downcast=None, profile=None, on_unused_input='raise')

Return a callable object that will calculate outputs from inputs.

Parameters:
  • inputs (list of either Variable or Param instances, but not shared variables) – the returned Function instance will have parameters for these variables.
  • outputs (list of Variables or Out instances) – expressions to compute.
  • mode (None, string or Mode instance.) – compilation mode
  • updates (iterable over pairs (shared_variable, new_expression). List, tuple or dict.) – expressions for new SharedVariable values
  • givens (iterable over pairs (Var1, Var2) of Variables. List, tuple or dict. The Var1 and Var2 in each pair must have the same Type.) – specific substitutions to make in the computation graph (Var2 replaces Var1).
  • no_default_updates (either bool or list of Variables) – if True, do not perform any automatic update on Variables. If False (default), perform them all. Else, perform automatic updates on all Variables that are neither in updates nor in no_default_updates.
  • name – an optional name for this function. The profile mode will print the time spent in this function.
  • rebuild_strict – True (default) is the safer and better-tested setting, in which case givens must substitute new variables with the same Type as the variables they replace. False is a you-better-know-what-you-are-doing setting that permits givens to replace variables with new variables of any Type. The consequence of changing a Type is that all results depending on that variable may have a different Type too (the graph is rebuilt from inputs to outputs). If one of the new Types does not make sense for one of the Ops in the graph, an Exception will be raised.
  • allow_input_downcast (Boolean or None) – True means that the values passed as inputs when calling the function can be silently downcasted to fit the dtype of the corresponding Variable, which may lose precision. False means that it will only be cast to a more general, or precise, type. None (default) is almost like False, but allows downcasting of Python float scalars to floatX.
  • profile (None, True, or ProfileStats instance) – accumulate profiling information into a given ProfileStats instance. If argument is True then a new ProfileStats instance will be used. This profiling object will be available via self.profile.
  • on_unused_input – What to do if a variable in the ‘inputs’ list is not used in the graph. Possible values are ‘raise’, ‘warn’, and ‘ignore’.
Return type:

Function instance

Returns:

a callable object that will compute the outputs (given the inputs) and update the implicit function arguments according to the updates.

Inputs can be given as variables or Param instances. Param instances also have a variable, but they attach some extra information about how call-time arguments corresponding to that variable should be used. Similarly, Out instances can attach information about how output variables should be returned.

The default mode is typically ‘FAST_RUN’, but this can be changed in theano.config. The mode argument controls the sort of optimizations that will be applied to the graph, and the way the optimized graph will be evaluated.
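
For example (a minimal sketch using the built-in ‘FAST_COMPILE’ mode, which applies few optimizations and favours Python implementations):

>>> import theano
>>> import theano.tensor as T
>>> x = T.dscalar('x')
>>> f = theano.function([x], 2 * x, mode='FAST_COMPILE')
>>> f(4)
array(8.0)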

After each function evaluation, the updates mechanism can replace the value of any SharedVariable [implicit] inputs with new values computed from the expressions in the updates list. An exception will be raised if you give two update expressions for the same SharedVariable input (that doesn’t make sense).
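
For example (a minimal sketch along the lines of the tutorial accumulator example):

>>> import theano
>>> import theano.tensor as T
>>> state = theano.shared(0)
>>> inc = T.iscalar('inc')
>>> accumulator = theano.function([inc], state, updates=[(state, state + inc)])
>>> accumulator(1)      # returns the value of state from before the update
array(0)
>>> state.get_value()   # the update has been applied
array(1)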

If a SharedVariable is not given an update expression but has a default_update member containing an expression, this expression will be used as the update expression for that variable. Passing no_default_updates=True to function disables this behavior entirely, while passing no_default_updates=[sharedvar1, sharedvar2] disables it only for the listed variables.
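
For example (a minimal sketch, assuming default_update can be assigned directly on the shared variable, as is done for random streams):

>>> import theano
>>> s = theano.shared(0)
>>> s.default_update = s + 1
>>> f = theano.function([], s)                            # picks up the default update
>>> g = theano.function([], s, no_default_updates=True)   # ignores it
>>> _ = f()   # s is incremented
>>> _ = g()   # s is left unchanged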

Regarding givens: Be careful to make sure that these substitutions are independent, because behaviour when Var1 of one pair appears in the graph leading to Var2 in another expression is undefined (e.g. with {a: x, b: a + 1}). Replacements specified with givens are different from optimizations in that Var2 is not expected to be equivalent to Var1.
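
For example (a minimal sketch along the lines of the tutorial example that substitutes a regular input for a shared variable of the same Type):

>>> import theano
>>> import theano.tensor as T
>>> state = theano.shared(0)
>>> inc = T.iscalar('inc')
>>> fn_of_state = state * 2 + inc
>>> foo = T.scalar(dtype=state.dtype)   # must have the same Type as state
>>> skip_shared = theano.function([inc, foo], fn_of_state, givens=[(state, foo)])
>>> skip_shared(1, 3)                   # 3 is used in place of the stored state
array(7)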

theano.compile.function.function_dump(filename, inputs, outputs=None, mode=None, updates=None, givens=None, no_default_updates=False, accept_inplace=False, name=None, rebuild_strict=True, allow_input_downcast=None, profile=None, on_unused_input=None)

This is helpful for making a reproducible case for problems that arise during Theano compilation.

Ex:

replace theano.function(...) by theano.function_dump(‘filename.pkl’, ...).

If you see this, you were probably asked to use this function to help debug a particular case during the compilation of a Theano function. function_dump makes it easy to reproduce your compilation without requiring any of your code. It pickles all the objects and parameters needed to reproduce a call to theano.function(). This includes shared variables and their values. If you do not want to share those values, you can replace them with zeros by calling set_value(...) on the shared variables before calling function_dump.

To load such a dump and do the compilation:

>>> import cPickle, theano
>>> d = cPickle.load(open("func_dump.bin", "rb"))  
>>> f = theano.function(**d)  

class theano.compile.function_module.Function(fn, input_storage, output_storage, indices, outputs, defaults, unpack_single, return_none, output_keys, maker)

Type of the functions returned by theano.function or theano.FunctionMaker.create.

Function is the callable object that does the computation. It stores the inputs and outputs, performs the packing and unpacking of inputs and return values, and implements square-bracket indexing so that you can look up the value of a symbolic node.

Functions are copyable via fn.copy() and copy.copy(fn). When a function is copied, this instance is duplicated, in contrast with self.maker (an instance of FunctionMaker), which is shared between copies. Copying a function means that the containers and their current values are all duplicated. This requires that mutable inputs be copied, whereas immutable inputs may be shared between copies.

A Function instance is hashable, on the basis of its memory address (its id).

A Function instance is only equal to itself.

A Function instance may be serialized using the pickle or cPickle modules. This will save all default inputs, the graph, and *** to the pickle file (WRITEME).

A Function instance has a trust_input field that defaults to False. When it is True, the extra input checks that produce better error messages are skipped. In some cases the Python implementation will still return correct results if you pass a Python or NumPy scalar instead of a NumPy array, but the C implementation should raise an error if you pass an object of the wrong type.

finder
inv_finder
copy(share_memory=False, swap=None, delete_updates=False, name=None, profile=None)

Copy this function. The copied function will have a separate maker and fgraph from the original function. The user can choose whether to also separate storage via the share_memory argument (see the example below). Params:

share_memory – { boolean } Default is False. When True, the two functions share intermediate storage (all storage except input and output storage). Otherwise, the two functions share only partial storage and the same maker. If the two functions share memory and allow_gc=False, this will increase execution speed and save memory.

swap – { dict } Dictionary that maps old SharedVariables to new SharedVariables. Default is None. NOTE: The shared variable swap is only done in the new returned function, not in the user’s graph.

delete_updates – { boolean } Default is False. If True, the copied function will not have any updates.

name – { string } If provided, will be the name of the new Function. Otherwise, it will be the old name + ” copy”.

profile – as the profile parameter of theano.function

Returns:
func – Copied theano.Function
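
For example (a minimal sketch of swapping in a different shared variable; w and w2 are just illustrative names):

>>> import theano
>>> import theano.tensor as T
>>> w = theano.shared(1.0, name='w')
>>> w2 = theano.shared(2.0, name='w2')
>>> x = T.dscalar('x')
>>> f = theano.function([x], x * w)
>>> g = f.copy(swap={w: w2})   # g computes with w2 in place of w
>>> f(3.0)
array(3.0)
>>> g(3.0)
array(6.0)
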
free()

When allow_gc = False, clear the Variables in storage_map.