Caffe2 - Python API
A deep learning, cross-platform ML framework
dataio.Reader Class Reference
Inheritance diagram for dataio.Reader:
dataio.CounterReader dataio.ReaderWithLimit queue_util._QueueReader

Public Member Functions

def __init__ (self, schema=None)
 
def schema (self)
 
def setup_ex (self, init_net, finish_net)
 
def read_ex (self, local_init_net, local_finish_net)
 
def read_record_ex (self, local_init_net, local_finish_net)
 
def read (self, read_net)
 
def reset (self, net)
 
def read_record (self, read_net)
 
def execution_step (self, reader_net_name=None, external_should_stop=None)
 

Detailed Description

Definition at line 27 of file dataio.py.

Member Function Documentation

◆ execution_step()

def dataio.Reader.execution_step (   self,
  reader_net_name = None,
  external_should_stop = None 
)
Create an execution step with a net containing read operators.

The execution step will contain a `stop_blob` that knows how to stop
the execution loop when the end of data is reached.

E.g.:

    read_step, fields = reader.execution_step()
    consume_net = core.Net('consume')
    consume_net.Print(fields[0], [])
    p = core.Plan('reader')
    p.AddStep(read_step.AddNet(consume_net))
    core.RunPlan(p)

Args:

    reader_net_name: (optional) the name of the reader_net to be
        created. The execution step will be named accordingly.

Returns:
    A tuple (read_step, fields), with:

    read_step: A newly created execution step containing a net with
        read operations. The step will have `stop_blob` set,
        in order to stop the loop on end of data.
    fields: A tuple of BlobReference containing the latest batch
        of data that was read.

Definition at line 109 of file dataio.py.

◆ read()

def dataio.Reader.read (   self,
  read_net 
)
Add operations to `read_net` that will read the next batch of data
and return a list of BlobReference representing the blobs that will
contain the batches produced.

Operations added to `read_net` must be thread safe and atomic, that is,
it should be possible to clone `read_net` and run multiple instances of
it in parallel.

Args:
    read_net: the net that will be appended with read operations

Returns:
    A tuple (should_stop, fields), with:

    should_stop: BlobReference pointing to a boolean scalar
        blob that indicates whether the read operation
        was successful or whether the end of data has
        been reached.
    fields: A tuple of BlobReference containing the latest batch
        of data that was read.

Definition at line 70 of file dataio.py.
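The `(should_stop, fields)` contract above can be illustrated with a pure-Python sketch that requires no Caffe2 at all. The `ListReader` class below is a hypothetical stand-in, not the actual Caffe2 implementation; it only mirrors the protocol: each `read()` call yields a batch until `should_stop` becomes true.

```python
# Hypothetical pure-Python sketch of the (should_stop, fields) read
# protocol described above -- NOT the actual Caffe2 Reader implementation.
class ListReader:
    """Toy reader yielding fixed-size batches from an in-memory list."""

    def __init__(self, data, batch_size=2):
        self._data = data
        self._batch_size = batch_size
        self._pos = 0

    def read(self):
        # Returns (should_stop, fields). should_stop becomes True once the
        # end of data has been reached, mirroring the `should_stop` blob.
        if self._pos >= len(self._data):
            return True, ()
        batch = tuple(self._data[self._pos:self._pos + self._batch_size])
        self._pos += self._batch_size
        return False, (batch,)


reader = ListReader([1, 2, 3, 4, 5])
batches = []
while True:
    should_stop, fields = reader.read()
    if should_stop:
        break
    batches.append(fields[0])
# batches == [(1, 2), (3, 4), (5,)]
```

In real Caffe2 usage, `read()` does not run anything itself: it only appends operators to `read_net`, and cloned copies of that net may then run in parallel, which is why the appended operations must be thread safe and atomic.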

◆ read_ex()

def dataio.Reader.read_ex (   self,
  local_init_net,
  local_finish_net 
)
Experimental extension to the interface. Don't use yet.

Definition at line 48 of file dataio.py.

◆ read_record_ex()

def dataio.Reader.read_record_ex (   self,
  local_init_net,
  local_finish_net 
)
Experimental extension to the interface. Don't use yet.

Definition at line 53 of file dataio.py.

◆ reset()

def dataio.Reader.reset (   self,
  net 
)
Append operations to `net` that will reset the reader.

This can be used to read the data multiple times.
Not all readers support this operation.

Definition at line 95 of file dataio.py.
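What "read the data multiple times" means here can be shown with a small pure-Python sketch (hypothetical, not Caffe2 code): once a pass exhausts the data, further reads immediately signal end of data, and `reset()` rewinds the reader so a second full pass is possible.

```python
# Hypothetical sketch of reset() semantics: after a full pass, reads
# report end-of-data until reset() rewinds the reader. Not Caffe2 code.
class CountingReader:
    def __init__(self, n):
        self._n = n
        self._pos = 0

    def read(self):
        # Returns (should_stop, value), like the (should_stop, fields) pair.
        if self._pos >= self._n:
            return True, None
        self._pos += 1
        return False, self._pos

    def reset(self):
        self._pos = 0  # rewind to the beginning of the data


def drain(reader):
    """Read until end of data, collecting the values seen."""
    out = []
    while True:
        stop, value = reader.read()
        if stop:
            return out
        out.append(value)


r = CountingReader(3)
first_pass = drain(r)    # [1, 2, 3]
second_try = drain(r)    # [] -- end of data was already reached
r.reset()
after_reset = drain(r)   # [1, 2, 3] again
```

Note that, as the docstring says, not all readers support this operation; a one-shot stream reader, for example, may have nothing to rewind.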

◆ schema()

def dataio.Reader.schema (   self)
Return the schema associated with the Reader.

Definition at line 33 of file dataio.py.

◆ setup_ex()

def dataio.Reader.setup_ex (   self,
  init_net,
  finish_net 
)
Nets to be executed once at startup and once at finish.
Experimental extension. Don't use yet.

Definition at line 43 of file dataio.py.


The documentation for this class was generated from the following file:

dataio.py