Qore DataProvider Module Reference  1.0.2
DataProvider::DataProviderPipeline Class Reference

Defines a class for passing data through record processors. More...

Public Member Functions

 abort (*bool ignore_exceptions)
 Aborts execution of a pipeline in progress. More...
 
 append (AbstractDataProcessor processor)
 Appends a data processor to the default pipeline. More...
 
 append (int id, AbstractDataProcessor processor)
 Appends a data processor to a pipeline. More...
 
int appendQueue (int id)
 Appends a new queue to an existing pipeline and returns the new queue ID. More...
 
Counter cnt ()
 Thread counter.
 
 constructor (*hash< PipelineOptionInfo > opts)
 Creates the object with the given options. More...
 
 destructor ()
 Destroys the object. More...
 
hash< PipelineInfo > getInfo ()
 Returns pipeline info. More...
 
string getName ()
 Returns the pipeline name.
 
bool isProcessing ()
 Returns True if the pipeline is processing data.
 
Mutex lck ()
 Atomic lock.
 
 logDebug (string fmt)
 Logs to the debug log, if set.
 
 logError (string fmt)
 Logs to the error log, if set.
 
 logInfo (string fmt)
 Logs to the info log, if set.
 
 reportError (PipelineQueue queue, hash< ExceptionInfo > ex)
 Called from a pipeline queue object to report a fatal error during processing.
 
 reset ()
 Resets the pipeline. More...
 
 submit (AbstractDataProviderBulkRecordInterface i)
 Submits data for processing.
 
 submit (AbstractDataProviderRecordIterator i)
 Submits data for processing.
 
 submit (auto _data)
 Submits data for processing.
 
 submitData (AbstractIterator i)
 Submits data for processing.
 
 waitDone ()
 Waits for all queues in all pipelines to have processed remaining data. More...
 

Public Attributes

bool abort_flag
 Abort flag.
 
bool stop_flag
 Stop flag.
 

Protected Member Functions

 checkLockedIntern ()
 Throws an exception if the pipeline is locked. More...
 
 checkSubmitIntern ()
 Throws an exception if the pipeline cannot be used; locks the pipeline for changes otherwise. More...
 
 checkUpdatePipelineIntern (int id)
 Checks if the given queue exists.
 
 resetIntern ()
 Resets the pipeline. More...
 
Sequence seq (1)
 Pipeline ID sequence generator.
 
 stopIntern ()
 Stops all background pipeline queues.
 
 stopInternUnlocked ()
 Stops all background pipeline queues; lock must be held.
 
 submitBulkIntern (AbstractDataProviderBulkRecordInterface i)
 Submits bulk data for processing. More...
 
 submitDataIntern (auto _data)
 Submits a single record for processing. More...
 
 submitIntern (auto _data)
 Submits data for processing. More...
 
 throwPipelineException ()
 Throws an exception if errors occurred in background pipeline processing. More...
 

Protected Attributes

*code debug_log
 Debug log closure; takes a single format string and then arguments for format placeholders.
 
bool do_bulk = True
 Bulk flag.
 
list< hash< ExceptionInfo > > error_list
 List of exceptions in pipelines.
 
*code error_log
 Error log closure; takes a single format string and then arguments for format placeholders.
 
*code info_log
 Info log closure; takes a single format string and then arguments for format placeholders.
 
bool locked = False
 Locked flag.
 
string name
 A descriptive name for logging purposes.
 
hash< string, PipelineQueue > pmap
 Hash of queues keyed by queue ID.
 
int record_count = 0
 Record count.
 
date start_time
 Run start time.
 
date stop_time
 Run stop time (set in waitDone()).
 

Detailed Description

Defines a class for passing data through record processors.

Record processing pipelines run in background threads. Each pipeline has an integer pipeline ID; a pipeline with ID 0 is created by default as the initial pipeline.
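
The following is a minimal usage sketch. MyProcessor is a hypothetical AbstractDataProcessor subclass (its implementation is not shown), and the option keys used here are assumptions based on the protected attributes documented below; see PipelineOptionInfo for the authoritative option list.

%new-style
%requires DataProvider

# option keys "name" and "info_log" are assumptions mirroring the attributes below
DataProviderPipeline pipeline(<PipelineOptionInfo>{
    "name": "example-pipeline",
    "info_log": sub (string fmt) { vprintf(fmt + "\n", argv); },
});

# MyProcessor is a hypothetical AbstractDataProcessor subclass (not shown here)
pipeline.append(new MyProcessor());

# submit a single record to the default queue (queue 0)
pipeline.submit({"id": 1, "value": "example"});

# wait for all queues to finish processing, then inspect the result
pipeline.waitDone();
printf("pipeline info: %y\n", pipeline.getInfo());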

Member Function Documentation

◆ abort()

DataProvider::DataProviderPipeline::abort ( *bool  ignore_exceptions)

Aborts execution of a pipeline in progress.

Parameters
ignore_exceptions: if True then any processing exceptions are ignored
Exceptions
PIPELINE-FAILED: thrown if ignore_exceptions is not True and there are any errors in pipeline processing, in which case the exception argument will have a list of exception arguments thrown by pipeline threads
Note
The pipeline must be reset once aborted to be used again
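
A short sketch of the abort-and-reuse sequence implied by the note above, assuming pipeline is an existing DataProviderPipeline object:

pipeline.abort(True);  # True: ignore any processing exceptions
pipeline.reset();      # required before the pipeline can be used again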

◆ append() [1/2]

DataProvider::DataProviderPipeline::append ( AbstractDataProcessor  processor)

Appends a data processor to the default pipeline.

Note
The initial queue is queue 0

◆ append() [2/2]

DataProvider::DataProviderPipeline::append ( int  id,
AbstractDataProcessor  processor 
)

Appends a data processor to a pipeline.

Parameters
id: the queue ID as returned from appendQueue()
processor: the data processor to append to the pipeline
Note
The initial queue is queue 0
See also
appendQueue()

◆ appendQueue()

int DataProvider::DataProviderPipeline::appendQueue ( int  id)

Appends a new queue to an existing pipeline and returns the new queue ID.

Parameters
id: the queue ID to which the new queue will be appended
Returns
the new queue ID
Exceptions
PIPELINE-ERROR: the pipeline is locked, or the given queue does not exist
Note
The initial queue is queue 0
See also
append(int, AbstractDataProcessor)
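
A sketch of building a branched pipeline with appendQueue() and append(int, AbstractDataProcessor); FirstProcessor and SecondProcessor are hypothetical AbstractDataProcessor subclasses, and pipeline is assumed to be an existing DataProviderPipeline object:

# queue 0 always exists; append a processor to it
pipeline.append(0, new FirstProcessor());

# create a new queue fed by queue 0 and append a processor to it
int qid = pipeline.appendQueue(0);
pipeline.append(qid, new SecondProcessor());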

◆ checkLockedIntern()

DataProvider::DataProviderPipeline::checkLockedIntern ( )
protected

Throws an exception if the pipeline is locked.

Must be called with the lock held

◆ checkSubmitIntern()

DataProvider::DataProviderPipeline::checkSubmitIntern ( )
protected

Throws an exception if the pipeline cannot be used; locks the pipeline for changes otherwise.

Must be called with the lock held

◆ constructor()

DataProvider::DataProviderPipeline::constructor ( *hash< PipelineOptionInfo > opts)

Creates the object with the given options.

Parameters
opts: any options for the pipeline; see PipelineOptionInfo for more information

Note
The object is created with an initial queue with ID 0

◆ destructor()

DataProvider::DataProviderPipeline::destructor ( )

Destroys the object.

To ensure that the destructor does not throw a PIPELINE-FAILED exception, call waitDone() and handle any exception before the object is destroyed

Exceptions
PIPELINE-FAILED: thrown if there are any errors in pipeline processing, in which case the exception argument will have a list of exception arguments thrown by pipeline threads

◆ getInfo()

hash<PipelineInfo> DataProvider::DataProviderPipeline::getInfo ( )

Returns pipeline info.

Returns
pipeline info; see PipelineInfo for more information
Note
record count and performance information are only valid after the pipeline has completed processing
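
A sketch of reading the pipeline info after processing has completed; the record_count key is an assumption based on the record_count attribute documented above, and pipeline is assumed to be an existing DataProviderPipeline object:

pipeline.waitDone();
hash<PipelineInfo> info = pipeline.getInfo();
# the record_count key is an assumption based on the record_count attribute above
printf("processed %d record(s)\n", info.record_count);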

◆ reset()

DataProvider::DataProviderPipeline::reset ( )

Resets the pipeline.

Exceptions
PIPELINE-ERROR: this method cannot be called while the pipeline is processing data; call abort() or waitDone() before resetting if the pipeline is processing data

◆ resetIntern()

DataProvider::DataProviderPipeline::resetIntern ( )
protected

Resets the pipeline.

Must be called with the lock held

◆ submitBulkIntern()

DataProvider::DataProviderPipeline::submitBulkIntern ( AbstractDataProviderBulkRecordInterface  i)
protected

Submits bulk data for processing.

Must be called with the lock held

◆ submitDataIntern()

DataProvider::DataProviderPipeline::submitDataIntern ( auto  _data)
protected

Submits a single record for processing.

Must be called with the lock held

◆ submitIntern()

DataProvider::DataProviderPipeline::submitIntern ( auto  _data)
protected

Submits data for processing.

Must be called with the lock held

◆ throwPipelineException()

DataProvider::DataProviderPipeline::throwPipelineException ( )
protected

Throws an exception if errors occurred in background pipeline processing.

Must be called with the lock held

◆ waitDone()

DataProvider::DataProviderPipeline::waitDone ( )

Waits for all queues in all pipelines to have processed remaining data.

Exceptions
PIPELINE-FAILED: thrown if there are any errors in pipeline processing, in which case the exception argument will have a list of exception arguments thrown by pipeline threads
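
A sketch of handling the PIPELINE-FAILED exception when waiting for completion; pipeline and record are assumed to exist:

try {
    pipeline.submit(record);
    pipeline.waitDone();
} catch (hash<ExceptionInfo> ex) {
    # ex.arg holds the list of exception arguments thrown by the pipeline threads
    printf("%s: %s (%d pipeline error(s))\n", ex.err, ex.desc, ex.arg.size());
}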