The FilterPype API documentation.
Bases: filterpype.data_fltr_base.DataFilterBase
Basic framework for data filter coroutine, sending packets on after receiving them. Override filter_data() for practical functionality. Packet received may be a data packet or a message bottle.
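The coroutine framework described above can be sketched as follows. This is illustrative Python only: the class and attribute names (Packet, SketchFilter, Collect) are stand-ins, not FilterPype's actual API, but the pattern of priming the generator, waiting at yield, and sending packets on after filter_data() is the one described here.

```python
class Packet(object):
    def __init__(self, data=''):
        self.data = data

class SketchFilter(object):
    def __init__(self, next_filter=None):
        self.next_filter = next_filter
        self._coroutine = self._filter_loop()
        next(self._coroutine)           # prime the generator

    def _filter_loop(self):
        while True:
            packet = (yield)            # wait for the next packet
            self.filter_data(packet)
            if self.next_filter is not None:
                self.next_filter.send(packet)   # send on after processing

    def send(self, packet):
        self._coroutine.send(packet)

    def filter_data(self, packet):
        raise NotImplementedError       # override for practical functionality

class UpperCase(SketchFilter):
    def filter_data(self, packet):
        packet.data = packet.data.upper()

class Collect(SketchFilter):
    def __init__(self):
        SketchFilter.__init__(self)
        self.results = []
    def filter_data(self, packet):
        self.results.append(packet.data)

sink = Collect()
head = UpperCase(next_filter=sink)
head.send(Packet('hello'))
```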
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
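The override pattern described above can be sketched like this. The message format and class names here are hypothetical (FilterPype's real message bottles differ); the point is that the base class handles general-purpose commands such as reset, while a subclass like WriteFile intercepts its own commands and delegates the rest.

```python
class BaseFilterSketch(object):
    def __init__(self):
        self.byte_count = 0

    def open_message_bottle(self, message):
        # general-purpose command, applying to any filter
        if message['command'] == 'reset':
            setattr(self, message['attr'], message['value'])
        else:
            raise ValueError('unknown command: %s' % message['command'])

class WriteFileSketch(BaseFilterSketch):
    """Like WriteFile: uses the hook to switch to a new output file."""
    def __init__(self):
        BaseFilterSketch.__init__(self)
        self.file_name = 'out_1.dat'

    def open_message_bottle(self, message):
        if message['command'] == 'new_file':
            self.file_name = message['value']   # close old file, open new (omitted)
        else:
            BaseFilterSketch.open_message_bottle(self, message)

flt = WriteFileSketch()
flt.open_message_bottle({'command': 'reset', 'attr': 'byte_count', 'value': 100})
flt.open_message_bottle({'command': 'new_file', 'value': 'out_2.dat'})
```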
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: object
This is the Filter part of the Pipes and Filters design pattern, using Python co-routines to push the data from one filter to the next.
Each processing step is encapsulated in a filter component. Pipes are the means by which data is passed from one filter to the next, implemented by Unix pipes or FIFO queues. However, using co-routines in Python means that, in simple cases, there is no longer a need for the pipe to buffer data between filters. Each filter will just wait until more data arrives in its yield inbox.
Note that “filter” is a Python reserved word, so use “data_filter” or some other variation instead.
The DataFilter has the factory that made it passed in, so if there are missing settings in the validation, it can try to get them from its factory.
Filters are named with a verb for the transformation, rather than a noun, where possible. So we say CountLoops rather than LoopCounter, ReadBatch rather than FileReader, and Peek rather than Peeker. Then we can talk about the ReadBatch DataFilter.
Hook to execute just after a packet is sent on
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: object
The data is passed through the filters in packets. These are Python dictionaries that provide the means to store parameters or partial results, along with the data which is a string.
Data can be passed to the packet constructor as a string parameter, or as a data='xxx' keyword parameter.
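A minimal sketch of that constructor idea, assuming a dict-backed packet (this is illustrative code, not FilterPype's DataPacket implementation): extra keywords become packet parameters alongside the data string.

```python
class DataPacketSketch(object):
    def __init__(self, data='', **kwargs):
        self.data = data
        # any extra keywords are stored as packet parameters
        self.__dict__.update(kwargs)

# data as a positional string parameter, or as a data='xxx' keyword
p1 = DataPacketSketch('xxx')
p2 = DataPacketSketch(data='xxx', block_size=512)
```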
TO-DO: Test cloning for efficiency <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
When packets are split up, the parameters need to be passed along, with the revised data. This is done by cloning the packet.
Clear data from packet, because we can’t do that in clone()
Create a new packet, with the same parameters and data. We expect data as the argument. There may be keywords. If data is given, set the data string to this value.
TO-DO: Cloning message bottles?
ROBDOC: When sending on data, it would be useful if the data keyword either raised an exception when no data is in the packet, or simply allowed a packet with no data (rather than cloning the original data when data is passed as ''). This depends on whether a packet without any data is to be allowed:
e.g.
clone(self, data=None, ...):
    ...
    if data is not None:
        if data == '':
            raise NoDataInPacketException('data is required in the packet')
        cloned_packet.data = data
Length of data, if a string
Bases: type
Use this metaclass, derived from “type”, to create the class. This sets a flag for checking that it is being used, and sets __getattribute__ to ensure dynamic processing. This is variable at run time, as to whether we use the metaclass or not, but is not reversible.
Note that __getattribute__ is a class attribute, not an instance attribute. If we set the instance attribute, it is ignored and thus not dynamic.
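The class-versus-instance point above can be demonstrated with a small sketch, assuming a metaclass that installs __getattribute__ on the class at creation time. The names and the '%' handling here are illustrative stand-ins, not FilterPype's real dynamic lookup.

```python
class DynamicMeta(type):
    def __new__(mcs, name, bases, namespace):
        cls = type.__new__(mcs, name, bases, namespace)
        cls.uses_dynamic_meta = True            # flag for checking it is in use

        def _getattribute(self, attr):
            value = object.__getattribute__(self, attr)
            # stand-in for dynamic processing of '%'-prefixed values
            if isinstance(value, str) and value.startswith('%'):
                return value[1:].upper()
            return value

        # must be set on the CLASS: setting it on an instance is ignored
        cls.__getattribute__ = _getattribute
        return cls

class Example(metaclass=DynamicMeta):
    plain = 'hello'
    dynamic = '%speed'

e = Example()
```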
return a type’s method resolution order
Bases: object
Additional base class for Filters, where each attribute access is checked for beginning with “%”, i.e. requiring a dynamic value. Dynamic property is not easily reversible.
Bases: filterpype.data_fltr_base.FilterError
Bases: filterpype.data_fltr_base.FilterError
Bases: filterpype.data_fltr_base.FilterError
Bases: filterpype.data_fltr_base.FilterError
Bases: exceptions.Exception
An exception was caught and raised from while processing data in a Filter.
Bases: filterpype.data_fltr_base.FilterError
Bases: filterpype.data_fltr_base.DataFilter
Route the data packets according to their fork_destination. ‘main’ goes on to the next_filter as usual. ‘branch’ goes to branch_filter, raising an exception if there is none. ‘prev’ may link to the filter before the branch.
This is called Hidden... because you shouldn’t need to use it explicitly. Since it is never created by ftype, we need to set ftype manually.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataPacket
Message is a type of data packet, usually with only one-time use. MessageBottle (TO-DO), e.g. target = pad_bytes, message = reset:byte_count:100
@destination: can be the filter name or the filter type. If a filter type, the message is forwarded on to the rest of the pipeline (main and branch) so that other filters of the same type can open the message.
Clear data from packet, because we can’t do that in clone()
Create a new packet, with the same parameters and data. We expect data as the argument. There may be keywords. If data is given, set the data string to this value.
TO-DO: Cloning message bottles?
ROBDOC: When sending on data, it would be useful if the data keyword either raised an exception when no data is in the packet, or simply allowed a packet with no data (rather than cloning the original data when data is passed as ''). This depends on whether a packet without any data is to be allowed:
e.g.
clone(self, data=None, ...):
    ...
    if data is not None:
        if data == '':
            raise NoDataInPacketException('data is required in the packet')
        cloned_packet.data = data
Length of data, if a string
Bases: filterpype.data_fltr_base.FilterError
Bases: filterpype.data_fltr_base.FilterError
Bases: filterpype.data_fltr_base.FilterError
Bases: object
Priority queue to enable looping, using TankQueue and TankFeed. List is sorted by heapq, using priority as the first sort field. If priorities are the same, then time is used to distinguish order. See p.208 of Python Cookbook, 2nd ed.
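The (priority, time) ordering described above can be sketched with the stdlib heapq directly. This is illustrative code, not the TankQueue implementation; a counter stands in for time_posted, breaking ties so that equal priorities keep first-in, first-out order.

```python
import heapq
import itertools

class PrioritySketch(object):
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # stands in for time_posted

    def push(self, priority, item):
        # priority is the first sort field; the counter distinguishes ties
        heapq.heappush(self._heap, (priority, next(self._counter), item))

    def pop(self):
        priority, _, item = heapq.heappop(self._heap)
        return item

    def clear(self):
        self._heap = []                     # empty the queue of all items

q = PrioritySketch()
q.push(2, 'low')
q.push(1, 'urgent')
q.push(2, 'low-later')
order = [q.pop(), q.pop(), q.pop()]
```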
Empty the queue of all items
TO-DO: Establish why time_posted isn’t used to sort priority of responses when multiple priority values are the same. Note: This only appeared on WinVista.
Push None on to the queue, with maximum (negative) priority so that it goes to the front of the queue. This is called when the queue size is changed by padding the front.
Bases: filterpype.data_fltr_base.FilterError
Bases: filterpype.data_fltr_base.DataError
Bases: filterpype.data_fltr_base.DataError
Bases: filterpype.data_fltr_base.FilterError
Class decorator to add the functionality to look up from the embedded Python environment the current values of attributes whose apparent values start with “%” and the name is upper case.
Usage: put “@dynamic_params” on the line before the class decoration
Problem is that this changes the static class for all uses of it. We may not want all instances dynamic.
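A sketch of the decorator idea, in the spirit of @dynamic_params but not FilterPype's actual code: values starting with "%" on upper-case names are resolved at access time, and a plain dict stands in for the embedded Python environment. The last line of the decorator illustrates the problem noted above: the static class is modified for all instances.

```python
ENVIRONMENT = {'SPEED': 120}     # stand-in for the embedded Python environment

def dynamic_params(cls):
    original = cls.__getattribute__

    def __getattribute__(self, name):
        value = original(self, name)
        # dynamic lookup only for '%'-values on upper-case names
        if (isinstance(value, str) and value.startswith('%')
                and name.isupper()):
            return ENVIRONMENT[value[1:]]
        return value

    cls.__getattribute__ = __getattribute__
    return cls                   # note: modifies cls for ALL its instances

@dynamic_params
class Reading(object):
    SPEED = '%SPEED'
    label = '%not_dynamic'       # lower-case name, left alone

r = Reading()
```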
Mixes in place, i.e. the base class is modified. Tags the class with a list of names of mixed members.
Same as mix_in, but returns a new class instead of modifying the base.
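The two variants can be sketched as below (illustrative code only, not the library's implementation): mix_in modifies the base class in place and tags it with a list of the mixed-in member names; mix_in_copy builds a fresh subclass and leaves the base untouched.

```python
def mix_in(base, mixin):
    mixed_names = []
    for name, member in vars(mixin).items():
        if not name.startswith('__'):       # skip dunder machinery
            setattr(base, name, member)     # modifies base in place
            mixed_names.append(name)
    base.mixed_in_members = sorted(mixed_names)   # tag the class
    return base

def mix_in_copy(base, mixin):
    # same idea, but mix into a new subclass instead of modifying base
    new_cls = type(base.__name__ + 'Mixed', (base,), {})
    return mix_in(new_cls, mixin)

class Plain(object):
    pass

class Greets(object):
    def greet(self):
        return 'hello'

Mixed = mix_in_copy(Plain, Greets)
```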
Bases: filterpype.data_fltr_base.DataFilter
Examines a list of attributes for a change in value. Sets the packet_change_flag attribute to True upon a change of any of the attributes; otherwise the flag stays False. AttributeError is raised if the packet does not have all the attributes. A change from the initial value also counts.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Extract attributes from text strings using a delimiter to determine the split between key (on the left) and value (on the right). Key has all punctuation removed and spaces replaced with underscores and is assigned to the packet dictionary.
Sample input: " someone's Gender : is maiL" Packet output: packet.someones_gender = 'is maiL'
Beware: do not override reserved attributes within the packet (such as ‘data’)
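The key normalisation described above can be sketched as a standalone function (illustrative only; the real filter assigns the result into the packet dictionary): split on the delimiter, strip punctuation from the key, and replace spaces with underscores.

```python
import string

def extract_attribute(line, delimiter=':'):
    key, _, value = line.partition(delimiter)
    key = key.strip().lower()
    # remove all punctuation (including the apostrophe), keep spaces for now
    key = ''.join(ch for ch in key if ch not in string.punctuation)
    key = key.replace(' ', '_')
    return key, value.strip()

# reproduces the sample from the docstring above
key, value = extract_attribute(" someone's Gender : is maiL")
```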
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Take the input stream and compress it using bzip2 compression object. Use level 9 for large files (this is the default).
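The stdlib usage the filter wraps can be sketched as follows (the filter and packet plumbing are omitted; this is just incremental bz2 compression): feed chunks to a bz2.BZ2Compressor at level 9, then flush before closing.

```python
import bz2

def compress_chunks(chunks, level=9):
    compressor = bz2.BZ2Compressor(level)   # level 9 is best for large files
    out = []
    for chunk in chunks:
        out.append(compressor.compress(chunk))
    out.append(compressor.flush())          # flush compression buffer before closing
    return b''.join(out)

compressed = compress_chunks([b'abc' * 100, b'def' * 100])
```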
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Flush compression buffer before closing
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Take the input stream and decompress it using bzip2.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Input is a series of strings of any length, with header removed. Split up data into blocks.
There are two general cases to cope with: a) the input string is larger than the block size b) the input string is smaller.
To avoid having two different approaches, we use an inputs list as a buffer: 1) Repeatedly put the packet data into inputs until chars_in >= size 2) Join inputs to be one string. 3) Split string into blocks, leaving a remainder. 4) Send on each block. 5) Put remainder back as the only item in inputs.
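The five steps above can be sketched as a plain function, assuming a list of input strings and a fixed block size (the real filter also allows size and fork_dest to change each loop, and guards against a zero size as noted below):

```python
def split_batches(strings, size):
    if size <= 0:
        raise ValueError('batch size must be positive')   # guard against silly values
    inputs, blocks, chars_in = [], [], 0
    for s in strings:
        inputs.append(s)                      # 1) accumulate until chars_in >= size
        chars_in += len(s)
        if chars_in >= size:
            joined = ''.join(inputs)          # 2) join inputs to one string
            while len(joined) >= size:        # 3) split into blocks, leaving remainder
                blocks.append(joined[:size])  # 4) send on each block
                joined = joined[size:]
            inputs = [joined]                 # 5) remainder back as the only item
            chars_in = len(joined)
    return blocks, ''.join(inputs)            # blocks sent, plus unsent remainder

blocks, remainder = split_batches(['abcd', 'ef', 'ghijklm'], 5)
```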
Wrong: An added feature is that we can set an initial_branch_size that sends the first N bytes off to a branch. This enables us to strip off junk at the beginning of a block of data, by sending it to a branch where it goes to waste. Alternatively, it can resynchronise frames of data, by pointing the branch and the main to the same filter following in the pipeline.
We can set more than one batch size, by giving a list as the size parameter.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Clone the incoming packet and store within self._last_packet for flush_buffer().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Split the data, allowing size and fork_dest to be changed each loop. We need to check batch size each loop, in case it has been changed to a silly value, e.g. 0, which will just carry on sending the same data ad infinitum.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Zero batch size gives ValueError: range() step argument must not be zero
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Clones (duplicates) the packet, sending one copy to main, one to branch. BranchClone filter should be followed by a HiddenBranchRoute filter, i.e. “(” in the route.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Sends along the main and/or branch depending on two variables within the embedded environment. If the variables have not been set, this filter will die loudly, raising an AttributeError.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Send the first part of the packet.data to the branch, and the rest to main. This differs from DistillHeader because the amount to send is read from the packet, and to keep things simple, must be >= packet data length.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Decide on branching, depending on the value of the attribute named by ‘branch_key’, found either on the filter or on the packet. The optional key ‘branch_on_packet’ defaults to True; if it is False, the attribute is read from the filter instead of the packet. If the attribute is not present, a KeyError is raised. BranchIf should be followed by a HiddenBranchRoute filter, i.e. “(” in the route.
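The attribute lookup described above can be sketched as below. This is an assumption-laden illustration: `Obj` and `should_branch` are hypothetical stand-ins, and this sketch lets getattr raise AttributeError where the real filter is documented to raise KeyError.

```python
class Obj:
    """Stand-in for a filter or packet carrying attributes."""
    def __init__(self, **kw):
        self.__dict__.update(kw)

def should_branch(fltr, packet, branch_on_packet=True):
    # Read the attribute named by branch_key from the packet (the
    # default) or from the filter, and branch on its truth value.
    source = packet if branch_on_packet else fltr
    return bool(getattr(source, fltr.branch_key))

fltr = Obj(branch_key="is_header")
goes_to_branch = should_branch(fltr, Obj(is_header=True))
stays_on_main = should_branch(fltr, Obj(is_header=False))
```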
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Sends packets down the branch forever, starting from the first packet whose watch_attribute does not equal watch_value.
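The "divert forever after the first mismatch" behaviour can be sketched with a small stateful router. The names `make_router` and `route` are hypothetical, and packets are modelled as plain dictionaries for brevity.

```python
def make_router(watch_attribute, watch_value):
    # Once a packet's watched attribute differs from watch_value,
    # every packet from then on goes down the branch.
    state = {"diverted": False}
    def route(packet_attrs):
        if not state["diverted"] and packet_attrs.get(watch_attribute) != watch_value:
            state["diverted"] = True
        return "branch" if state["diverted"] else "main"
    return route

route = make_router("frame", 1)
destinations = [route({"frame": v}) for v in (1, 1, 2, 1)]
```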
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Send parameter results to the branch. Optionally send only some of the list items.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Send the packet object to both branch and main without cloning it: both receive references to the same object, so any changes made by the branch affect the packet seen by main. BranchRef should be followed by a HiddenBranchRoute filter, i.e. “(” in the route.
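The shared-reference semantics can be demonstrated directly in Python. The `Packet` class here is a hypothetical stand-in, not the FilterPype DataPacket:

```python
class Packet:
    def __init__(self, data):
        self.data = data

pkt = Packet(bytearray(b"abc"))
to_branch = pkt   # same object: no clone is made
to_main = pkt
to_branch.data[0:1] = b"X"   # a change made "in the branch"...
```

Because both names refer to the same object, the mutation is visible on main as well.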
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Debugging a pipeline that has no conveniently placed filters can be difficult, so this filter exists purely as a place to set a breakpoint.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Calculate the rate of change of a parameter over five consecutive values, given a list of packets from which we can get
[h0, h1, h2, h3, h4]
The packet arriving should contain references to the five packets we need to differentiate. They are still continuing in the main pipeline. If there are not precisely five packets, then pass on the grouping packet for other calculating methods.
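One common way to differentiate over five equally spaced samples is the five-point central-difference stencil. The documentation does not state which coefficients the filter uses, so the scheme below is an assumption for illustration only:

```python
def five_point_derivative(h, dt=1.0):
    # Five-point central-difference first derivative, evaluated at the
    # middle sample h2.  The coefficients actually used by the filter
    # are not stated in the documentation; this is one standard choice.
    if len(h) != 5:
        raise ValueError("need exactly five values")
    h0, h1, _h2, h3, h4 = h
    return (h0 - 8.0 * h1 + 8.0 * h3 - h4) / (12.0 * dt)

slope = five_point_derivative([0.0, 2.0, 4.0, 6.0, 8.0])
```

For linear data the stencil recovers the exact slope (2.0 per sample here).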
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Simple calculator for two numbers
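A two-number calculation can be sketched as below. The operator symbols and the supported set of operations are assumptions; the documentation does not list them.

```python
import operator

# Hypothetical operation table; the real filter's supported operations
# are not listed in the documentation.
OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def calculate(left, op, right):
    # Apply one binary operation to two numbers.
    return OPS[op](left, right)
```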
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Watches packets for a specified watch attribute and calls the provided callback method with the attribute value as a parameter.
e.g. if watch_attr = 'holy' and no environ is provided, the following callback is made when a packet arrives with a 'holy' attribute whose value is 'grail':
self.callback('found:holy', holy='grail')
num_watch_pkts is the number of packets that may pass through the filter before it stops watching for the attribute. If the attribute has not been found by that point, the callback made is:
self.callback('not_found:holy')
allowed_inconsistencies: when the allowed number of inconsistent values is exceeded, the callback made is self.callback('inconsistency_value_exceeded:holy')
If num_watch_pkts is None (default) it will watch forever and only return a not_found callback when closing the filter.
count_to_confirm is the number of identical values of the watched attribute required to pass through the filter before it will return a callback.
watch_for_change makes a callback only when the attribute changes its value from the previous one (including the first assignment of the value in the first packet).
include_in_environ allows you to provide a list of additional packet attributes to include in the environ used in the callback. (watch_attr is always included where available)
If the parameter is not found by the time the pipeline closes down, the callback made is 'not_found:<watch_attr>'.
If the value of watch_attr is None, it is ignored (passed through).
Changes the refinery send() method's return value to 'found', 'not_found' or 'inconsistent'.
print_when_callback aids debugging by printing to stdout when a callback is made.
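The core watching behaviour can be sketched as below. This is a deliberately minimal model: packets are plain dictionaries, the helper name `watch` is hypothetical, and the count_to_confirm, watch_for_change and inconsistency options are omitted.

```python
def watch(packets, watch_attr, callback, num_watch_pkts=None):
    # Scan packets for watch_attr.  None values are ignored;
    # 'found:<attr>' fires with the value in the environ, and
    # 'not_found:<attr>' fires once the packet budget runs out.
    for count, pkt in enumerate(packets, start=1):
        value = pkt.get(watch_attr)
        if value is not None:
            callback("found:%s" % watch_attr, **{watch_attr: value})
            return "found"
        if num_watch_pkts is not None and count >= num_watch_pkts:
            break
    callback("not_found:%s" % watch_attr)
    return "not_found"

calls = []
result = watch([{"spam": 1}, {"holy": "grail"}], "holy",
               lambda msg, **env: calls.append((msg, env)))
```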
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Make a not_found callback if the attribute was not found, num_watch_pkts is None (which watches forever), and no inconsistency callback has been made.
Embedded Python module
Used by open_message_bottle() too, but as message bottles are sent on automatically, we don't want to send on when there is a message in the packet. TO-DO: discuss; there should be a tidier way of doing this.
TO-DO: split the getattr and the functionality within the try/except into a separate private function, so that send_on can be handled outside that function.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
This must do something very similar to filter_data(), but without sending on the message bottle, as that is arranged for us in the DataFilter base class.
The workaround for the above is to send on in filter_data() only when there is a message attached to the packet.
Currently this opens all messages from the msg_bottle. This may or may not be a good idea, but it does allow items to be clocked up on the not_found count. Note, however, that if the message destination wasn't set and defaulted to this ftype, it may receive more messages than intended.
TO-DO: will count_to_confirm etc. work here? Are they sharing counters with filter_data()?
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Simplified version of data_filter's CallbackOnAttribute which makes a callback based on multiple attributes. watch_attrs is comma-separated, though it is parsed as a list by FilterPype. The callback is made only once, when all the attributes have been found.
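The "fire once when all attributes are seen" behaviour can be sketched as below. Packets are modelled as dictionaries and `watch_all` is a hypothetical helper, not the filter's API.

```python
def watch_all(packets, watch_attrs, callback):
    # Fire the callback exactly once, when every watched attribute has
    # been seen on some packet; return whether that happened.
    found = {}
    for pkt in packets:
        for attr in watch_attrs:
            if attr not in found and attr in pkt:
                found[attr] = pkt[attr]
        if len(found) == len(watch_attrs):
            callback("found:%s" % ",".join(watch_attrs), **found)
            return True
    return False

calls = []
ok = watch_all([{"a": 1}, {"b": 2}], ["a", "b"],
               lambda msg, **env: calls.append((msg, env)))
```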
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Extract the data from each packet into an extract dictionary. A centrifuge map is a dictionary in the form key_attribute_name = tuple_with_extraction_parameters, i.e. (word_no, high_bit_no, low_bit_no), e.g. superframe_number = (3, 8, 1).
Note that all numbering, from the analyst's point of view, is 1-based, so it has to be converted to 0-based before use.
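The extraction of one mapped field can be sketched as below. The bit-order convention (bit 1 = least significant) is an assumption, and `extract_field` is a hypothetical helper, not the Centrifuge API:

```python
def extract_field(words, word_no, high_bit_no, low_bit_no):
    # Analyst numbering is 1-based for both words and bits, so convert
    # to 0-based here.  Bit 1 as the least significant bit is an
    # assumption; the real convention comes from the frame definition.
    word = words[word_no - 1]                  # 1-based word number
    width = high_bit_no - low_bit_no + 1
    return (word >> (low_bit_no - 1)) & ((1 << width) - 1)

# superframe_number = (3, 8, 1): bits 8..1 of the third word
superframe_number = extract_field([0x000, 0x000, 0xABC], 3, 8, 1)
```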
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Record data from each passing packet until a maximum collection size is reached, then send to the branch a packet whose data is the list of all the collected values.
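The grouping behaviour can be sketched with a generator. `collect` is a hypothetical helper; whether the real filter flushes a final partial group is an assumption.

```python
def collect(values, max_size):
    # Accumulate values until max_size is reached, then emit the group
    # (the real filter sends each full group to the branch as a packet).
    # Flushing the final partial group at the end is an assumption.
    batch = []
    for value in values:
        batch.append(value)
        if len(batch) == max_size:
            yield batch
            batch = []
    if batch:
        yield batch

groups = list(collect([10, 20, 30, 40, 50], 2))
```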
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Combine a list of fields or constants into one target field. Source field names will use the value of the field, which is prefixed either with ‘f.’ for filter attribute, or ‘p.’ for packet attribute.
source_field_names = f.header_prefix, p.seq_num, XXXX
produces
===+++ 27 XXXX
If the source field name is not prefixed with f. or p., it will be treated as a constant. Target field name always starts with f. or p.
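The prefix rules above can be illustrated with a small helper (hypothetical names and separator; the filter's own field handling may differ): "f." names read from the filter's attributes, "p." names from the packet's, and anything unprefixed is passed through as a constant.

```python
def combine_fields(names, filter_attrs, packet_attrs, sep=" "):
    # Hypothetical illustration of the prefix rules described above.
    parts = []
    for name in names:
        if name.startswith("f."):
            parts.append(str(filter_attrs[name[2:]]))   # filter attribute
        elif name.startswith("p."):
            parts.append(str(packet_attrs[name[2:]]))   # packet attribute
        else:
            parts.append(name)                          # constant
    return sep.join(parts)

combine_fields(["f.header_prefix", "p.seq_num", "XXXX"],
               {"header_prefix": "===+++"}, {"seq_num": 27})
# "===+++ 27 XXXX"
```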
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Concatenate path
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Convert a string of byte values (e.g. ‘\x01\x02\x03’) into an integer representation of the given string.
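In modern Python the conversion can be sketched with the stdlib (a hypothetical helper, shown for illustration): the byte string is read as one big-endian integer.

```python
def bytes_to_int(data):
    # Hypothetical helper showing the conversion described above:
    # interpret a byte string as a single big-endian integer.
    return int.from_bytes(data, "big")

bytes_to_int(b"\x01\x02\x03")   # 0x010203 == 66051
```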
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Join a file path to a file name to give a full file name
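The join described above maps onto the stdlib path functions (hypothetical file names; posixpath is used here so the result is platform-independent):

```python
import posixpath

# Join a directory path to a file name to give a full file name.
full_name = posixpath.join("/data/recordings", "flight_001.dat")
# "/data/recordings/flight_001.dat"
```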
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Count the number of bytes passing a filter. Unlike SeqPacket and CountLoops, nothing is written to the packet, just to the filter. Packets are counted as well, but not in a custom field.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Give each packet a number that starts at 1 and increments for each pass. This is used for counting loops.
Contrast this with CountPackets that records in the filter the number of packets going past.
Also contrast this with SeqPacket that gives the packet an ID number. The next number is taken from the filter, and the packet number is not overwritten if it already exists.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Count the number of packets passing a filter. Unlike SeqPacket and CountLoops, nothing is written to the packet, just to the filter.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Calculates the length of data that has passed through the filter.
When closing the filter, it sends a message bottle to the msg_destin, which defaults to the ftype of callback_on_attribute.
not_data must be a single chr value (ordinal 0 to 255).
single_use means that the message bottle it sends will be used by the first msg_destin found (i.e. by default, the first callback_on_attribute it comes across).
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Takes a list of data as an input, and outputs the set of different values. This can be used to ensure that a parameter read multiple times has the same value.
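The consistency check described above reduces to building a set (a hypothetical one-line sketch): one distinct value means the parameter read the same every time, more than one means the readings disagreed.

```python
def distinct_values(values):
    # Hypothetical sketch: the set of different values in the input list.
    return set(values)

distinct_values([5, 5, 5])   # {5}    -> parameter read consistently
distinct_values([5, 5, 7])   # {5, 7} -> inconsistent readings
```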
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Strip the header off, and send header_size bytes to the branch. If distill_mode is “once”, the header is removed only once, from the first packet; if distill_mode is “repeated”, it is removed from each packet that is received.
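The two distill modes can be sketched over a list of packet payloads (a hypothetical illustration, not the filter's actual implementation): headers go to the branch, the remainder carries on down the main pipe.

```python
def distill_header(packets, header_size, distill_mode="once"):
    # Hypothetical sketch of the splitting described above.
    headers, bodies = [], []
    for i, data in enumerate(packets):
        if distill_mode == "repeated" or i == 0:
            headers.append(data[:header_size])   # sent to the branch
            bodies.append(data[header_size:])    # sent on down the main pipe
        else:
            bodies.append(data)                  # "once": later packets untouched
    return headers, bodies
```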
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
EmbedPython TO-DO
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Replace keys from the enclosing pipeline where they are of the format “${fred}”
Note: this will only work on strings / primitive data types (not objects).
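The “${fred}” substitution described above matches the stdlib Template syntax (a hypothetical illustration with made-up keys; FilterPype's own substitution mechanism may differ):

```python
from string import Template

# Replace ${...} placeholders from a dictionary of pipeline keys.
pipeline_keys = {"fred": "512"}
text = Template("read ${fred} bytes").safe_substitute(pipeline_keys)
# "read 512 bytes"
```

safe_substitute() leaves unknown placeholders in place rather than raising, which suits partially configured pipelines.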
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Format results received in a list, to be passed on as a string. param_name may be an attribute of the results packet.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Get the raw data value for the target byte range.
Mandatory keys are: start_byte (int), bytes_to_get (int), param_name (string, uppercase please).
NOTE: All counting is zero-based for developers. Extracted data is sent to the branch.
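The zero-based byte-range extraction reduces to a slice (a hypothetical helper with made-up sample data, shown for illustration):

```python
def get_bytes(data, start_byte, bytes_to_get):
    # Hypothetical sketch: zero-based byte-range extraction, as described above.
    return data[start_byte:start_byte + bytes_to_get]

get_bytes(b"\x00\x11\x22\x33\x44", 1, 3)   # b"\x11\x22\x33"
```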
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Takes the input stream and computes a hash using the SHA-256 algorithm.
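The hashing itself can be sketched with the standard library: fold each incoming chunk of the stream into one digest (a sketch of the technique, not FilterPype's implementation):

```python
import hashlib

def sha256_of_stream(chunks):
    """Fold a stream of byte chunks into one SHA-256 hex digest."""
    digest = hashlib.sha256()
    for chunk in chunks:
        digest.update(chunk)
    return digest.hexdigest()
```

Updating incrementally gives the same digest as hashing the concatenated data in one go, so packet boundaries do not affect the result.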
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
In cases where the header is required for further processing later in pipeline, this filter assigns the header_attribute to the packet, while the packet’s data will no longer contain the header.
The send_on_if_only_header key defines whether this filter should send_on packets if there is not enough data to split into both header and remaining data.
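A sketch of the splitting rule, under the assumption that "not enough data" means the packet cannot yield both a header and a non-empty remainder (names are illustrative):

```python
def split_header(data, header_size, send_on_if_only_header=False):
    """Split packet data into (header, remainder).

    Returns None (packet held back) when there is not enough data for
    both parts and send_on_if_only_header is False.
    """
    if len(data) <= header_size and not send_on_if_only_header:
        return None
    return data[:header_size], data[header_size:]
```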
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Takes in a dict of attribute_name:value and
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Store the data items that are strings, until receiving something where the data is not a string, typically None. Then join the strings together, using space as the default join string.
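The buffering behaviour can be sketched as follows: consecutive strings are collected, and any non-string item (typically None) flushes the joined run (a sketch, not FilterPype's implementation):

```python
def join_string_runs(items, join_str=" "):
    """Buffer consecutive strings; flush the joined run on any non-string."""
    buffered, out = [], []
    for item in items:
        if isinstance(item, str):
            buffered.append(item)
        else:
            out.append(join_str.join(buffered))
            buffered = []
    return out
```

Note that a trailing run with no terminating non-string is never flushed, matching the description above: the join only happens on receiving a non-string.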
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Check the first n bytes of data for being non-zero (i.e. not 0x00 or 0xFF). If the first n bytes are all 0x00 or all 0xFF, don't pass on the data.
If the length of the data packet is less than check_byte_count, the packet should pass, because the check cannot be completed, even if the data is 0x00/0xFF.
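The pass/drop decision can be sketched as a predicate (a sketch of the rule as described, not FilterPype's implementation):

```python
def passes_nonzero_check(data, check_byte_count):
    """True if the packet should be passed on."""
    if len(data) < check_byte_count:
        return True  # too short to check: let the packet through
    head = data[:check_byte_count]
    # Drop only if the first n bytes are all 0x00 or all 0xFF.
    return not (all(b == 0x00 for b in head) or all(b == 0xFF for b in head))
```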
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
A pass-through filter simply forwards all packets to the next filter. There are various uses for a null node, such as being able to redirect pipeline flow while the pipeline is active. Alternatively, it may be used for simulating multiway branching, with a syntax built around binary branching.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Look ahead some bytes into the next data packet. Record the bytes found in the packet, if bytes are found, or an end-of-file marker (TO-DO) if the data has finished.
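The look-ahead can be sketched over a file object: read without consuming by restoring the position, using None as a stand-in for the (still-undefined) end marker (names are illustrative):

```python
def peek(file_obj, num_bytes):
    """Look ahead without consuming; None stands in for the end marker."""
    pos = file_obj.tell()
    data = file_obj.read(num_bytes)
    file_obj.seek(pos)
    return data if data else None
```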
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Format results received in a list, to be passed on as a string. param_name may be an attribute of the results packet.
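A minimal sketch of the formatting step, assuming the results arrive as a list and a separator joins their string forms (function name and separator are illustrative):

```python
def format_results(results, sep=", "):
    """Render a list of results as one string."""
    return sep.join(str(item) for item in results)
```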
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Reset to zero an attribute in the next filter.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Check that we have reset followed by a branch, in order to have somewhere to send the reset to.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Chop the file up into string blocks, passed into the pipeline inside packets.
We need a file object to read from. This can be provided directly or indirectly, with either the open file object or the file name being sent to ReadBatch.
This can also be done by setting the source_file_name as a fixed parameter for the filter, but this stops more than one file being read and doesn’t fit so well with the idea of data filters.
The file object passed in (or opened) is signalled for closing in three ways: 1) The file has been consumed by reading, so the next read() returns a block of zero length. 2) The refinery is shutting down (checked on each read() loop). Whether or not the reading is finished, the file should be closed.
Parameters: max_reads (int) – Number of batches to read.
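The read loop described above can be sketched as a generator: stop on a zero-length read or after max_reads batches, and close the file either way (a sketch of the described behaviour, not FilterPype's implementation; the shutting_down check is omitted):

```python
def read_batches(file_obj, block_size=4096, max_reads=None):
    """Yield blocks until a zero-length read or max_reads is reached."""
    reads = 0
    while max_reads is None or reads < max_reads:
        block = file_obj.read(block_size)
        if not block:        # zero-length read: file consumed
            break
        reads += 1
        yield block
    file_obj.close()         # close whether or not reading finished
```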
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Simple ReadFile filter to read files using bytes
Notes about keys:
source_file_name – the source file to read from
start_byte – the starting byte; must be a positive integer
size – number of bytes to read:
  zero – read nothing
  positive int – read this number of bytes
  negative int – read all of the file (convention is to use -1)
block_size – size in bytes to read at a time
whence – where to seek from:
  0 – start of file
  1 – end of file
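The key semantics above can be sketched as follows; note that this is an illustrative reading of the keys, not FilterPype's implementation, and the filter's whence value 1 is mapped onto the standard library's os.SEEK_END:

```python
import os

def read_byte_range(file_obj, start_byte=0, size=-1, block_size=4096, whence=0):
    """Read 'size' bytes from 'start_byte', block_size bytes at a time.

    whence follows the keys above: 0 seeks from the start of the file,
    1 from the end (mapped to os.SEEK_END).
    """
    file_obj.seek(start_byte, os.SEEK_SET if whence == 0 else os.SEEK_END)
    if size == 0:
        return b""                             # zero: read nothing
    remaining = None if size < 0 else size     # negative: read the rest
    blocks = []
    while remaining is None or remaining > 0:
        want = block_size if remaining is None else min(block_size, remaining)
        block = file_obj.read(want)
        if not block:
            break
        blocks.append(block)
        if remaining is not None:
            remaining -= len(block)
    return b"".join(blocks)
```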
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Read lines of a text file, in normal or reversed order. As a first implementation, this requires the whole file to be in memory, but this could be optimised.
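The whole-file-in-memory approach can be sketched over a text string (a sketch of the technique, not FilterPype's implementation):

```python
def read_lines(text, reverse=False):
    """Split text into lines, optionally in reversed order.

    First implementation: the whole text is held in memory.
    """
    lines = text.splitlines()
    return lines[::-1] if reverse else lines
```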
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Rename file, with from/to names passed in as packet data.
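A minimal sketch of the action, assuming the packet data carries a (from_name, to_name) pair; the packet layout here is an assumption, not FilterPype's documented format:

```python
import os

def rename_from_packet(packet_data):
    """packet_data is assumed to carry (from_name, to_name)."""
    from_name, to_name = packet_data
    os.rename(from_name, to_name)
```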
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Send a message bottle to a downstream filter, to reset one of the parameters. The message bottle will pass along the pipeline until it gets to a filter where the name matches. Then the parameter will be set to the new value.
The ‘value’ key can be either a proper value (such as 3 or ‘hello’) or the name of a packet attribute where the value will be retrieved. The packet attribute takes precedence over the Filter key.
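The precedence rule for the 'value' key might be resolved as sketched below. The function and class names are hypothetical, for illustration only.

```python
class Packet:
    """Minimal stand-in for a data packet carrying attributes."""
    pass


def resolve_value(message_value, packet):
    # If 'value' names an attribute on the packet, the attribute wins;
    # otherwise the value is treated as a literal (such as 3 or 'hello').
    if isinstance(message_value, str) and hasattr(packet, message_value):
        return getattr(packet, message_value)
    return message_value


p = Packet()
p.block_size = 512
literal = resolve_value(3, p)               # a proper value
from_attr = resolve_value("block_size", p)  # retrieved from the packet
```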
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
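The zero_inputs() pattern above can be sketched like this: initialisation and intermediate clearing share one function, so the zeroing code is written only once. The names are illustrative.

```python
class AccumulatingFilter:
    """Illustrative filter whose counters must start (and restart) at zero."""

    def __init__(self):
        self.zero_inputs()

    def zero_inputs(self):
        # Inputs, counters and carries that need resetting to zero.
        self.byte_count = 0
        self.carry = 0

    def pump_data(self, chunks):
        # Clear before each use, for a consistent one-off answer.
        self.zero_inputs()
        for chunk in chunks:
            self.byte_count += len(chunk)
        return self.byte_count


f = AccumulatingFilter()
total = f.pump_data(["abc", "de"])   # 5
total2 = f.pump_data(["xy"])         # 2, not 7: inputs were cleared first
```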
Should this be called init_filter_dynamic <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Reverse a string (in packet.data) that is at least 2 characters long. The result will be stored back into packet.data.
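A minimal sketch of this behaviour, with a simplified stand-in for the packet class:

```python
class Packet:
    """Minimal stand-in for a data packet."""

    def __init__(self, data):
        self.data = data


def filter_data(packet):
    # Reverse in place only when there are at least 2 characters.
    if len(packet.data) >= 2:
        packet.data = packet.data[::-1]
    return packet


p = filter_data(Packet("hello"))   # p.data == "olleh"
q = filter_data(Packet("x"))       # too short: left unchanged
```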
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Send a message in a bottle to a downstream filter. The message bottle will pass along the pipeline until it gets to a filter where the name matches. Then the parameter will be set to the new value.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Give each data packet a sequential number (e.g. as an ID), starting from 0 or whatever reset_counter_to is set to. To allow for looping, the seq_packet_field_name will not be overwritten. If you need to add another numberer, use a different field.
Message bottle packets don’t have to be handled, because they are not sent to filter_data().
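The don't-overwrite rule above, which lets looping packets keep their original numbers, can be sketched as follows. The field and function names are illustrative.

```python
class Packet:
    """Minimal stand-in for a data packet."""
    pass


def number_packets(packets, field_name="seq_num", reset_counter_to=0):
    counter = reset_counter_to
    for packet in packets:
        # A packet already carrying the field (e.g. one looping round)
        # keeps its existing number.
        if not hasattr(packet, field_name):
            setattr(packet, field_name, counter)
            counter += 1
    return packets


a, b = Packet(), Packet()
b.seq_num = 99               # already numbered: left alone
number_packets([a, b])
```

To add a second, independent numbering, run the packets through another numberer with a different field name.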
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Creates delimited data from attribute lists.
Takes a list of attributes (columns) and joins them into a list of rows. Each row is joined together, delimited by the separator (default ',' comma). All the rows are then joined with the end-of-line 'eol' separator and put into packet.data.
Where write_field_headers is True, attribute names are used for field headers at the start of the data.
It will accept attribute lists of different lengths. When a list runs out of entries, it is padded with ‘ ‘ (space char) values at the end.
output_format will apply a formatting to each and every value:
str : applies the string representation of the value (default)
int : applies integer base 16 to the value (requires an int, or a str of an int)
hex : applies hexadecimal to the value (requires integers)
bin : applies binary to the value (requires integers)
oct : applies octal to the value (requires integers)
method_name is the name of a method to call on each object, when the attribute list contains objects rather than literal values. The method must not require any arguments: it is intended for simple getter-style methods, and a method that needs arguments will fail.
Values such as “delete” etc. are not written to file!
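The column-to-row transformation described above, including the padding of short columns, can be sketched as below. This is a simplified illustration with hypothetical names, not the filter's actual implementation.

```python
def to_delimited(columns, headers=None, separator=",", eol="\n",
                 output_format=str):
    """Join attribute columns into delimited rows (illustrative sketch)."""
    rows = []
    if headers:
        # Field headers go at the start of the data.
        rows.append(separator.join(headers))
    depth = max(len(col) for col in columns)
    for i in range(depth):
        # Columns that have run out of entries are padded with a space char.
        row = [output_format(col[i]) if i < len(col) else " "
               for col in columns]
        rows.append(separator.join(row))
    return eol.join(rows)


text = to_delimited([[1, 2, 3], ["a", "b"]], headers=["num", "ch"])
# "num,ch\n1,a\n2,b\n3, "
```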
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Store the output of the previous filter in a results list. Check the latest addition to the list with results[-1].
Set max_results to limit the number of results stored in the list. If there are more than max_results, the oldest gets popped off the top of the list. If max_results is 0, then there is no limit, and all results are stored. The default limit is 20.
Set the sink’s capture_msgs=True in order to capture MessageBottles in addition to DataPackets.
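The bounded results list described above can be sketched like this (illustrative names; the real filter receives packets rather than bare values):

```python
class Sink:
    """Store results, discarding the oldest once max_results is exceeded."""

    def __init__(self, max_results=20):
        self.max_results = max_results   # 0 means no limit
        self.results = []

    def filter_data(self, packet_data):
        self.results.append(packet_data)
        if self.max_results and len(self.results) > self.max_results:
            self.results.pop(0)          # oldest popped off the top


sink = Sink(max_results=3)
for n in range(5):
    sink.filter_data(n)
# sink.results == [2, 3, 4]; sink.results[-1] is the latest addition
```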
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
List of sink packet data
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Simply sleeps for the specified time and sends on incoming packets. Intended for debugging purposes, watching printed output to understand the flow of a pipeline.
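The behaviour amounts to a delay-and-report pass-through, roughly as below (illustrative names):

```python
import time


def filter_data(packet, interval=0.01):
    # Pause, report, then send the packet on unchanged.
    time.sleep(interval)
    print("passing on: %r" % packet)
    return packet


result = filter_data("chunk-1")
```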
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Split the data provided into lines. Only sends on lines which have data within them. It will include any whitespace on the line.
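One plausible reading of this behaviour, as a sketch: split on line endings, keep any other whitespace on each line, and drop lines containing no data at all.

```python
def split_lines(data):
    # Keep trailing/leading whitespace on each line; drop empty lines.
    return [line for line in data.splitlines() if line.strip()]


lines = split_lines("first \n\n  second\n")
# ["first ", "  second"]
```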
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Split the data into chunks, looking for some character string to split on. Uses white space as the default.
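A sketch of this splitting, assuming the whitespace default behaves like str.split() with no argument:

```python
def split_chunks(data, split_on=None):
    if split_on is None:
        return data.split()          # any run of whitespace
    return data.split(split_on)      # an explicit character string


words = split_chunks("a b  c")       # ["a", "b", "c"]
parts = split_chunks("a;b;c", ";")   # ["a", "b", "c"]
```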
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Reverse a string (in packet.data) that is at least 2 characters long. The result will be stored back into packet.data.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Apply a tag to each packet, from a tag_field_name.
This tag value is updated by a send_tag filter setting the tag property.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_filter.TankQueue
TankQueue that sends references to the packets it holds to the branch every time a packet is received. This enables the branch to process a sliding window without duplicating any data.
Hook to execute just after a packet is sent on
Tank packet data, sorted and concatenated
Allow proper processing of data being flushed down main. Avoid catching the packet lists, which are normally sent down the branch.
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Each packet into the priority queue should be numbered; if not numbered, we set the priority to 0. In this case, the priority ordering will be on the time.time() set at the time of the push. Or should not being numbered raise an exception? TO-DO
PriorityQueue now gives a sequential number, rather than 0, so this is the default sorting mechanism.
Packets are pushed with a given priority, whereas None is pushed with a special negative priority, to ensure it comes at the front.
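The push behaviour described above can be sketched with the standard-library heapq. This is an illustration of the idea under assumed names, not FilterPype's PriorityQueue itself.

```python
import heapq
import itertools


class PriorityPush:
    """Priority queue where None jumps to the front (illustrative sketch)."""

    NONE_PRIORITY = -1   # special negative priority for None

    def __init__(self):
        self.queue = []
        self.counter = itertools.count()

    def push(self, packet, priority=None):
        if packet is None:
            priority = self.NONE_PRIORITY
        elif priority is None:
            # Default sorting: a sequential number in arrival order.
            priority = next(self.counter)
        # The second tuple element breaks ties without comparing packets.
        heapq.heappush(self.queue, (priority, next(self.counter), packet))

    def pop(self):
        return heapq.heappop(self.queue)[-1]


q = PriorityPush()
q.push("b", priority=2)
q.push("a", priority=1)
q.push(None)
first = q.pop()   # None comes out first, ahead of all real packets
```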
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
All data in tank packet, sorted in priority order
Count of packets needed to match tank_size
Adjust the tank size to change the number of packets held.
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Take a (yield)ed packet and send it to a tank_queue, but not through the (yield), to avoid “ValueError: generator already executing”. ROBDOC: This does not explain the main purpose of this filter.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
To avoid the locking up of the coroutine links by circular calls, we push packets on to a tank queue instead of sending them. But the destination is taken from the same pipeline structure.
Bases: filterpype.data_fltr_base.DataFilter
Store the input packets in a priority queue, waiting until tap_open before sending on the packets.
This decoupling is needed to allow coroutines to loop.
All packets should have been numbered by a previous filter, to ensure that the earlier packets are processed first, in particular, that looping packets are processed before new ones. If the packets haven’t been numbered before, they are given sequence number zero
If the factorial hasn’t recursed down to 1, it branches to tank_feed to calculate one less factorial. tank_feed puts the new values back into the tank_queue by pushing directly on to the tank_queue’s priority queue. The tank_queue gives the packets a sequential number to use in the priority queue, ensuring that all the recursive calculations on one packet are finished before the next packet is processed.
The packet coming into the tank_queue by the normal (yield) is pushed on to the queue. Then a while-True loop takes all the packets out of the queue for processing, which means that it completes the entire recursion before going back for the next (yield).
Fill the tank with up to tank_size packets. When next packet arrives, the first one in is sent on.
When tank_size is changed, either the excess packets are sent on, or the front end is filled out with None.
The normal send_now() behaviour is to queue all the packets until the current filter_data() function has been completed. This doesn’t work for recursive use of the tank, because the packets must keep recursing round. Therefore we have a send_now key which can be set to True when needed.
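The queueing behaviour described above can be sketched with a toy priority queue: un-numbered packets receive a sequential number so arrival order is preserved, and None is pushed with a special negative priority so it comes out first. The class and method names are illustrative, not FilterPype’s own:

```python
import heapq

class TankQueueSketch(object):
    """Toy model of the tank queue's priority ordering (names assumed)."""
    def __init__(self):
        self._heap = []
        self._seq = 0
    def push(self, packet, priority=None):
        if packet is None:
            priority = -1            # special negative priority: jumps the queue
        elif priority is None:
            self._seq += 1
            priority = self._seq     # default: sequential number, i.e. FIFO
        # _seq is also stored as a tiebreak so packets never get compared
        heapq.heappush(self._heap, (priority, self._seq, packet))
    def pop(self):
        return heapq.heappop(self._heap)[2]
```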
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Tank packet data, sorted and concatenated
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Each packet going into the priority queue should be numbered; if it is not, we set its priority to 0, and the priority ordering falls back on the time.time() recorded at the time of push. (TO-DO: should an un-numbered packet raise an exception instead?)
PriorityQueue now gives a sequential number, rather than 0, so this is the default sorting mechanism.
Packets are pushed with a given priority, whereas None is pushed with a special negative priority, to ensure it comes at the front.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
All data in tank packet, sorted in priority order
Count of packets needed to match tank_size
Adjust the tank size to change the number of packets held.
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Reset to zero an attribute in the next filter, to a value dependent on the current packet.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Send tag property of incoming packet data to a tag_packet filter, by setting its <tag_field_name> attribute.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
A waste filter just throws away all the packets it sees. This is used when combining results from branches and the main stream is not wanted.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Wrap packet data with a prefix and/or suffix. This can be used for inserting periodic strings (e.g. creating a regular file header within a stream of data) or padding missing data.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Creates a ConfigObj object and adds new sections to it from incoming packet’s data. Only accepts packet.data as a dictionary in the following format:
{ config_obj_section_name (str) : config_obj_section (dict)* }
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
When the pipeline is shutting down, write out self._config_obj if write_config is True.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
Write data from all packets to an external file.
Write the data to the file, opening the file if necessary first. File opening is left to this point, to allow the changing of the output file until after the initialisation of the generator.
So how do we know when the file has finished and needs closing? The input to read_batch could be many files, all to be written to one output file. We can’t use a timeout, so closing needs to be done explicitly, or via the closure of the pipeline.
By providing a message bottle with the message ‘change_write_suffix’ and the packet attribute ‘packet.file_name_suffix’ the current file will be closed and a new one will be opened with the suffix appended to it.
compress: currently only ‘bzip’ is enabled (true/bzip resolves to bzip2)
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Hook to execute just before a packet is sent on
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.filter_factory.FilterFactory
Make filters for the test spike
Extract the right filter dictionary from the factory_dict. Store a reference in the object to the factory that made it. Pass in the filter_attrs so the new object has the right attributes. The alternative parameter is just the ftype. Dictionary validation is enforced before the first use.
TO-DO: pipeline is (a str? an object? required??)
Bases: object
Factory class for making Filters and Pipelines
Extract the right filter dictionary from the factory_dict. Store a reference in the object to the factory that made it. Pass in the filter_attrs so the new object has the right attributes. The alternative parameter is just the ftype. Dictionary validation is enforced before the first use.
TO-DO: pipeline is (a str? an object? required??)
Returns the absolute path of the package containing abs_file_location, usually found from __file__.
Given a search path, yield all files matching the pattern. From Python Cookbook, p. 92.
Glen - 07/01/10 - Changed logic in case space is an empty string.
Single character bcd conversion.
e.g. bcd3('A', 'B', 'C', space=' ') -> '41 42 43'
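A minimal sketch of the conversion shown in the example, assuming each character maps to its two-digit hex code joined by the optional space separator (the implementation is inferred from the example, not taken from the library):

```python
def bcd3(*chars, **kwargs):
    """Sketch: each character becomes its two-digit upper-case hex code,
    joined by `space` (which may be the empty string)."""
    space = kwargs.get('space', '')
    return space.join('%02X' % ord(c) for c in chars)
```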
Add the bits in all chars
ConfigObj puts strings with commas into a list, converting to integers where possible. Reconstruct the original string, keeping string types, by rejoining the list items with a comma, if present.
Apply convert_config_str for each item in the list.
With a string from the config file, interpret it in order: 1) as a number (integer, float or hex); 2) as a Boolean; 3) as a space; 4) as a list; 5) as a dictionary; 6) else return the original string.
Use this function for converting a list of integers into a hex representation in a format determined by mode.
E.g. if the input is [1, 2, 3]:
Mode 1 -> ['0x1', '0x2', '0x3']
Mode 2 -> '0x010203'
Mode 3 -> 'x01x02x03'
Mode 4 -> '01 02 03'
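The four modes can be sketched as follows (the function name and exact mode semantics are assumptions based on the table above):

```python
def ints_to_hex(values, mode):
    """Sketch of the four output modes (name and details assumed)."""
    if mode == 1:
        return [hex(v) for v in values]                    # list of '0x..' strings
    if mode == 2:
        return '0x' + ''.join('%02x' % v for v in values)  # one '0x..' run
    if mode == 3:
        return ''.join('x%02x' % v for v in values)        # 'x..' per byte
    if mode == 4:
        return ' '.join('%02x' % v for v in values)        # space-separated
    raise ValueError('unknown mode: %r' % mode)
```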
Convert a data string into its 2s-complement value. E.g. the byte string 0xF2 0x2C 0x41 0x58 (i.e. 0xF22C4158) --> -231980712 (in 2s complement).
NOTE: values must be strings. NOTE: please only give chars (1 byte), shorts (2 bytes) or longs (4 bytes); anything else will currently return an error.
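A sketch of the conversion using the standard struct module, restricted to the 1-, 2- and 4-byte lengths the note allows (function name assumed):

```python
import struct

def string_to_signed(data):
    """Sketch: interpret a big-endian byte string of length 1, 2 or 4 as a
    signed (two's-complement) integer; other lengths raise an error."""
    fmt = {1: '>b', 2: '>h', 4: '>i'}.get(len(data))
    if fmt is None:
        raise ValueError('expected 1, 2 or 4 bytes, got %d' % len(data))
    return struct.unpack(fmt, data)[0]
```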
For each number in the tuple, return one less, converting 1-based counting to 0-based.
Return the hexadecimal representation of a string of bytes, so that it can be printed and compared with the display of a hex editor.
e.g. 'ABC' --> '41 42 43'
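A minimal sketch of such a hex dump (function name assumed):

```python
def data_to_hex(data):
    """Sketch: render each byte as two hex digits, space-separated,
    matching a hex editor's display."""
    return ' '.join('%02X' % b for b in bytearray(data))
```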
Return the type of data passed in.
Print the text to stdout if the module-level “debug” setting is >= level; i.e. if level is 0, it will always be printed. The higher the level, the less likely it is to be printed. The “>” prefix is inserted just to show which print statements have been converted to dbg_print.
Essential key values may be set after a ‘:’ for each essential key, e.g. foo:35:fred:${baz}. This provides three key values to the filter foo: 35 (as an integer), ‘fred’ (as a string) and baz (as a variable to be substituted).
Get the word interval and number of results per superframe, given the words per second and hertz
Note that integer results are returned, so if the inputs are not sensible values and a calculation results in non-zero values on the right-hand side of the decimal point, an inaccurate (truncated) result will be given.
E.g. inputs WPS=256, HZ=3: real interval 85.333…, real number of results 48.0; function output (85, 48).
Read string data as hexadecimal bytes, and return as a data string, e.g. '41 42 43' --> 'ABC'.
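A sketch of this inverse conversion, using int(token, 16) for the base-16 parse (function name assumed):

```python
def hex_to_data(hex_string):
    """Sketch: parse space-separated hex byte values back into a byte
    string; int() needs its second parameter for non-base-10 strings."""
    return bytes(int(tok, 16) for tok in hex_string.split())
```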
Note the use of the int() function, with a second parameter. I hadn’t realised that this was necessary for converting anything apart from base 10 strings.
Converts a long from host order to a string of length 4 in network order (big-endian).
Equivalent of C OS library function of the same name.
Source: http://www.java2s.com/Tutorial/Python/0280__Buildin-Module/NetworkByteOrder.htm
Converts a short from host order to a string of length 2 in network order (big-endian).
Equivalent of C OS library function of the same name.
Source: http://www.java2s.com/Tutorial/Python/0280__Buildin-Module/NetworkByteOrder.htm
Take a list with some repeated keys which have different default values. Return a single copy of each key, carrying its latest value, in the position of the key’s first occurrence. e.g.
keys_in = [‘abc’, ‘def:0’, ‘fred’, ‘jane:ergo’, ‘def:3’] keys_out = [‘abc’, ‘def:3’, ‘fred’, ‘jane:ergo’]
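The behaviour in the example can be sketched as (function name assumed):

```python
def remove_duplicate_keys(keys_in):
    """Sketch: keep each key's latest value, at the position of the key's
    first occurrence. Key and value are separated by ':'."""
    latest = {}
    order = []
    for item in keys_in:
        key = item.split(':', 1)[0]
        if key not in latest:
            order.append(key)      # remember first-occurrence position
        latest[key] = item         # later occurrences overwrite the value
    return [latest[k] for k in order]
```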
Ease the dictionary making process by removing the need for quoting the keys. *args is for tuples coming from an existing dictionary, using the *adict.items() form. **kwargs forms keyword parameters into a dictionary.
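A sketch of such a dict-building helper, accepting both (key, value) tuples and keyword arguments (name assumed):

```python
def make_dict(*args, **kwargs):
    """Sketch: build a dict from (key, value) tuples (e.g. passed via
    *adict.items()) and/or keyword arguments, so keys need no quoting
    at the call site."""
    result = dict(args)        # tuples from an existing dictionary
    result.update(kwargs)      # unquoted keyword keys
    return result
```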
How many nibbles needed to show required numbers of bits?
Converts a string of length four from network order (big-endian) to a long in host order.
Equivalent of C OS library function of the same name.
Source: http://www.java2s.com/Tutorial/Python/0280__Buildin-Module/NetworkByteOrder.htm
Converts a string of length two from network order (big-endian) to a short in host order.
Equivalent of C OS library function of the same name.
Source: http://www.java2s.com/Tutorial/Python/0280__Buildin-Module/NetworkByteOrder.htm
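Both directions can be sketched with the struct module’s network-order (‘!’) format; ‘!I’ handles the 4-byte longs and ‘!H’ would handle the 2-byte shorts (function names assumed):

```python
import struct

def htonl_str(value):
    """Sketch: long in host order -> big-endian byte string of length 4."""
    return struct.pack('!I', value)

def ntohl_str(data):
    """Sketch: 4-byte big-endian string -> long in host order."""
    return struct.unpack('!I', data)[0]
```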
Pad input string to the length with pad_char. If length < length(text), return the whole string.
redirect(func, ...) --> (output string result, func’s return value). (See p. 257 of Python in a Nutshell.) func must be a callable and may emit results to standard output. Capture those results as a string and return a pair: the print output and func’s return value.
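A sketch of this capture-and-return pattern (assuming a modern Python, so io.StringIO stands in for whatever buffer the original uses):

```python
import sys
from io import StringIO

def redirect(func, *args, **kwargs):
    """Sketch: call func, capturing anything it prints to standard output.
    Returns (captured output, func's return value)."""
    saved, sys.stdout = sys.stdout, StringIO()
    try:
        result = func(*args, **kwargs)
        return sys.stdout.getvalue(), result
    finally:
        sys.stdout = saved     # always restore the real stdout
```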
Return a string with all unprintable characters replaced with ‘?’
Create a generator to return all pypes in the pypes directory.
Returns a unique random file name generated as a UUID string. This is statistically guaranteed to avoid a name clash.
Removes all punctuation chars from string_in
Punctuation removed: !"#$%&'()*+,-./:;<=>?@[]^_`{|}~
Return the input string as a list of strings, split into split_size pieces. The last piece sent may be an incomplete split.
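A sketch of the splitting (function name assumed):

```python
def split_strings(text, split_size):
    """Sketch: split text into split_size pieces; the last piece may be
    an incomplete split."""
    return [text[i:i + split_size] for i in range(0, len(text), split_size)]
```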
Remove any trailing digits and underscore from a string.
Remove the text up to and including the separator, if there is one, or else return the original text.
Remove the same number of spaces from the front of each line, so that at least one line has no leading spaces. Return a list of the stripped lines. Ignore blank lines, even though they may have leading spaces.
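A sketch of the common-margin stripping (function name assumed; blank lines are ignored when measuring the margin, per the description):

```python
def strip_leading_spaces(lines):
    """Sketch: remove the common leading-space count from every line, so
    at least one non-blank line has no leading spaces."""
    margins = [len(ln) - len(ln.lstrip(' '))
               for ln in lines if ln.strip()]    # ignore blank lines
    margin = min(margins) if margins else 0
    return [ln[margin:] for ln in lines]
```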
Tokenises and parses the route part of a filter config in a pipeline.
Remember to delete lextab.py/.pyc and parsetab.py/.pyc if anything changes, otherwise the previous version will be used.
Bases: object
Parse the pipeline route, defined by a list of filters, with embedded parentheses.
branch : LPAREN start_branch pipe RPAREN end_branch
end_branch :
Error rule for syntax errors
pipe : FILTER
pipe : pipe JOINTO FILTER
pipe : pipe JOINTO branch FILTER
start_branch :
(#).*
(([_A-Za-z.:-%${}])(([0-9])|([_A-Za-z.:-%${}]))*)
n+
Bases: filterpype.pipeline.Pipeline
Check that parameters can be set in various ways.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO >>>> Return the filter from the _filter_dict if it has already been made, else make the filter and then return it. This is necessary now because of allowing an empty config: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO >>>> Return the filter from the _filter_dict if it has already been made, else make the filter and then return it. This is necessary now because of allowing an empty config: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Take in a file object via data stream, and write it to a named file.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO >>>> Return the filter from the _filter_dict if it has already been made, else make the filter and then return it. This is necessary now because of allowing an empty config: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO >>>> Return the filter from the _filter_dict if it has already been made, else make the filter and then return it. This is necessary now because of allowing an empty config: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
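The flag-based shutdown described above can be sketched as follows. This is a simplified stand-in: the real refinery drives generators, but the handshake (shut_down() sets a flag, a loop polls the read-only property and exits cleanly) is the same shape.

```python
class Refinery:
    """Sketch of the shutdown handshake. shut_down() only sets a flag,
    because calling close() on a generator that is still executing
    raises ValueError: generator already executing."""
    def __init__(self):
        self._shutting_down = False

    @property
    def shutting_down(self):          # read-only property
        return self._shutting_down

    def shut_down(self):
        self._shutting_down = True

    def run(self, items):
        processed = []
        for item in items:
            if self.shutting_down:    # the loop checks the flag...
                break                 # ...and exits cleanly
            processed.append(item)
            if item == "stop":        # something inside requests shutdown
                self.shut_down()
        return processed
```

The actual close can then happen safely once the loop has unwound, rather than while the generator chain is mid-send.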
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
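The zero_inputs() pattern can be sketched like this. The filter below is hypothetical; the point is that all resettable state lives in one method, so initialisation and repeated clearing share the same code.

```python
class CarryFilter:
    """Sketch of a filter with carry/remainder state that must be
    cleared before each one-off use, e.g. before pump_data()."""
    def __init__(self):
        # Initialisation reuses the same clearing code.
        self.zero_inputs()

    def zero_inputs(self):
        """Reset all inputs and counters to zero."""
        self.carry = 0
        self.count = 0

    def filter_data(self, value):
        total = self.carry + value
        self.carry = total % 10   # remainder carried to the next packet
        self.count += 1
        return total // 10
```

Calling zero_inputs() between runs guarantees a consistent one-off answer instead of one contaminated by a previous run's carry.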
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Adds basic compression to the Copy File example above, along with a callback environment to report on read progress.
The source file name is optional: the filename or file object can also be supplied as the data parameter within the first packet passed into the pipeline.
Uses callback and environ to make progress reports.
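The compress-and-report idea can be sketched in plain Python. This is an illustrative stand-alone function, not the pipeline's implementation; the callback signature (bytes done, total bytes) is an assumption.

```python
import zlib

def copy_with_progress(data, callback, chunk_size=4096):
    """Compress the input chunk by chunk, calling back with the
    number of bytes processed so far out of the total."""
    comp = zlib.compressobj()
    out = []
    done = 0
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        out.append(comp.compress(chunk))
        done += len(chunk)
        callback(done, len(data))  # progress report after each chunk
    out.append(comp.flush())       # emit any buffered compressed data
    return b"".join(out)
```

Reporting per chunk rather than per byte keeps the callback overhead negligible even on large files.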
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets internal last_filter to have same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily attaching a Sink to the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Splits data into lines and extracts attributes based on a delimiter parameter; the attributes are stored in the packet.
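The extraction step can be sketched as a plain function. This is an illustration, not the pipeline's code: the default delimiter and the returned dictionary (standing in for attributes stored on the packet) are assumptions.

```python
def extract_attributes(data, delimiter="="):
    """Split data into lines, then split each line on the delimiter
    into key/value pairs, as would be stored on the packet."""
    attributes = {}
    for line in data.splitlines():
        if delimiter in line:
            key, _, value = line.partition(delimiter)
            attributes[key.strip()] = value.strip()
    return attributes
```

Lines without the delimiter are simply skipped, so interleaved non-attribute lines do not break the extraction.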
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets internal last_filter to have same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily attaching a Sink to the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Splits data by split_on_str and extracts attributes based on a delimiter parameter; the attributes are stored in the packet.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets internal last_filter to have same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily attaching a Sink to the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.data_fltr_base.DataFilter
A pipeline is a filter that has a filters dictionary, which keys all the filters in the pipeline by name.
At this higher level of functionality, the mechanism of how the filters work need not be known.
The pipeline needs to know which factory to use to make its data filters. This would not normally change, so is passed in with the constructor.
To make testing easier, we define an input binary file name, and one or more output files.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets internal last_filter to have same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily attaching a Sink to the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
For testing automatic pipeline creation
BL - 28/8/09 - Added a config to force it to have an ftype, as a test was failing in FDS-specific files due to it being missing.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets internal last_filter to have same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily attaching a Sink to the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets internal last_filter to have same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily attaching a Sink to the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: object
Bases: filterpype.pipeline.Pipeline
Recursion to calculate factorials.
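The calculation the Factorial demo pipeline performs can be written directly as a plain recursive function (this is only the underlying arithmetic, not the pipeline's packet-recursion machinery):

```python
def factorial(n):
    """Recursive factorial: n! = n * (n-1)!, with 0! = 1! = 1."""
    if n <= 1:
        return 1
    return n * factorial(n - 1)
```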
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets internal last_filter to have same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily attaching a Sink to the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Test inner pipeline class
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
TO-DO: Return the filter from the _filter_dict if it has already been made; otherwise make the filter and then return it. This is necessary now because an empty config is allowed: referencing a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets internal last_filter to have same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily attaching a Sink to the end, returning all the packets received. Remember to close the pipeline externally if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
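The flag-based shutdown can be sketched as below. Calling close() on a generator from code the generator itself is running raises "ValueError: generator already executing", so the flag is set instead and a busy loop checks it. The class and method names here are made up for illustration.

```python
# Sketch of the shutting_down flag: rather than closing the busy generator
# directly, raise a flag and let the processing loop check it and stop
# tidily. Names are made up for illustration, not the real FilterPype API.

class RefinerySketch:
    def __init__(self):
        self._shutting_down = False

    @property
    def shutting_down(self):
        return self._shutting_down     # read-only: no setter is exposed

    def shut_down(self):
        self._shutting_down = True     # just raise the flag; close later

    def run(self, items):
        processed = []
        for item in items:
            if self.shutting_down:     # the loop checks the flag...
                break                  # ...and can then close down tidily
            processed.append(item)
            if item == "stop":
                self.shut_down()       # safe: no generator is interrupted
        return processed
```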
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.ppln_demo.Freda
Bases: filterpype.pipeline.Pipeline
Takes string input and returns a reversed string.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; else make the filter and then return it. This is necessary now because an empty config is allowed: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally, if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Simple loop to print counter.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; else make the filter and then return it. This is necessary now because an empty config is allowed: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally, if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Test inner small baz pipeline
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; else make the filter and then return it. This is necessary now because an empty config is allowed: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally, if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Test simple function to square numbers in a pipeline.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; else make the filter and then return it. This is necessary now because an empty config is allowed: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally, if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Spike for putting data into multiple pipes.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; else make the filter and then return it. This is necessary now because an empty config is allowed: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally, if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; else make the filter and then return it. This is necessary now because an empty config is allowed: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally, if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; else make the filter and then return it. This is necessary now because an empty config is allowed: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally, if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: Should this be called init_filter_dynamic?
Bases: filterpype.pipeline.Pipeline
Take each word in the data and capitalise.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
This is the minimum functionality for a pipeline that does nothing. But some pipelines may want to override this, setting parameters before sending any data.
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
TO-DO: Return the filter from the _filter_dict if it has already been made; else make the filter and then return it. This is necessary now because an empty config is allowed: a reference to a filter will create it. Test the name first for being in the format “some_name:23”, where 23 is the value of the first key.
Initialise coroutine variables that are needed before we reach the (yield) statement. Override in subclasses.
Location of embedded Python module
Name for filter/pipeline
Sets the internal last_filter to have the same next_filter
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Put the data into the pipeline by wrapping it in a DataPacket. Get the data out by temporarily sticking a Sink on the end, returning all the packets received. Remember to close the pipeline externally, if tidying-up actions need to be performed.
If zero_inputs is True, reinitialise the variables before pumping the new data in. (TO-DO: zero_inputs)
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Update filter parameter values after the creation of all the filters
Update pipeline route after the creation of all the filters
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Adds the numbers in the input list, passing on the result in the packet. We’re expecting either a Python list of numbers, or a comma-separated string to be read as a list.
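The adding behaviour can be sketched as a plain function standing in for the filter’s filter_data(); the name `sum_input` is illustrative, not the FilterPype method:

```python
def sum_input(value):
    """Sum a list of numbers, or a comma-separated string read as a list."""
    if isinstance(value, str):
        value = [float(part) for part in value.split(',')]
    return sum(value)

print(sum_input([1, 2, 3]))        # 6
print(sum_input('4, 5.5, 0.5'))    # 10.0
```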
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Count how many of each byte value have passed through the filter.
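The counting behaviour amounts to a running tally keyed by byte value; this sketch uses a standalone class as a stand-in for the filter’s internal counter:

```python
from collections import Counter

class ByteCounter(object):
    """Tally how many of each byte value have passed through."""
    def __init__(self):
        self.counts = Counter()

    def filter_data(self, data):
        self.counts.update(bytearray(data))  # count each byte value
        return data                          # pass the data on unchanged

bc = ByteCounter()
bc.filter_data(b'abca')
print(bc.counts[ord('a')])  # 2
```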
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Capitalise the first character of the packet’s data.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Experiment to see whether we can loop recursively with filters. We can’t do so directly: the generator raises “ValueError: generator already executing”. The solution is to feed the packet back round a branch instead. Packets in will have the attributes:
‘x’ – the number to get the factorial of; ‘pending’ – the intermediate calculation; ‘x_factorial’ – the result.
Until the calculation is finished, the packet will be sent via the branch to the tank_queue that feeds this filter. (TO-DO: rewrite)
Follow FactorialCalc by BranchIf:recurse
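The feedback trick can be sketched with an explicit queue: unfinished packets are re-queued rather than recursed into, which is what avoids the “generator already executing” error. The dictionary keys follow the attribute names above; the deque is a stand-in for the tank_queue:

```python
from collections import deque

def factorial_via_queue(x):
    """Compute x! by re-queuing unfinished packets instead of recursing."""
    queue = deque([{'x': x, 'pending': 1}])
    while queue:
        packet = queue.popleft()
        if packet['x'] > 1:
            packet['pending'] *= packet['x']
            packet['x'] -= 1
            queue.append(packet)   # branch back to the feeding queue
        else:
            packet['x_factorial'] = packet['pending']
            return packet['x_factorial']

print(factorial_via_queue(5))  # 120
```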
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Only exists to test pipeline key substitution.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
If an input is an integer, or a string representation of an integer, return a multiple of the number as a string; otherwise return the original string, in a packet.
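A sketch of that behaviour as a plain function; the `factor` parameter is an illustrative assumption, since the actual multiplier parameter is not shown here:

```python
def multiply_or_pass(value, factor=3):
    """Return factor * value as a string for integer input, else pass through."""
    try:
        return str(int(value) * factor)
    except (TypeError, ValueError):
        return value

print(multiply_or_pass(7))      # '21'
print(multiply_or_pass('7'))    # '21'
print(multiply_or_pass('abc'))  # 'abc'
```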
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Print first part of data block in hex.
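The hex-printing idea can be sketched as taking the first few bytes and formatting each as two hex digits; the 8-byte limit is an illustrative choice, not the filter’s actual parameter:

```python
def hex_head(data, limit=8):
    """Format the first `limit` bytes of a data block in hex."""
    head = bytearray(data[:limit])
    return ' '.join('%02X' % b for b in head)

print(hex_head(b'\x01\xff ABC'))  # '01 FF 20 41 42 43'
```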
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Filter to return a reversed string for each yielded packet
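As a minimal coroutine sketch (a stand-in, not the FilterPype class), each string sent in is reversed and handed to a results list standing in for the next filter:

```python
def reverse_filter(results):
    """Toy coroutine: reverse each string sent in."""
    while True:
        data = (yield)
        results.append(data[::-1])

out = []
flt = reverse_filter(out)
next(flt)            # prime the generator before sending data
flt.send('abc')
flt.send('hello')
print(out)           # ['cba', 'olleh']
```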
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Print a character to the screen to show progress
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
At the end of the pipeline, print an extra new line before any final output from the calling program.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
If a string input is a number, return the square of the number, as a number; otherwise leave the packet alone.
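A sketch of the squaring behaviour as a plain function (an illustration, not the filter’s actual filter_data() signature):

```python
def square_or_pass(value):
    """Square numeric string input, returning a number; else pass through."""
    try:
        number = float(value)
    except (TypeError, ValueError):
        return value
    squared = number * number
    # Return an int where the result is whole, to keep '4' -> 16 tidy
    return int(squared) if squared == int(squared) else squared

print(square_or_pass('4'))    # 16
print(square_or_pass('1.5'))  # 2.25
print(square_or_pass('abc'))  # 'abc'
```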
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
However much the bits are shuffled or padded with zeros, their sum should not change.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
However much the bytes are shuffled, their sum should not change.
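The invariant these checking filters rely on can be demonstrated directly: shuffling the bytes of a block never changes their sum (and the same holds for bit and nibble groups padded with zeros):

```python
import random

def byte_sum(data):
    """Sum of all byte values in a data block."""
    return sum(bytearray(data))

original = b'filterpype'
shuffled = bytearray(original)
random.shuffle(shuffled)          # reorder the bytes arbitrarily

print(byte_sum(original) == byte_sum(bytes(shuffled)))  # True
```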
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
However much the nibbles are shuffled or padded with zeros, their sum should not change.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
Bases: filterpype.data_fltr_base.DataFilter
Filter to return data with some bits switched
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed, specific to one filter. In this case, override the open_message_bottle() in the filter. e.g. WriteFile uses this to close one file and open a new file with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
TO-DO: should this be called init_filter_dynamic()?
Bases: filterpype.data_fltr_base.DataFilter
Insert a space between each character in the data.
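The space-inserting behaviour, and the priming step mentioned below, can be illustrated with a bare generator coroutine. This is a simplified stand-in for the FilterPype machinery (no DataPacket or message bottles), assuming only the coroutine send/prime pattern the documentation describes.

```python
# Illustrative coroutine filter: receives data and sends it on to the
# next stage with a space inserted between each character.
def embed_spaces(target):
    while True:
        data = (yield)
        target.send(' '.join(data))


# Trivial end-of-pipeline sink that collects whatever it receives.
def sink(results):
    while True:
        results.append((yield))


results = []
out = sink(results)
next(out)                 # prime the sink generator
flt = embed_spaces(out)
next(flt)                 # prime the filter, as _prime() does
flt.send('abc')
# results now holds ['a b c']
```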
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed that is specific to one filter. In that case, override open_message_bottle() in the filter; for example, WriteFile uses this to close one file and open a new one with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
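One way to picture the ordering above is a sketch in which validation runs first and initialisation then derives values from the checked inputs. The class and parameter names here are invented for illustration; they are not part of FilterPype.

```python
# Sketch of the validate-then-init ordering: validate_params() is called
# before init_filter(), so init_filter() can safely derive parameters
# from inputs that have already been checked.
class BlockReader:
    def __init__(self, block_size):
        self.block_size = block_size
        self.validate_params()
        self.init_filter()

    def validate_params(self):
        if self.block_size <= 0:
            raise ValueError('block_size must be positive')

    def init_filter(self):
        # Derived parameter, computed only after validation has passed.
        self.half_block = self.block_size // 2
```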
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Put string after the data.
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed that is specific to one filter. In that case, override open_message_bottle() in the filter; for example, WriteFile uses this to close one file and open a new one with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
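The zero_inputs() idea above can be sketched as follows. This is an illustrative class, not a real FilterPype filter: the same clearing code runs at initialisation and again before each one-off pump_data() call, so carries from a previous use cannot leak into the next answer.

```python
# Sketch of the zero_inputs() pattern: counters are cleared in one place,
# avoiding duplicated reset code wherever clearing is needed.
class AveragingFilter:
    def __init__(self):
        self.zero_inputs()

    def zero_inputs(self):
        self.total = 0
        self.count = 0

    def pump_data(self, values):
        self.zero_inputs()           # fresh state for a one-off answer
        for v in values:
            self.total += v
            self.count += 1
        return self.total / float(self.count)
```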
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<
Bases: filterpype.data_fltr_base.DataFilter
Put string in front of the data.
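The append and prepend behaviours described in this section can be sketched together as chained coroutine stages. This is a simplified analogue of the FilterPype pipeline, with invented names; a single attach() stage plays both roles depending on whether it is given a prefix or a suffix.

```python
# Illustrative coroutine stage: sends data on with an optional prefix
# put in front and an optional suffix put after, mirroring the two
# filters described above.
def attach(target, prefix='', suffix=''):
    while True:
        data = (yield)
        target.send(prefix + data + suffix)


# Trivial end-of-pipeline sink that collects whatever it receives.
def sink(results):
    while True:
        results.append((yield))


results = []
out = sink(results)
next(out)                                    # prime each generator
appender = attach(out, suffix='>>')
next(appender)
prepender = attach(appender, prefix='<<')
next(prepender)
prepender.send('data')
# results now holds ['<<data>>']
```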
Override this to perform some action just after the incoming packet is processed by filter_data().
Hook to execute just after a packet is sent on
Override this to perform some action just before the incoming packet is processed by filter_data().
Hook to execute just before a packet is sent on
Override this to add to the closing functionality before the filter is finally closed.
Embedded Python module
Override this for clearing out any buffered data: always before final close_filter(), but may be needed at intermediate stages.
Location of embedded Python module
Name for filter/pipeline
Open the message bottle, and take appropriate action. We know what to do with these general purpose commands, applying generally to any filter:
reset
Other functionality may be needed that is specific to one filter. In that case, override open_message_bottle() in the filter; for example, WriteFile uses this to close one file and open a new one with a different name.
Top-level pipeline
For the first filter in the pipeline, send in the starting data. This must be a DataPacket object.
If not already primed, call _prime() to send a next() call to the generator.
Send on packets to their destination.
Shut down the pipeline/filter by setting a shutting_down flag. It doesn’t work to pass close() straight to the first filter, because the generator sequence is still busy and comes back with an error message: “ValueError: generator already executing”
Try to close the pipeline, but don’t worry if we can’t because it’s busy: there must be a loop somewhere that should be checking the shutting_down read-only property.
Is refinery shutting down (read-only)
Override this function to check validity of input parameters. This is called before init_filter(), which may use the input parameters to calculate derived params.
During initialisation of a filter, there may be inputs and counters that need to be set to zero. By putting these in a separate function, we avoid duplicating code where clearing is required repeatedly.
Some filters that use carries and remainders may require clearing inputs before each use, to give a consistent one-off answer, for something like pump_data().
zero_inputs() is called by the closing() context manager, but not until the generator receives its first data packet.
Should this be called init_filter_dynamic? <<<<<< TO-DO <<<<<<