Initial import.
git-svn-id: https://spiff-workflow.googlecode.com/svn/trunk@2 e8300cfb-4550-0410-bd11-c1b1d6e01c17
This commit is contained in: parent 17e06d6ca4, commit a7e4ffc8b3
@@ -0,0 +1,6 @@
*.py[co]
dist
build
*.egg-info
handbook
unit_test.cfg
@@ -0,0 +1,165 @@
                   GNU LESSER GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.


  This version of the GNU Lesser General Public License incorporates
the terms and conditions of version 3 of the GNU General Public
License, supplemented by the additional permissions listed below.

  0. Additional Definitions.

  As used herein, "this License" refers to version 3 of the GNU Lesser
General Public License, and the "GNU GPL" refers to version 3 of the GNU
General Public License.

  "The Library" refers to a covered work governed by this License,
other than an Application or a Combined Work as defined below.

  An "Application" is any work that makes use of an interface provided
by the Library, but which is not otherwise based on the Library.
Defining a subclass of a class defined by the Library is deemed a mode
of using an interface provided by the Library.

  A "Combined Work" is a work produced by combining or linking an
Application with the Library.  The particular version of the Library
with which the Combined Work was made is also called the "Linked
Version".

  The "Minimal Corresponding Source" for a Combined Work means the
Corresponding Source for the Combined Work, excluding any source code
for portions of the Combined Work that, considered in isolation, are
based on the Application, and not on the Linked Version.

  The "Corresponding Application Code" for a Combined Work means the
object code and/or source code for the Application, including any data
and utility programs needed for reproducing the Combined Work from the
Application, but excluding the System Libraries of the Combined Work.

  1. Exception to Section 3 of the GNU GPL.

  You may convey a covered work under sections 3 and 4 of this License
without being bound by section 3 of the GNU GPL.

  2. Conveying Modified Versions.

  If you modify a copy of the Library, and, in your modifications, a
facility refers to a function or data to be supplied by an Application
that uses the facility (other than as an argument passed when the
facility is invoked), then you may convey a copy of the modified
version:

   a) under this License, provided that you make a good faith effort to
   ensure that, in the event an Application does not supply the
   function or data, the facility still operates, and performs
   whatever part of its purpose remains meaningful, or

   b) under the GNU GPL, with none of the additional permissions of
   this License applicable to that copy.

  3. Object Code Incorporating Material from Library Header Files.

  The object code form of an Application may incorporate material from
a header file that is part of the Library.  You may convey such object
code under terms of your choice, provided that, if the incorporated
material is not limited to numerical parameters, data structure
layouts and accessors, or small macros, inline functions and templates
(ten or fewer lines in length), you do both of the following:

   a) Give prominent notice with each copy of the object code that the
   Library is used in it and that the Library and its use are
   covered by this License.

   b) Accompany the object code with a copy of the GNU GPL and this license
   document.

  4. Combined Works.

  You may convey a Combined Work under terms of your choice that,
taken together, effectively do not restrict modification of the
portions of the Library contained in the Combined Work and reverse
engineering for debugging such modifications, if you also do each of
the following:

   a) Give prominent notice with each copy of the Combined Work that
   the Library is used in it and that the Library and its use are
   covered by this License.

   b) Accompany the Combined Work with a copy of the GNU GPL and this license
   document.

   c) For a Combined Work that displays copyright notices during
   execution, include the copyright notice for the Library among
   these notices, as well as a reference directing the user to the
   copies of the GNU GPL and this license document.

   d) Do one of the following:

       0) Convey the Minimal Corresponding Source under the terms of this
       License, and the Corresponding Application Code in a form
       suitable for, and under terms that permit, the user to
       recombine or relink the Application with a modified version of
       the Linked Version to produce a modified Combined Work, in the
       manner specified by section 6 of the GNU GPL for conveying
       Corresponding Source.

       1) Use a suitable shared library mechanism for linking with the
       Library.  A suitable mechanism is one that (a) uses at run time
       a copy of the Library already present on the user's computer
       system, and (b) will operate properly with a modified version
       of the Library that is interface-compatible with the Linked
       Version.

   e) Provide Installation Information, but only if you would otherwise
   be required to provide such information under section 6 of the
   GNU GPL, and only to the extent that such information is
   necessary to install and execute a modified version of the
   Combined Work produced by recombining or relinking the
   Application with a modified version of the Linked Version.  (If
   you use option 4d0, the Installation Information must accompany
   the Minimal Corresponding Source and Corresponding Application
   Code.  If you use option 4d1, you must provide the Installation
   Information in the manner specified by section 6 of the GNU GPL
   for conveying Corresponding Source.)

  5. Combined Libraries.

  You may place library facilities that are a work based on the
Library side by side in a single library together with other library
facilities that are not Applications and are not covered by this
License, and convey such a combined library under terms of your
choice, if you do both of the following:

   a) Accompany the combined library with a copy of the same work based
   on the Library, uncombined with any other library facilities,
   conveyed under the terms of this License.

   b) Give prominent notice with the combined library that part of it
   is a work based on the Library, and explaining where to find the
   accompanying uncombined form of the same work.

  6. Revised Versions of the GNU Lesser General Public License.

  The Free Software Foundation may publish revised and/or new versions
of the GNU Lesser General Public License from time to time. Such new
versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.

  Each version is given a distinguishing version number. If the
Library as you received it specifies that a certain numbered version
of the GNU Lesser General Public License "or any later version"
applies to it, you have the option of following the terms and
conditions either of that published version or of any later version
published by the Free Software Foundation. If the Library as you
received it does not specify a version number of the GNU Lesser
General Public License, you may choose any version of the GNU Lesser
General Public License ever published by the Free Software Foundation.

  If the Library as you received it specifies that a proxy can decide
whether future versions of the GNU Lesser General Public License shall
apply, that proxy's public statement of acceptance of any version is
permanent authorization for you to choose that version for the
Library.
@@ -0,0 +1,127 @@
2008-07-22 Samuel Abels <http://debain.org>

* all-over: Major cleanups/redesign, got rid of a lot of useless code.

2007-09-09 Samuel Abels <http://debain.org>

* Workflow/Job.py: Added is_completed(), get_waiting_tasks() and
  execute_task_from_id().

2007-09-03 Samuel Abels <http://debain.org>

* tests/*: Changed tests and directory structure to support testing data
  patterns. Added a test for the Block Data, Task To Task, Block Task To
  Sub-Workflow Decomposition, and Sub-Workflow Decomposition to Block Task
  patterns.
* Tasks/Task.py, Storage/XmlReader: Support local task data in the form of
  properties (Task Data pattern).

2007-09-02 Samuel Abels <http://debain.org>

* *.py, Tasks/SubWorkflow.py: Support dynamic loading of sub-workflows. This
  also supports the Recursion pattern.
* *.py: Support the assignment of attributes in the XML documents.
* Tasks/Cancel.py: Support Explicit Termination.
* Tasks/Trigger.py: Support Persistent Trigger.

2007-09-01 Samuel Abels <http://debain.org>

* tests: Support Interleaved Routing.
* BranchNode.py, Tasks/*.py: Fixed several bugs related to task cancellation.
* Tasks/CancelTask.py: Implemented the Cancel Task and Cancel Region
  patterns.
* Tasks/CancelJob.py: Implemented the Cancel Case pattern.
* Tasks/Join.py: Support the Cancel Multiple Instance Task and Complete
  Multiple Instance Task patterns.
* all-over: Use task names instead of task references to address a task, in
  order to relax the strict ordering in which tasks needed to be instantiated.

2007-08-31 Samuel Abels <http://debain.org>

* Activities are now "Tasks", to be consistent with WorkflowPatterns
  terminology.
* Tasks/Choose.py, Tasks/MultiChoice.py: Add support for the
  Deferred Choice pattern.
* Tasks/AcquireMutex.py, Tasks/ReleaseMutex.py: Add support for Deferred
  Choice and Interleaved Parallel Routing.
* Tasks/Gate.py: Support the Milestone pattern.
* Tasks/Task.py: Relaxed constraint checking such that it is now allowed
  to create a task that has no outputs.
* Tasks/StubTask.py: Removed. (obsolete)

2007-08-30 Samuel Abels <http://debain.org>

* Activities/Join.py: Add support for (Static|Cancelling|Dynamic) Partial
  Join for Multiple Instances.

2007-08-29 Samuel Abels <http://debain.org>

* Release 0.0.2

2007-08-29 Samuel Abels <http://debain.org>

* Activities/Join.py, Activities/Activity.py: Support General Synchronizing
  Merge.
* Activities/Thread*.py: New, support for ThreadMerge and ThreadSplit
  patterns.

2007-08-27 Samuel Abels <http://debain.org>

* BranchNode.py, Activities/*.py: Implemented path prediction.

2007-08-26 Samuel Abels <http://debain.org>

* Activities/Join.py: Replaces Synchronization and Discriminator, now
  newly supporting: Cancelling Partial Join, Blocking Partial Join,
  Structured Partial Join, Cancelling Discriminator, Blocking Discriminator,
  Generalized AND-Join, and Acyclic Synchronizing Merge.
* Activities/Synchronization.py: Replaced by Join.
* Activities/Discriminator.py: Replaced by Join.
* Activities/*.py: Removed the need to call completed_notify in every
  activity.
* Activities/AddInstance.py: Removed.
* Activities/Trigger.py: Replaces AddInstance.py.

2007-08-25 Samuel Abels <http://debain.org>

* BranchNode.py: New tree-based model for branches.
* Branch.py: Removed in favor of BranchNode.py.
* Activities/*.py: Replace the old Branch handling by the new branch nodes.
* Condition.py: New.
* Activities/*Choice.py, Storage/*.py: Use new condition class to replace
  old condition tuples. This allows comparisons to be made against attributes
  as well as (new) static values.
* tests/*: Vastly improved all tests. Added an xml/patterns/ directory that
  will hold one test for each workflow pattern.
* *.py: Lots of cleanups.
* Activities/AddInstance.py: New, in preparation for the "Multiple Instances
  without a Priori Run-Time Knowledge" pattern. This is yet untested because
  there is no trigger activity yet, so the class is not yet useful.

2007-08-15 Samuel Abels <http://debain.org>

* Branch.py: Path tracking within branches now works properly.
* Activities/Synchronization.py: Fixed several bugs with nested
  synchronizations.
* Activities/*Synchronization.py: Merged the structured synchronization with
  the unstructured one.
* *.py: Added support for persistence using Python's built-in pickle module.
* *.py: Lots of cleanups.

2007-08-13 Samuel Abels <http://debain.org>

* Activities/MultiInstance.py: Implement support for Multiple Instance patterns.
* Job.py (execute_all): Now works through the branches in an ordered and
  predictable way.

2007-08-05 Samuel Abels <http://debain.org>

* Trackable.py: Got rid of useless signal/event mechanism.
* Exception.py, Storage/*.py: Use the StorageException class.
* *.py: Replace many assertions by more meaningful and descriptive
  exceptions.
* *.py: Minor API documentation improvements.

2007-08-03 Samuel Abels <http://debain.org>

* Initial release 0.0.1
@@ -0,0 +1,3 @@
To install this package, run

  sudo python setup.py install --prefix /usr/local
@@ -0,0 +1,94 @@
Spiff Workflow
--------------
This library is part of the Spiff platform.

Spiff Workflow is a library implementing a framework for workflows.
It is based on http://www.workflowpatterns.com and implemented in pure Python.


Supported Workflow Patterns
---------------------------

Hint: The examples are located in tests/xml/spiff/.

Control-Flow Patterns:

 1. Sequence [control-flow/sequence.xml]
 2. Parallel Split [control-flow/parallel_split.xml]
 3. Synchronization [control-flow/synchronization.xml]
 4. Exclusive Choice [control-flow/exclusive_choice.xml]
 5. Simple Merge [control-flow/simple_merge.xml]
 6. Multi-Choice [control-flow/multi_choice.xml]
 7. Structured Synchronizing Merge [control-flow/structured_synchronizing_merge.xml]
 8. Multi-Merge [control-flow/multi_merge.xml]
 9. Structured Discriminator [control-flow/structured_discriminator.xml]
10. Arbitrary Cycles [control-flow/arbitrary_cycles.xml]
11. Implicit Termination [control-flow/implicit_termination.xml]
12. Multiple Instances without Synchronization [control-flow/multi_instance_without_synch.xml]
13. Multiple Instances with a Priori Design-Time Knowledge [control-flow/multi_instance_with_a_priori_design_time_knowledge.xml]
14. Multiple Instances with a Priori Run-Time Knowledge [control-flow/multi_instance_with_a_priori_run_time_knowledge.xml]
15. Multiple Instances without a Priori Run-Time Knowledge [control-flow/multi_instance_without_a_priori.xml]
16. Deferred Choice [control-flow/deferred_choice.xml]
17. Interleaved Parallel Routing [control-flow/interleaved_parallel_routing.xml]
18. Milestone [control-flow/milestone.xml]
19. Cancel Task [control-flow/cancel_task.xml]
20. Cancel Case [control-flow/cancel_case.xml]

22. Recursion [control-flow/recursion.xml]
23. Transient Trigger [control-flow/transient_trigger.xml]
24. Persistent Trigger [control-flow/persistent_trigger.xml]
25. Cancel Region [control-flow/cancel_region.xml]
26. Cancel Multiple Instance Task [control-flow/cancel_multi_instance_task.xml]
27. Complete Multiple Instance Task [control-flow/complete_multiple_instance_activity.xml]
28. Blocking Discriminator [control-flow/blocking_discriminator.xml]
29. Cancelling Discriminator [control-flow/cancelling_discriminator.xml]
30. Structured Partial Join [control-flow/structured_partial_join.xml]
31. Blocking Partial Join [control-flow/blocking_partial_join.xml]
32. Cancelling Partial Join [control-flow/cancelling_partial_join.xml]
33. Generalized AND-Join [control-flow/generalized_and_join.xml]
34. Static Partial Join for Multiple Instances [control-flow/static_partial_join_for_multi_instance.xml]
35. Cancelling Partial Join for Multiple Instances [control-flow/cancelling_partial_join_for_multi_instance.xml]
36. Dynamic Partial Join for Multiple Instances [control-flow/dynamic_partial_join_for_multi_instance.xml]
37. Acyclic Synchronizing Merge [control-flow/acyclic_synchronizing_merge.xml]
38. General Synchronizing Merge [control-flow/general_synchronizing_merge.xml]
39. Critical Section [control-flow/critical_section.xml]
40. Interleaved Routing [control-flow/interleaved_routing.xml]
41. Thread Merge [control-flow/thread_merge.xml]
42. Thread Split [control-flow/thread_split.xml]
43. Explicit Termination [control-flow/explicit_termination.xml]


Workflow Data Patterns:

 1. Task Data [data/task_data.xml]
 2. Block Data [data/block_data.xml]
 9. Task to Task [data/task_to_task.xml]
10. Block Task to Sub-Workflow Decomposition [data/block_to_subworkflow.xml]
11. Sub-Workflow Decomposition to Block Task [data/subworkflow_to_block.xml]


Contact
-------
Mailing List: http://groups.google.com/group/spiff-devel/


Dependencies
------------
(none)


Usage
-----
API documentation is embedded into the Spiff Workflow source code and
currently not yet available elsewhere. Other developer documentation has not
yet been written.

If you need more help, please drop by our mailing list. You might actually
make someone write the missing pieces of documentation.

##############################
from Workflow import *

wf = Workflow()
...
##############################
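Two of the control-flow patterns listed in this README, Parallel Split and Synchronization, can be sketched in a few lines of plain Python. This is an illustrative toy, not the SpiffWorkflow API; the class and branch names are invented for the example.

```python
# Toy illustration of Parallel Split + Synchronization (AND-join).
# A split activates several branches; the AND-join completes only
# once every branch has reported back. Not the SpiffWorkflow API.

class AndJoin:
    def __init__(self, branches):
        self.waiting_for = set(branches)
        self.completed = False

    def notify(self, branch):
        # Each completing branch reports in; the join fires once
        # all expected branches have arrived.
        self.waiting_for.discard(branch)
        if not self.waiting_for:
            self.completed = True

# Parallel split: one task activates two branches.
join = AndJoin(['branch_a', 'branch_b'])
join.notify('branch_a')   # first branch done, join still waits
print(join.completed)     # False
join.notify('branch_b')   # second branch done, join fires
print(join.completed)     # True
```

The real library generalizes this idea: its Join task also handles cancelling, blocking, and multiple-instance variants of the same wait-for-N behavior.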
@@ -0,0 +1,7 @@
* Write docs.
* Support XML export.
* Create a Gtk widget for designing the XML.
* As soon as it is possible to trigger an action twice without
  creating another branch (some kind of asynchronous notification,
  perhaps), make sure to test the generalized AND-join with that
  in xml/patterns/generalized_and_join.xml.
@@ -0,0 +1,46 @@
HANDBOOKS=en
IN_BASENAME=spiff_workflow
OUT_BASENAME=spiff_workflow

all: handbook apidocs

figures:
	# Generate GraphViz figures. Unfortunately, pdflatex does not work when
	# including these figures in PDF, tex, or PS format, so we are stuck with
	# png.
	for FILE in figures/*.dot; do \
		DESTFILE=`echo $$FILE | sed 's/.dot\$$//'`.png; \
		dot -Tpng -Gmargin=0 $$FILE -o $$DESTFILE; \
	done

#pdf: figures
pdf:
	# Generate a latex file that defines the version number.
	VERSION=`python ../setup.py --version`; \
	echo '% This file is automatically generated; do not edit' > version.tex; \
	echo "\\\newcommand{\\\productversion}{$$VERSION }" >> version.tex

	# Run each call of pdflatex twice, required to resolve references.
	for LOCALE in $(HANDBOOKS); do \
		FILE=$(IN_BASENAME).$$LOCALE; \
		pdflatex -halt-on-error $$FILE.tex && pdflatex $$FILE.tex; \
		rm $$FILE.aux $$FILE.log $$FILE.out $$FILE.toc; \
		[ ! -e handbook/$$LOCALE ] && mkdir -p handbook/$$LOCALE/; \
		mv $$FILE.pdf handbook/$$LOCALE/$(OUT_BASENAME).$$LOCALE.pdf; \
	done

handbook: pdf
	for LOCALE in $(HANDBOOKS); do \
		FILE=$(IN_BASENAME).$$LOCALE; \
		latex2html -nonavigation -toc_depth 5 -split 0 -html_version 4.0,unicode -mkdir -dir handbook/$$LOCALE/ $$FILE.tex; \
	done

apidocs:
	python mkapidoc.py

clean:
	rm -Rf api handbook
	rm -f *.aux *.log *.out *.pdf *.toc figures/*.png

publish: handbook apidocs
	scp -r handbook/* api/* kd10243@goliath.speedpartner.de:/home/kd10243/spiff.debain.org/static/docs/spiff_workflow/
@@ -0,0 +1,6 @@
\begin{tabular}{ll}
{\bf Google Groups:} & http://groups.google.com/group/spiff-devel/ \\
{\bf Bug tracker:}   & http://code.google.com/p/spiff/issues/list \\
{\bf Phone:}         & +49 176 611 33083 \\
{\bf Jabber:}        & knipknap@jabber.org
\end{tabular}
@@ -0,0 +1,32 @@
#!/usr/bin/env python
# Generates the API documentation.
import os, re, sys

doc_dir  = 'api'
doc_file = os.path.join(doc_dir, 'Spiff_Workflow.py')
files    = ['../src/SpiffWorkflow/Job.py',
            '../src/SpiffWorkflow/Tasks/Task.py',
            '../src/SpiffWorkflow/Tasks/Join.py'] # Order matters - can't resolve inheritance otherwise.
classes  = [os.path.splitext(os.path.basename(file))[0] for file in files]

# Concatenate the content of all files into one file.
if not os.path.exists(doc_dir):
    os.makedirs(doc_dir)
remove_re = re.compile(r'^from (' + '|'.join(classes) + r') * import .*')
fp_out = open(doc_file, 'w')
for file in files:
    fp_in = open(file, 'r')
    for line in fp_in:
        if not remove_re.match(line):
            fp_out.write(line)
    fp_in.close()
fp_out.close()

os.system('epydoc ' + ' '.join(['--html',
                                '--parse-only',
                                '--no-private',
                                '--no-source',
                                '--no-frames',
                                '--inheritance=grouped',
                                '-v',
                                '-o %s' % doc_dir, doc_file]))
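The regular expression in the script above drops intra-package import lines while the sources are concatenated, so that epydoc sees one self-contained module. A small self-contained demonstration of that filtering step (using `\s+` in place of the script's looser ` *` spacing; the sample lines are invented):

```python
import re

# The classes whose "from X import ..." lines must be dropped when the
# source files are merged into a single module for documentation.
classes = ['Job', 'Task', 'Join']
remove_re = re.compile(r'^from (' + '|'.join(classes) + r')\s+import .*')

sample = [
    'from Task import Task\n',   # intra-package import: dropped
    'import os\n',               # unrelated import: kept
    'class Join(Task):\n',       # ordinary code: kept
]
kept = [line for line in sample if not remove_re.match(line)]
print(kept)   # ['import os\n', 'class Join(Task):\n']
```

Dropping these lines is safe only because the files are concatenated in dependency order, so every stripped name is already defined earlier in the merged file.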
@@ -0,0 +1,65 @@
\input{spiff_workflow.tex} % Import common styles.
\fancyfoot[C]{Page \thepage}
\title{\productname\ Release \productversion\\
User Documentation\\
\vspace{5 mm}
\large Generic Access Lists for Python}
\author{Samuel Abels}

\begin{document}
\maketitle
\tableofcontents

\newpage
\section{Introduction}
\subsection{Why \productname?}

\product is a library for implementing workflows.

\subsection{Legal Information}

\product and this handbook are distributed under the terms and conditions
of the GNU GPL (General Public License) Version 2. You should have received
a copy of the GPL along with \product. If you did not, you may read it here:

\vspace{1em}
\url{http://www.gnu.org/licenses/gpl-2.0.txt}
\vspace{1em}

If this license does not meet your requirements you may contact us under
the points of contact listed in the following section. Please let us know
why you need a different license - perhaps we may work out a solution
that works for both of us.


\subsection{Contact Information \& Feedback}

If you spot any errors, or have ideas for improving \product or this
documentation, your suggestions are gladly accepted.
We offer the following contact options: \\

\input{contact.tex}

\newpage
\section{Quick Overview}
\subsection{Initialisation}

Before using \product it needs to be initialized.
The following example shows how this is done:

\begin{lstlisting}
from sqlalchemy import create_engine
from Guard import *
db = create_engine('mysql://user:pass@localhost/guard_name')
guard = DB(db)
guard.install()
\end{lstlisting}


\subsection{\label{intro:resources}Creating Resources}

In \product, ...

%\mygraph{workflow1}{A workflow}

\end{document}
@@ -0,0 +1,96 @@
\documentclass[a4paper,oneside,11pt]{scrartcl}
\usepackage[left=2cm,right=2cm,top=2.0cm,bottom=2.5cm,includeheadfoot]{geometry}
\usepackage[utf8]{inputenc} % Replace "utf8" by "latin1" if your editor sucks.
\usepackage{fancyhdr}       % Better header and footer support.
\usepackage{titlesec}       % Alternative section titles.
\usepackage{amssymb}        % Math support.
\usepackage{amsmath}        % Math declarations.

% GraphViz support
\usepackage{graphicx}
\usepackage[x11names, rgb]{xcolor}
\usepackage{tikz}           % Drawing shapes.
\usepackage{caption}        % Captions for graphviz figures.
\usetikzlibrary{snakes,arrows,shapes}
\newcommand{\mygraph}[2]{%
  \vspace{1em}
  \includegraphics[width=15cm]{figures/#1.png}
  \captionof{figure}{#2}
  \vspace{1em}
}

% Variables.
\input{version.tex}
\newcommand{\productname}{Spiff Workflow}
\newcommand{\product}{{\it \productname} }

% Make references clickable.
\usepackage[colorlinks,hyperindex]{hyperref}
\hypersetup{%
  pdftitle    = {\productname\ Version \productversion},
  pdfkeywords = {spiff workflow},
  pdfauthor   = {Samuel Abels},
  colorlinks  = true,
  %linkcolor  = blue,
}

% Initialize headers and footers.
\pagestyle{fancy} % Use fancyhdr to render page headers/footers.
\fancyhf{}        % Clear out old header/footer definition.

% Header
%\fancyhead[C]{\bfseries \productname}
\fancyhead[L]{\leftmark}
\fancyhead[R]{\MakeUppercase{\rightmark}}
\renewcommand{\headrulewidth}{0.5pt}

% Footer
\fancyfoot[C]{Page \thepage}
\renewcommand{\footrulewidth}{0.5pt}

% Enumerate using letters.
\renewcommand{\labelenumi}{\alph{enumi})}

% Set source code options.
\usepackage{listings}
\lstset{language=python}
\lstset{commentstyle=\textit}
\lstset{showstringspaces=false}
\lstset{aboveskip=.1in,belowskip=.1in,xleftmargin=2em,basewidth=5pt}

% Do not indent paragraphs.
\parindent=0em

% Preformatted, indented text.
\usepackage{verbatim}
\makeatletter
\newenvironment{indentverb}
  {\def\verbatim@processline{%
     \hspace*{2em}\the\verbatim@line\par}%
   \verbatim}
  {\endverbatim}
\makeatother

% Title
\title{\productname\ Release \productversion\\
User Documentation\\
\vspace{5 mm}
\large Generic Access Lists for Python}

% Hint boxes.
\usepackage{color}
\definecolor{nb}{gray}{.90}
\newcommand{\hint}[1]{
  \begin{center}
  \colorbox{nb}{
    \begin{tabular}{ll}
    \Large ! &
    \begin{minipage}{.92\linewidth}{
      \vspace{2mm}
      \sf #1
      \vspace{2mm}
    }\end{minipage}
    \end{tabular}
  }
  \end{center}
}
@@ -0,0 +1,2 @@
% This file is automatically generated; do not edit
\newcommand{\productversion}{0.1.0 }
@@ -0,0 +1,37 @@
from setuptools import setup, find_packages
from os.path import dirname, join
srcdir = join(dirname(__file__), 'src')
setup(name             = 'Spiff Workflow',
      version          = '0.1.0',
      description      = 'A workflow framework based on www.workflowpatterns.com',
      long_description = \
"""
Spiff Workflow is a library implementing workflows in pure Python.
It was designed to provide a clean API, and tries to be very easy to use.

You can find a list of supported workflow patterns in the `README file`_
included with the package.

WARNING! Use in a production environment is NOT RECOMMENDED at this time -
this release is meant for development only. Don't blame us if something breaks
because of this software!

.. _README file: http://spiff.googlecode.com/svn/trunk/libs/Workflow/README
""",
      author           = 'Samuel Abels',
      author_email     = 'cheeseshop.python.org@debain.org',
      license          = 'lGPLv2',
      package_dir      = {'': srcdir},
      packages         = [p for p in find_packages(srcdir)],
      requires         = ['sqlalchemy'],
      keywords         = 'spiff guard acl acls security authentication object storage',
      url              = 'http://code.google.com/p/spiff/',
      classifiers      = [
          'Development Status :: 3 - Alpha',
          'Intended Audience :: Developers',
          'License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)',
          'Programming Language :: Python',
          'Topic :: Other/Nonlisted Topic',
          'Topic :: Software Development :: Libraries',
          'Topic :: Software Development :: Libraries :: Python Modules'
      ])
@@ -0,0 +1,29 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
class WorkflowException(Exception):
    def __init__(self, sender, error):
        """
        Standard exception class.

        sender -- the task that threw the exception.
        error -- a string describing the error.
        """
        Exception.__init__(self, '%s: %s' % (sender.get_name(), error))
        self.sender = sender


class StorageException(Exception):
    pass
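A short sketch of how WorkflowException builds its message from the sender task. `DummyTask` is a stand-in invented for the example; any object with a `get_name()` method will do:

```python
class WorkflowException(Exception):
    # Same formatting behaviour as the class above: the message is
    # "<sender name>: <error>", and the sender is kept for inspection.
    def __init__(self, sender, error):
        Exception.__init__(self, '%s: %s' % (sender.get_name(), error))
        self.sender = sender

class DummyTask:
    # Stand-in for a real task; only get_name() is required.
    def get_name(self):
        return 'my_task'

exc = WorkflowException(DummyTask(), 'task has no outputs')
print(str(exc))   # my_task: task has no outputs
```

Keeping the sender on the exception lets callers report which task in the workflow failed, not just what went wrong.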
@ -0,0 +1,195 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
import Tasks
|
||||
from mutex import mutex
|
||||
from Trackable import Trackable
|
||||
from TaskInstance import TaskInstance
|
||||
from Exception import WorkflowException
|
||||
|
||||
class Job(Trackable):
|
||||
"""
|
||||
This class implements the engine that executes a workflow.
|
||||
It is essentially a facility for managing all branches.
|
||||
A Job is also the place that holds the attributes of a running workflow.
|
||||
"""
|
||||
|
||||
def __init__(self, workflow, **kwargs):
|
||||
"""
|
||||
Constructor.
|
||||
"""
|
||||
Trackable.__init__(self)
|
||||
assert workflow is not None
|
||||
self.workflow = workflow
|
||||
self.attributes = {}
|
||||
self.outer_job = kwargs.get('parent', self)
|
||||
self.locks = {}
|
||||
self.last_node = None
|
||||
self.task_tree = TaskInstance(self, Tasks.Task(workflow, 'Root'))
|
||||
self.success = True
|
||||
self.debug = False
|
||||
|
||||
# Prevent the root node from being executed.
|
||||
self.task_tree.state = TaskInstance.COMPLETED
|
||||
start = self.task_tree._add_child(workflow.start)
|
||||
|
||||
workflow.start._predict(start)
|
||||
if not kwargs.has_key('parent'):
|
||||
start.task._update_state(start)
|
||||
#start.dump()
|
||||
|
||||
|
||||
def is_completed(self):
|
||||
"""
|
||||
Returns True if the entire Job is completed, False otherwise.
|
||||
"""
|
||||
mask = TaskInstance.NOT_FINISHED_MASK
|
||||
iter = TaskInstance.Iterator(self.task_tree, mask)
|
||||
try:
|
||||
next = iter.next()
|
||||
except:
|
||||
# No waiting nodes found.
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
def _get_waiting_tasks(self):
|
||||
waiting = TaskInstance.Iterator(self.task_tree, TaskInstance.WAITING)
|
||||
return [w for w in waiting]
|
||||
|
||||
|
||||
def _instance_completed_notify(self, instance):
|
||||
if instance.get_name() == 'End':
|
||||
self.attributes.update(instance.get_attributes())
|
||||
# Update the state of every WAITING node.
|
||||
for node in self._get_waiting_tasks():
|
||||
node.task._update_state(node)
|
||||
if self.signal_subscribers('completed') == 0:
|
||||
# Since is_completed() is expensive it makes sense to bail
|
||||
# out if calling it is not necessary.
|
||||
return
|
||||
if self.is_completed():
|
||||
self.signal_emit('completed', self)
|
||||
|
||||
|
||||
def get_attribute(self, name, default = None):
|
||||
"""
|
||||
Returns the value of the attribute with the given name, or the given
|
||||
default value if the attribute does not exist.
|
||||
|
||||
name -- an attribute name (string)
|
||||
default -- the default value that is returned if the attribute does
|
||||
not exist.
|
||||
"""
|
||||
return self.attributes.get(name, default)
|
||||
|
||||
|
||||
def get_mutex(self, name):
|
||||
if not self.locks.has_key(name):
|
||||
self.locks[name] = mutex()
|
||||
return self.locks[name]
|
||||
|
||||
|
||||
def cancel(self, success = False):
|
||||
"""
|
||||
Cancels all open tasks in the job.
|
||||
|
||||
success -- whether the Job should be marked as successfully completed
|
||||
vs. unsuccessful
|
||||
"""
|
||||
self.success = success
|
||||
cancel = []
|
||||
mask = TaskInstance.NOT_FINISHED_MASK
|
||||
for node in TaskInstance.Iterator(self.task_tree, mask):
|
||||
cancel.append(node)
|
||||
for node in cancel:
|
||||
node.cancel()
|
||||
|
||||
|
||||
def get_task_from_name(self, name):
|
||||
return self.workflow.tasks[name]
|
||||
|
||||
|
||||
def get_tasks(self, state = TaskInstance.ANY_MASK):
|
||||
"""
|
||||
Returns a list of objects that each reference a task with the given
|
||||
state.
|
||||
"""
|
||||
return [t for t in TaskInstance.Iterator(self.task_tree, state)]
|
||||
|
||||
|
||||
def complete_task_from_id(self, node_id):
|
||||
"""
|
||||
Runs the given task.
|
||||
"""
|
||||
if node_id is None:
|
||||
raise WorkflowException(self.workflow, 'node_id is None')
|
||||
for node in self.task_tree:
|
||||
if node.id == node_id:
|
||||
return node.complete()
|
||||
msg = 'A node with the given node_id (%s) was not found' % node_id
|
||||
raise WorkflowException(self.workflow, msg)
|
||||
|
||||
|
||||
def complete_next(self, pick_up = True):
|
||||
"""
|
||||
Runs the next task.
|
||||
Returns True if completed, False otherwise.
|
||||
|
||||
pick_up -- when True, this method attempts to choose the next task
|
||||
not by searching beginning at the root, but by searching
|
||||
from the position at which the last call of complete_next()
|
||||
left off.
|
||||
"""
|
||||
# Try to pick up where we left off.
|
||||
blacklist = []
|
||||
if pick_up and self.last_node is not None:
|
||||
try:
|
||||
iter = TaskInstance.Iterator(self.last_node, TaskInstance.READY)
|
||||
next = iter.next()
|
||||
except:
|
||||
next = None
|
||||
self.last_node = None
|
||||
if next is not None:
|
||||
if next.complete():
|
||||
self.last_node = next
|
||||
return True
|
||||
blacklist.append(next)
|
||||
|
||||
# Walk through all waiting tasks.
|
||||
for node in TaskInstance.Iterator(self.task_tree, TaskInstance.READY):
|
||||
for blacklisted_node in blacklist:
|
||||
if node._is_descendant_of(blacklisted_node):
|
||||
continue
|
||||
if node.complete():
|
||||
self.last_node = node
|
||||
return True
|
||||
blacklist.append(node)
|
||||
return False
|
||||
|
||||
|
||||
def complete_all(self, pick_up = True):
|
||||
"""
|
||||
Runs all branches until completion.
|
||||
"""
|
||||
while self.complete_next(pick_up):
|
||||
pass
|
||||
|
||||
|
||||
def get_dump(self):
|
||||
return self.task_tree.get_dump()
|
||||
|
||||
|
||||
def dump(self):
|
||||
return self.task_tree.dump()
|
|
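`is_completed()` above works by scanning the task tree for any node whose state still matches `NOT_FINISHED_MASK`. A self-contained sketch of that bitmask idea (the `Node` class and the flag values are invented for illustration; they are not the library's actual `TaskInstance` constants):

```python
# Illustrative state flags; a node is "not finished" while it is
# still WAITING or READY.
WAITING, READY, COMPLETED, CANCELLED = 1, 2, 4, 8
NOT_FINISHED_MASK = WAITING | READY

class Node(object):
    def __init__(self, name, state, children=()):
        self.name, self.state, self.children = name, state, list(children)

    def __iter__(self):
        # Depth-first traversal of the task tree.
        yield self
        for child in self.children:
            for node in child:
                yield node

def is_completed(root, mask=NOT_FINISHED_MASK):
    # The job is complete when no node matches the not-finished mask.
    return not any(node.state & mask for node in root)

tree = Node('Root', COMPLETED, [Node('a', COMPLETED), Node('b', READY)])
print(is_completed(tree))  # prints "False": node 'b' is still READY
```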
@@ -0,0 +1,19 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
|
||||
class Attrib(object):
|
||||
def __init__(self, name):
|
||||
self.name = name
|
|
@@ -0,0 +1,45 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
|
||||
from Operator import Operator
|
||||
|
||||
class Equal(Operator):
|
||||
"""
|
||||
This class represents the EQUAL operator.
|
||||
"""
|
||||
def _matches(self, task):
|
||||
values = self._get_values(task)
|
||||
last = values[0]
|
||||
for value in values:
|
||||
if value != last:
|
||||
return False
|
||||
last = value
|
||||
return True
|
|
@@ -0,0 +1,46 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
|
||||
from Operator import Operator
|
||||
|
||||
class GreaterThan(Operator):
|
||||
"""
|
||||
This class represents the GREATER THAN operator.
|
||||
"""
|
||||
def __init__(self, left, right):
|
||||
"""
|
||||
Constructor.
|
||||
"""
|
||||
Operator.__init__(self, left, right)
|
||||
|
||||
def _matches(self, task):
|
||||
left, right = self._get_values(task)
|
||||
return int(left) > int(right)
|
|
@@ -0,0 +1,46 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
|
||||
from Operator import Operator
|
||||
|
||||
class LessThan(Operator):
|
||||
"""
|
||||
This class represents the LESS THAN operator.
|
||||
"""
|
||||
def __init__(self, left, right):
|
||||
"""
|
||||
Constructor.
|
||||
"""
|
||||
Operator.__init__(self, left, right)
|
||||
|
||||
def _matches(self, task):
|
||||
left, right = self._get_values(task)
|
||||
return int(left) < int(right)
|
|
@@ -0,0 +1,50 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
|
||||
import re
|
||||
from Operator import Operator
|
||||
|
||||
class Match(Operator):
|
||||
"""
|
||||
This class represents the regular expression match operator.
|
||||
"""
|
||||
def __init__(self, regex, *args):
|
||||
"""
|
||||
Constructor.
|
||||
"""
|
||||
Operator.__init__(self, *args)
|
||||
self.regex = re.compile(regex)
|
||||
|
||||
def _matches(self, task):
|
||||
for value in self._get_values(task):
|
||||
if not self.regex.search(value):
|
||||
return False
|
||||
return True
|
|
@@ -0,0 +1,45 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
|
||||
from Operator import Operator
|
||||
|
||||
class NotEqual(Operator):
|
||||
"""
|
||||
This class represents the NOT EQUAL operator.
|
||||
"""
|
||||
def _matches(self, task):
|
||||
values = self._get_values(task)
|
||||
last = values[0]
|
||||
for value in values:
|
||||
if value != last:
|
||||
return True
|
||||
last = value
|
||||
return False
|
|
@@ -0,0 +1,47 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
from Attrib import Attrib
|
||||
|
||||
def valueof(scope, op):
|
||||
if op is None:
|
||||
return None
|
||||
elif isinstance(op, Attrib):
|
||||
return scope.get_attribute(op.name)
|
||||
else:
|
||||
return op
|
||||
|
||||
class Operator(object):
|
||||
def __init__(self, *args):
|
||||
"""
|
||||
Constructor.
|
||||
|
||||
args -- the operands to compare; each may be an Attrib or a
|
||||
constant value.
|
||||
"""
|
||||
if len(args) == 0:
|
||||
raise TypeError("Too few arguments")
|
||||
self.args = args
|
||||
|
||||
|
||||
def _get_values(self, task):
|
||||
values = []
|
||||
for arg in self.args:
|
||||
values.append(unicode(valueof(task, arg)))
|
||||
return values
|
||||
|
||||
|
||||
def _matches(self, task):
|
||||
raise NotImplementedError("_matches() must be implemented by a subclass")
|
|
@@ -0,0 +1,11 @@
|
|||
from Attrib import Attrib
|
||||
from Equal import Equal
|
||||
from NotEqual import NotEqual
|
||||
from Match import Match
|
||||
from LessThan import LessThan
|
||||
from GreaterThan import GreaterThan
|
||||
from Operator import valueof
|
||||
|
||||
import inspect
|
||||
__all__ = [name for name, obj in locals().items()
|
||||
if not (name.startswith('_') or inspect.ismodule(obj))]
|
|
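The operator classes above all follow the same pattern: operands are either literals or `Attrib` references, `valueof()` resolves each against the current scope, and `_matches()` compares the stringified values. A self-contained sketch under those assumptions (`Scope` is a hypothetical stand-in for the task object that provides `get_attribute()`):

```python
import re

class Attrib(object):
    def __init__(self, name):
        self.name = name

def valueof(scope, op):
    # Resolve an operand: Attrib references are looked up in the scope,
    # anything else is used verbatim.
    if op is None:
        return None
    elif isinstance(op, Attrib):
        return scope.get_attribute(op.name)
    return op

class Scope(object):
    # Hypothetical stand-in for a task holding workflow attributes.
    def __init__(self, **attributes):
        self.attributes = attributes
    def get_attribute(self, name, default=None):
        return self.attributes.get(name, default)

class Operator(object):
    def __init__(self, *args):
        if len(args) == 0:
            raise TypeError("Too few arguments")
        self.args = args
    def _get_values(self, scope):
        return [str(valueof(scope, arg)) for arg in self.args]

class Equal(Operator):
    # True when every operand has the same (stringified) value.
    def _matches(self, scope):
        values = self._get_values(scope)
        return all(v == values[0] for v in values)

class Match(Operator):
    # True when the regex matches every operand.
    def __init__(self, regex, *args):
        Operator.__init__(self, *args)
        self.regex = re.compile(regex)
    def _matches(self, scope):
        return all(self.regex.search(v) for v in self._get_values(scope))

scope = Scope(status='done', retries='3')
print(Equal(Attrib('status'), 'done')._matches(scope))     # prints "True"
print(Match(r'^\d+$', Attrib('retries'))._matches(scope))  # prints "True"
```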
@@ -0,0 +1,232 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
import sys
|
||||
import os.path
|
||||
import SpiffWorkflow.Storage
|
||||
import sqlalchemy.orm as orm
|
||||
from sqlalchemy import *
|
||||
from WorkflowInfo import WorkflowInfo
|
||||
from JobInfo import JobInfo
|
||||
from TaskInfo import TaskInfo
|
||||
|
||||
class DB(object):
|
||||
attrib_type_int, attrib_type_bool, attrib_type_string = range(3)
|
||||
|
||||
def __init__(self, db):
|
||||
"""
|
||||
Instantiates a new DB object.
|
||||
|
||||
@type db: object
|
||||
@param db: An sqlalchemy database connection.
|
||||
@rtype: DB
|
||||
@return: The new instance.
|
||||
"""
|
||||
self.db = db
|
||||
self.db_metadata = MetaData(self.db)
|
||||
self.session_maker = orm.sessionmaker(bind = self.db,
|
||||
autoflush = True,
|
||||
transactional = True)
|
||||
self.session = self.session_maker()
|
||||
self.xml_parser = None
|
||||
self._table_prefix = 'workflow_'
|
||||
self._table_list = []
|
||||
self._table_map = {}
|
||||
self._initialized = False
|
||||
self.__update_table_names()
|
||||
|
||||
|
||||
def __add_table(self, table):
|
||||
"""
|
||||
Adds a new table to the internal table list.
|
||||
|
||||
@type table: Table
|
||||
@param table: An sqlalchemy table.
|
||||
"""
|
||||
pfx = self._table_prefix
|
||||
self._table_list.append(table)
|
||||
self._table_map[table.name[len(pfx):]] = table
|
||||
|
||||
|
||||
def __update_table_names(self):
|
||||
"""
|
||||
Adds all tables to the internal table list.
|
||||
"""
|
||||
pfx = self._table_prefix
|
||||
self._table_list = []
|
||||
|
||||
# Workflow table.
|
||||
table = Table(pfx + 'workflow',
|
||||
self.db_metadata,
|
||||
Column('id', Integer, primary_key = True),
|
||||
Column('handle', String(200), unique = True),
|
||||
Column('name', String(50)),
|
||||
Column('xml', Text),
|
||||
mysql_engine = 'INNODB')
|
||||
if not self._initialized:
|
||||
mapper = orm.mapper(WorkflowInfo, table)
|
||||
self.__add_table(table)
|
||||
|
||||
# Job table.
|
||||
table = Table(pfx + 'job',
|
||||
self.db_metadata,
|
||||
Column('id', Integer, primary_key = True),
|
||||
Column('workflow_id', Integer, index = True),
|
||||
Column('status', String(50)),
|
||||
Column('last_change', DateTime()),
|
||||
Column('instance', PickleType()),
|
||||
ForeignKeyConstraint(['workflow_id'],
|
||||
[pfx + 'workflow.id'],
|
||||
ondelete = 'CASCADE'),
|
||||
mysql_engine = 'INNODB')
|
||||
if not self._initialized:
|
||||
mapper = orm.mapper(JobInfo,
|
||||
table,
|
||||
properties = {
|
||||
'instance': orm.deferred(table.c.instance)
|
||||
})
|
||||
self.__add_table(table)
|
||||
|
||||
# Task table.
|
||||
table = Table(pfx + 'task',
|
||||
self.db_metadata,
|
||||
Column('id', Integer, primary_key = True),
|
||||
Column('job_id', Integer, index = True),
|
||||
Column('node_id', Integer, index = True),
|
||||
Column('name', String(230)),
|
||||
Column('status', Integer),
|
||||
Column('last_change', DateTime()),
|
||||
ForeignKeyConstraint(['job_id'],
|
||||
[pfx + 'job.id'],
|
||||
ondelete = 'CASCADE'),
|
||||
mysql_engine = 'INNODB')
|
||||
if not self._initialized:
|
||||
mapper = orm.mapper(TaskInfo, table)
|
||||
self.__add_table(table)
|
||||
|
||||
self._initialized = True
|
||||
|
||||
|
||||
def install(self):
|
||||
"""
|
||||
Installs (or upgrades) database tables.
|
||||
|
||||
@rtype: Boolean
|
||||
@return: True on success, False otherwise.
|
||||
"""
|
||||
for table in self._table_list:
|
||||
table.create(checkfirst = True)
|
||||
return True
|
||||
|
||||
|
||||
def uninstall(self):
|
||||
"""
|
||||
Drops all tables from the database. Use with care.
|
||||
|
||||
@rtype: Boolean
|
||||
@return: True on success, False otherwise.
|
||||
"""
|
||||
self.db_metadata.drop_all()
|
||||
return True
|
||||
|
||||
|
||||
def clear_database(self):
|
||||
"""
|
||||
Drops the content of any database table used by this library.
|
||||
Use with care.
|
||||
|
||||
Wipes out everything, including workflows, jobs and tasks.
|
||||
|
||||
@rtype: Boolean
|
||||
@return: True on success, False otherwise.
|
||||
"""
|
||||
delete = self._table_map['workflow'].delete()
|
||||
result = delete.execute()
|
||||
assert result is not None
|
||||
|
||||
delete = self._table_map['job'].delete()
|
||||
result = delete.execute()
|
||||
assert result is not None
|
||||
|
||||
delete = self._table_map['task'].delete()
|
||||
result = delete.execute()
|
||||
assert result is not None
|
||||
return True
|
||||
|
||||
|
||||
def debug(self, debug = True):
|
||||
"""
|
||||
Enable/disable debugging.
|
||||
|
||||
@type debug: Boolean
|
||||
@param debug: True to enable debugging.
|
||||
"""
|
||||
self.db.debug = debug
|
||||
|
||||
|
||||
def set_table_prefix(self, prefix):
|
||||
"""
|
||||
Define a table prefix. Default is 'workflow_'.
|
||||
|
||||
@type prefix: string
|
||||
@param prefix: The new prefix.
|
||||
"""
|
||||
self._table_prefix = prefix
|
||||
self.__update_table_names()
|
||||
|
||||
|
||||
def get_table_prefix(self):
|
||||
"""
|
||||
Returns the current database table prefix.
|
||||
|
||||
@rtype: string
|
||||
@return: The current prefix.
|
||||
"""
|
||||
return self._table_prefix
|
||||
|
||||
|
||||
def __get_xml_parser(self):
|
||||
if self.xml_parser is None:
|
||||
self.xml_parser = SpiffWorkflow.Storage.XmlParser()
|
||||
return self.xml_parser
|
||||
|
||||
|
||||
def get_workflow_info(self, **filter):
|
||||
return [r for r in self.session.query(WorkflowInfo).filter_by(**filter)]
|
||||
|
||||
|
||||
def get_job_info(self, **filter):
|
||||
return [r for r in self.session.query(JobInfo).filter_by(**filter)]
|
||||
|
||||
|
||||
def get_task_info(self, **filter):
|
||||
return [r for r in self.session.query(TaskInfo).filter_by(**filter)]
|
||||
|
||||
|
||||
def delete(self, object):
|
||||
if object is None:
|
||||
raise Exception('object argument is None')
|
||||
self.session.delete(object)
|
||||
self.session.commit()
|
||||
#self.session.flush()
|
||||
|
||||
|
||||
def save(self, object):
|
||||
if object is None:
|
||||
raise Exception('object argument is None')
|
||||
result = self.session.save_or_update(object)
|
||||
self.session.commit()
|
||||
#self.session.flush()
|
||||
return result
|
|
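The `DB` class above declares three prefixed tables with SQLAlchemy. The same schema can be sketched with the standard library's `sqlite3` (column types simplified and indexes omitted; this is an illustration of the layout, not the library's actual DDL):

```python
import sqlite3

# Sketch of the prefixed schema that DB.__update_table_names() declares:
# job references workflow, task references job, both ON DELETE CASCADE.
SCHEMA = """
CREATE TABLE {p}workflow (
    id     INTEGER PRIMARY KEY,
    handle TEXT UNIQUE,
    name   TEXT,
    xml    TEXT
);
CREATE TABLE {p}job (
    id          INTEGER PRIMARY KEY,
    workflow_id INTEGER REFERENCES {p}workflow(id) ON DELETE CASCADE,
    status      TEXT,
    last_change TIMESTAMP,
    instance    BLOB
);
CREATE TABLE {p}task (
    id          INTEGER PRIMARY KEY,
    job_id      INTEGER REFERENCES {p}job(id) ON DELETE CASCADE,
    node_id     INTEGER,
    name        TEXT,
    status      INTEGER,
    last_change TIMESTAMP
);
"""

def install(conn, prefix='workflow_'):
    # Mirrors DB.install(): create all tables under the given prefix.
    conn.executescript(SCHEMA.format(p=prefix))

conn = sqlite3.connect(':memory:')
install(conn)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # prints "['workflow_job', 'workflow_task', 'workflow_workflow']"
```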
@@ -0,0 +1,181 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
import sys
|
||||
import os.path
|
||||
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
|
||||
from sqlalchemy import *
|
||||
from Exceptions import WorkflowServerException
|
||||
from DB import DB
|
||||
from JobInfo import JobInfo
|
||||
from TaskInfo import TaskInfo
|
||||
from SpiffWorkflow.Storage import XmlReader
|
||||
from SpiffWorkflow.Job import Job
|
||||
|
||||
class Driver(object):
|
||||
"""
|
||||
A driver provides an API for storing and loading workflows, receiving
|
||||
information regarding running Jobs, and for driving the workflow
|
||||
execution.
|
||||
"""
|
||||
|
||||
def __init__(self, db):
|
||||
"""
|
||||
Instantiates a new Driver.
|
||||
|
||||
@type db: object
|
||||
@param db: An sqlalchemy database connection.
|
||||
@rtype: Driver
|
||||
@return: The new instance.
|
||||
"""
|
||||
self.db = DB(db)
|
||||
self.xmlreader = XmlReader()
|
||||
|
||||
|
||||
def install(self):
|
||||
"""
|
||||
Installs (or upgrades) the workflow server.
|
||||
|
||||
@rtype: Boolean
|
||||
@return: True on success, False otherwise.
|
||||
"""
|
||||
return self.db.install()
|
||||
|
||||
|
||||
def uninstall(self):
|
||||
"""
|
||||
Uninstall the workflow engine. This also permanently removes all data,
|
||||
history, and running jobs. Use with care.
|
||||
|
||||
@rtype: Boolean
|
||||
@return: True on success, False otherwise.
|
||||
"""
|
||||
return self.db.uninstall()
|
||||
|
||||
|
||||
def get_workflow_info(self, **filter):
|
||||
"""
|
||||
Returns the WorkflowInfo objects that match the given criteria.
|
||||
|
||||
@rtype: [WorkflowInfo]
|
||||
@return: A list of WorkflowInfo objects from the database.
|
||||
"""
|
||||
return self.db.get_workflow_info(**filter)
|
||||
|
||||
|
||||
def save_workflow_info(self, object):
|
||||
"""
|
||||
Store the WorkflowInfo in the database.
|
||||
|
||||
@rtype: Boolean
|
||||
@return: True on success, False otherwise.
|
||||
"""
|
||||
return self.db.save(object)
|
||||
|
||||
|
||||
def delete_workflow_info(self, object):
|
||||
"""
|
||||
Delete the WorkflowInfo from the database.
|
||||
|
||||
@rtype: Boolean
|
||||
@return: True on success, False otherwise.
|
||||
"""
|
||||
return self.db.delete(object)
|
||||
|
||||
|
||||
def create_job(self, workflow_info):
|
||||
"""
|
||||
Creates an instance of the given workflow.
|
||||
|
||||
@rtype: JobInfo
|
||||
@return: The JobInfo for the newly created workflow instance.
|
||||
"""
|
||||
if workflow_info is None:
|
||||
raise WorkflowServerException('workflow_info argument is None')
|
||||
if workflow_info.id is None:
|
||||
raise WorkflowServerException('workflow_info must be saved first')
|
||||
workflow = self.xmlreader.parse_string(workflow_info.xml)
|
||||
job = Job(workflow[0])
|
||||
job_info = JobInfo(workflow_info.id, job)
|
||||
self.__save_job_info(job_info)
|
||||
return job_info
|
||||
|
||||
|
||||
def get_job_info(self, **filter):
|
||||
"""
|
||||
Returns the workflow instances that match the given criteria.
|
||||
|
||||
@rtype: [JobInfo]
|
||||
@return: A list of JobInfo objects from the database.
|
||||
"""
|
||||
return self.db.get_job_info(**filter)
|
||||
|
||||
|
||||
def __save_job_info(self, job_info):
|
||||
self.db.save(job_info)
|
||||
for node in job_info.instance.task_tree:
|
||||
task_info = self.get_task_info(job_id = job_info.id,
|
||||
node_id = node.id)
|
||||
if len(task_info) == 1:
|
||||
task_info = task_info[0]
|
||||
elif len(task_info) == 0:
|
||||
task_info = TaskInfo(job_info.id, node)
|
||||
else:
|
||||
raise WorkflowServerException('More than one task found')
|
||||
task_info.status = node.state
|
||||
self.db.save(task_info)
|
||||
|
||||
|
||||
def delete_job_info(self, object):
|
||||
"""
|
||||
Delete the workflow instance from the database.
|
||||
|
||||
@rtype: Boolean
|
||||
@return: True on success, False otherwise.
|
||||
"""
|
||||
return self.db.delete(object)
|
||||
|
||||
|
||||
def get_task_info(self, **filter):
|
||||
"""
|
||||
Returns the tasks that match the given criteria.
|
||||
|
||||
@rtype: [TaskInfo]
|
||||
@return: A list of TaskInfo objects from the database.
|
||||
"""
|
||||
return self.db.get_task_info(**filter)
|
||||
|
||||
|
||||
def execute_task(self, task_info):
|
||||
if task_info is None:
|
||||
raise WorkflowServerException('task_info argument is None')
|
||||
if task_info.id is None:
|
||||
raise WorkflowServerException('task_info must be saved first')
|
||||
if task_info.status & task_info.WAITING == 0:
|
||||
raise WorkflowServerException('task is not in WAITING state')
|
||||
if task_info.job_id is None:
|
||||
raise WorkflowServerException('task_info must be associated with a job')
|
||||
job_info_list = self.get_job_info(id = task_info.job_id)
|
||||
if len(job_info_list) == 0:
|
||||
raise WorkflowServerException('Job not found')
|
||||
elif len(job_info_list) > 1:
|
||||
raise WorkflowServerException('Fatal error: More than one Job found')
|
||||
|
||||
job_info = job_info_list[0]
|
||||
if job_info.status is job_info.COMPLETED:
|
||||
raise WorkflowServerException('Job is already completed')
|
||||
result = job_info.instance.execute_task_from_id(task_info.node_id)
|
||||
self.__save_job_info(job_info)
|
||||
return result
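The guard in `execute_task` works because task states are bit flags: `status & WAITING == 0` is true exactly when the WAITING bit is unset, regardless of which other bits are set. A minimal sketch of that check (constants redefined here for illustration; the real values live on `TaskInfo`):

```python
# Bit-flag states; these mirror TaskInfo's constants but are
# redefined here purely for illustration.
WAITING   = 1
CANCELLED = 2
COMPLETED = 4

def assert_waiting(status):
    # Refuse execution unless the WAITING bit is set, like the
    # `task_info.status & task_info.WAITING == 0` guard above.
    if status & WAITING == 0:
        raise ValueError('task is not in WAITING state')

assert_waiting(WAITING)              # passes: WAITING bit set
assert_waiting(WAITING | COMPLETED)  # passes: WAITING bit still set
```

Because the states compose as a bitmask, a task can carry several flags at once and each guard tests only the bit it cares about.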

@ -0,0 +1,18 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

class WorkflowServerException(Exception):
    pass

@ -0,0 +1,35 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

class JobInfo(object):
    """
    This class represents an instance of a workflow.
    """
    RUNNING, \
    COMPLETED = range(2)

    def __init__(self, workflow_id = None, instance = None):
        """
        Constructor.
        """
        self.id          = None
        self.workflow_id = workflow_id
        self.status      = self.RUNNING
        self.last_change = None
        self.instance    = instance
        if instance is not None:
            if instance.is_completed():
                self.status = self.COMPLETED
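The `RUNNING, \ COMPLETED = range(2)` line is a compact Python idiom for defining consecutive integer constants by tuple unpacking. A short sketch of how it extends (the `FAILED` state is hypothetical, added only to show the pattern):

```python
# Unpacking range() assigns 0, 1, 2, ... to the names in order;
# adding a state only means extending the tuple and the range.
RUNNING, COMPLETED, FAILED = range(3)  # FAILED is illustrative only
```

In modern Python the same intent is usually expressed with the `enum` module, but plain integers keep these values trivial to store in a database column.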

@ -0,0 +1,40 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

class TaskInfo(object):
    """
    This class represents a task in an instance of a workflow.
    """
    WAITING   = 1
    CANCELLED = 2
    COMPLETED = 4
    LIKELY    = 8
    TRIGGERED = 16

    def __init__(self, job_id = None, node = None):
        """
        Constructor.
        """
        self.id          = None
        self.job_id      = job_id
        self.node_id     = None
        self.name        = None
        self.status      = None
        self.last_change = None
        if node is not None:
            self.node_id = node.id
            self.name    = node.task.name
            self.status  = node.state

@ -0,0 +1,34 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

class WorkflowInfo(object):
    """
    This class represents a workflow definition.
    """

    def __init__(self, handle, **kwargs):
        """
        Constructor.
        """
        assert not (kwargs.has_key('xml') and kwargs.has_key('file'))
        self.id     = None
        self.handle = handle
        self.name   = handle
        self.xml    = kwargs.get('xml', None)
        if kwargs.has_key('file'):
            file = open(kwargs.get('file'), 'r')
            self.xml = file.read()
            file.close()

@ -0,0 +1,9 @@
from DB           import DB
from Driver       import Driver
from WorkflowInfo import WorkflowInfo
from JobInfo      import JobInfo
from TaskInfo     import TaskInfo

import inspect
__all__ = [name for name, obj in locals().items()
           if not (name.startswith('_') or inspect.ismodule(obj))]

@ -0,0 +1,249 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import os, sys
import xml.dom.minidom as minidom
import SpiffWorkflow
import SpiffWorkflow.Tasks
import SpiffWorkflow.Operators
from SpiffWorkflow.Exception import StorageException

class OpenWfeXmlReader(object):
    """
    Parses OpenWFE XML into a workflow object.
    """

    def __init__(self):
        """
        Constructor.
        """
        self.task_tags = ('task',
                          'concurrence',
                          'if',
                          'sequence')
        self.op_map = {'equals':       SpiffWorkflow.Operators.Equal,
                       'not-equals':   SpiffWorkflow.Operators.NotEqual,
                       'less-than':    SpiffWorkflow.Operators.LessThan,
                       'greater-than': SpiffWorkflow.Operators.GreaterThan,
                       'matches':      SpiffWorkflow.Operators.Match}


    def _raise(self, error):
        raise StorageException('%s in XML file.' % error)


    def read_condition(self, node):
        """
        Reads the logical tag from the given node and returns a Condition
        object.

        node -- the xml node (xml.dom.minidom.Node)
        """
        term1 = node.getAttribute('field-value')
        op    = node.nodeName.lower()
        term2 = node.getAttribute('other-value')
        if not self.op_map.has_key(op):
            self._raise('Invalid operator')
        return self.op_map[op](SpiffWorkflow.Operators.Attrib(term1),
                               SpiffWorkflow.Operators.Attrib(term2))


    def read_if(self, workflow, start_node):
        """
        Reads the "if" statement from the given node.

        workflow   -- the workflow with which the statement is associated
        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        assert start_node.nodeName.lower() == 'if'
        name = start_node.getAttribute('name').lower()

        # Collect all information.
        match     = None
        nomatch   = None
        condition = None
        for node in start_node.childNodes:
            if node.nodeType != minidom.Node.ELEMENT_NODE:
                continue
            if node.nodeName.lower() in self.task_tags:
                if match is None:
                    match = self.read_task(workflow, node)
                elif nomatch is None:
                    nomatch = self.read_task(workflow, node)
                else:
                    assert False # Only two tasks in "if" allowed.
            elif node.nodeName.lower() in self.op_map:
                if condition is None:
                    condition = self.read_condition(node)
                else:
                    assert False # Multiple conditions not yet supported.
            else:
                print "Unknown type:", node.nodeName
                assert False # Unknown tag.

        # Model the if statement.
        assert condition is not None
        assert match     is not None
        choice = SpiffWorkflow.Tasks.ExclusiveChoice(workflow, name)
        end    = SpiffWorkflow.Tasks.Task(workflow, name + '_end')
        if nomatch is None:
            choice.connect(end)
        else:
            choice.connect(nomatch[0])
            nomatch[1].connect(end)
        choice.connect_if(condition, match[0])
        match[1].connect(end)

        return (choice, end)


    def read_sequence(self, workflow, start_node):
        """
        Reads the children of the given node in sequential order.
        Returns a tuple (start, end) that contains the stream of objects
        that model the behavior.

        workflow   -- the workflow with which the sequence is associated
        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        assert start_node.nodeName.lower() == 'sequence'
        name  = start_node.getAttribute('name').lower()
        first = None
        last  = None
        for node in start_node.childNodes:
            if node.nodeType != minidom.Node.ELEMENT_NODE:
                continue
            if node.nodeName.lower() in self.task_tags:
                (start, end) = self.read_task(workflow, node)
                if first is None:
                    first = start
                else:
                    last.connect(start)
                last = end
            else:
                print "Unknown type:", node.nodeName
                assert False # Unknown tag.
        return (first, last)


    def read_concurrence(self, workflow, start_node):
        """
        Reads the concurrence from the given node.

        workflow   -- the workflow with which the concurrence is associated
        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        assert start_node.nodeName.lower() == 'concurrence'
        name        = start_node.getAttribute('name').lower()
        multichoice = SpiffWorkflow.Tasks.MultiChoice(workflow, name)
        synchronize = SpiffWorkflow.Tasks.Join(workflow, name + '_end', name)
        for node in start_node.childNodes:
            if node.nodeType != minidom.Node.ELEMENT_NODE:
                continue
            if node.nodeName.lower() in self.task_tags:
                (start, end) = self.read_task(workflow, node)
                multichoice.connect_if(None, start)
                end.connect(synchronize)
            else:
                print "Unknown type:", node.nodeName
                assert False # Unknown tag.
        return (multichoice, synchronize)


    def read_task(self, workflow, start_node):
        """
        Reads the task from the given node and returns a tuple
        (start, end) that contains the stream of objects that model
        the behavior.

        workflow   -- the workflow with which the task is associated
        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        type = start_node.nodeName.lower()
        name = start_node.getAttribute('name').lower()
        assert type in self.task_tags

        if type == 'concurrence':
            return self.read_concurrence(workflow, start_node)
        elif type == 'if':
            return self.read_if(workflow, start_node)
        elif type == 'sequence':
            return self.read_sequence(workflow, start_node)
        elif type == 'task':
            task = SpiffWorkflow.Tasks.Task(workflow, name)
            return (task, task)
        else:
            print "Unknown type:", type
            assert False # Unknown tag.


    def read_workflow(self, start_node):
        """
        Reads the workflow from the given workflow node and returns a
        workflow object.

        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        name = start_node.getAttribute('name')
        assert name is not None
        workflow  = SpiffWorkflow.Workflow(name)
        last_task = workflow.start
        for node in start_node.childNodes:
            if node.nodeType != minidom.Node.ELEMENT_NODE:
                continue
            if node.nodeName == 'description':
                pass
            elif node.nodeName.lower() in self.task_tags:
                (start, end) = self.read_task(workflow, node)
                last_task.connect(start)
                last_task = end
            else:
                print "Unknown type:", node.nodeName
                assert False # Unknown tag.

        last_task.connect(SpiffWorkflow.Tasks.Task(workflow, 'End'))
        return workflow


    def read(self, xml):
        """
        Reads all workflows from the given XML structure and returns a
        list of workflow objects.

        xml -- the xml structure (xml.dom.minidom.Node)
        """
        workflows = []
        for node in xml.getElementsByTagName('process-definition'):
            workflows.append(self.read_workflow(node))
        return workflows


    def parse_string(self, string):
        """
        Reads the workflow XML from the given string and returns a list
        of workflow objects.

        string -- the XML document (string)
        """
        return self.read(minidom.parseString(string))


    def parse_file(self, filename):
        """
        Reads the workflow XML from the given file and returns a list
        of workflow objects.

        filename -- the name of the file (string)
        """
        return self.read(minidom.parse(filename))

@ -0,0 +1,367 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import os, re
import xml.dom.minidom as minidom
import SpiffWorkflow
import SpiffWorkflow.Tasks
import SpiffWorkflow.Operators
from SpiffWorkflow.Exception import StorageException

class XmlReader(object):
    """
    Parses XML into a workflow object.
    """

    def __init__(self):
        """
        Constructor.
        """
        self.read_tasks = {}

        # Create a list of tag names out of the task names.
        self.task_map = {}
        for name in dir(SpiffWorkflow.Tasks):
            if name.startswith('_'):
                continue
            module = SpiffWorkflow.Tasks.__dict__[name]
            name   = re.sub(r'(.)([A-Z])', r'\1-\2', name).lower()
            self.task_map[name] = module

        self.op_map = {'equals':       SpiffWorkflow.Operators.Equal,
                       'not-equals':   SpiffWorkflow.Operators.NotEqual,
                       'less-than':    SpiffWorkflow.Operators.LessThan,
                       'greater-than': SpiffWorkflow.Operators.GreaterThan,
                       'matches':      SpiffWorkflow.Operators.Match}


    def _raise(self, error):
        raise StorageException('%s in XML file.' % error)


    def _read_assign(self, workflow, start_node):
        """
        Reads a "pre-assign" or "post-assign" tag from the given node.

        start_node -- the xml node (xml.dom.minidom.Node)
        """
        name   = start_node.getAttribute('name')
        attrib = start_node.getAttribute('field')
        value  = start_node.getAttribute('value')
        kwargs = {}
        if name == '':
            self._raise('name attribute required')
        if attrib != '' and value != '':
            self._raise('Both field and value attributes found')
        elif attrib == '' and value == '':
            self._raise('field or value attribute required')
        elif value != '':
            kwargs['right'] = value
        else:
            kwargs['right_attribute'] = attrib
        return SpiffWorkflow.Tasks.Assign(name, **kwargs)


    def _read_property(self, workflow, start_node):
        """
        Reads a "property" or "define" tag from the given node.

        start_node -- the xml node (xml.dom.minidom.Node)
        """
        name  = start_node.getAttribute('name')
        value = start_node.getAttribute('value')
        return name, value


    def _read_assign_list(self, workflow, start_node):
        """
        Reads a list of assignments from the given node.

        workflow   -- the workflow
        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        # Collect all information.
        assignments = []
        for node in start_node.childNodes:
            if node.nodeType != minidom.Node.ELEMENT_NODE:
                continue
            if node.nodeName.lower() == 'assign':
                assignments.append(self._read_assign(workflow, node))
            else:
                self._raise('Unknown node: %s' % node.nodeName)
        return assignments


    def _read_logical(self, node):
        """
        Reads the logical tag from the given node and returns a Condition
        object.

        node -- the xml node (xml.dom.minidom.Node)
        """
        term1_attrib = node.getAttribute('left-field')
        term1_value  = node.getAttribute('left-value')
        op           = node.nodeName.lower()
        term2_attrib = node.getAttribute('right-field')
        term2_value  = node.getAttribute('right-value')
        if not self.op_map.has_key(op):
            self._raise('Invalid operator')
        if term1_attrib != '' and term1_value != '':
            self._raise('Both left-field and left-value attributes found')
        elif term1_attrib == '' and term1_value == '':
            self._raise('left-field or left-value attribute required')
        elif term1_value != '':
            left = term1_value
        else:
            left = SpiffWorkflow.Operators.Attrib(term1_attrib)
        if term2_attrib != '' and term2_value != '':
            self._raise('Both right-field and right-value attributes found')
        elif term2_attrib == '' and term2_value == '':
            self._raise('right-field or right-value attribute required')
        elif term2_value != '':
            right = term2_value
        else:
            right = SpiffWorkflow.Operators.Attrib(term2_attrib)
        return self.op_map[op](left, right)


    def _read_condition(self, workflow, start_node):
        """
        Reads the conditional statement from the given node.

        workflow   -- the workflow with which the statement is associated
        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        # Collect all information.
        condition = None
        task_name = None
        for node in start_node.childNodes:
            if node.nodeType != minidom.Node.ELEMENT_NODE:
                continue
            if node.nodeName.lower() == 'successor':
                if task_name is not None:
                    self._raise('Duplicate task name %s' % task_name)
                if node.firstChild is None:
                    self._raise('Successor tag without a task name')
                task_name = node.firstChild.nodeValue
            elif node.nodeName.lower() in self.op_map:
                if condition is not None:
                    self._raise('Multiple conditions are not yet supported')
                condition = self._read_logical(node)
            else:
                self._raise('Unknown node: %s' % node.nodeName)

        if condition is None:
            self._raise('Missing condition in conditional statement')
        if task_name is None:
            self._raise('A %s has no task specified' % start_node.nodeName)
        return (condition, task_name)


    def read_task(self, workflow, start_node):
        """
        Reads the task from the given node and stores it, together with
        the names of its successors, for later connection.

        workflow   -- the workflow with which the task is associated
        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        # Extract attributes from the node.
        nodetype        = start_node.nodeName.lower()
        name            = start_node.getAttribute('name').lower()
        context         = start_node.getAttribute('context').lower()
        mutex           = start_node.getAttribute('mutex').lower()
        cancel          = start_node.getAttribute('cancel').lower()
        success         = start_node.getAttribute('success').lower()
        times           = start_node.getAttribute('times').lower()
        times_field     = start_node.getAttribute('times-field').lower()
        threshold       = start_node.getAttribute('threshold').lower()
        threshold_field = start_node.getAttribute('threshold-field').lower()
        file            = start_node.getAttribute('file').lower()
        file_field      = start_node.getAttribute('file-field').lower()
        kwargs          = {'lock':        [],
                           'properties':  {},
                           'defines':     {},
                           'pre_assign':  [],
                           'post_assign': []}
        if not self.task_map.has_key(nodetype):
            self._raise('Invalid task type "%s"' % nodetype)
        if nodetype == 'start-task':
            name = 'start'
        if name == '':
            self._raise('Invalid task name "%s"' % name)
        if self.read_tasks.has_key(name):
            self._raise('Duplicate task name "%s"' % name)
        if cancel != '' and cancel != u'0':
            kwargs['cancel'] = True
        if success != '' and success != u'0':
            kwargs['success'] = True
        if times != '':
            kwargs['times'] = int(times)
        if times_field != '':
            kwargs['times'] = SpiffWorkflow.Operators.Attrib(times_field)
        if threshold != '':
            kwargs['threshold'] = int(threshold)
        if threshold_field != '':
            kwargs['threshold'] = SpiffWorkflow.Operators.Attrib(threshold_field)
        if file != '':
            kwargs['file'] = file
        if file_field != '':
            kwargs['file'] = SpiffWorkflow.Operators.Attrib(file_field)
        if nodetype == 'choose':
            kwargs['choice'] = []
        if nodetype == 'trigger':
            context = [context]
        if mutex != '':
            context = mutex

        # Walk through the children of the node.
        successors = []
        for node in start_node.childNodes:
            if node.nodeType != minidom.Node.ELEMENT_NODE:
                continue
            if node.nodeName == 'description':
                kwargs['description'] = node.firstChild.nodeValue
            elif node.nodeName == 'successor' \
              or node.nodeName == 'default-successor':
                if node.firstChild is None:
                    self._raise('Empty %s tag' % node.nodeName)
                successors.append((None, node.firstChild.nodeValue))
            elif node.nodeName == 'conditional-successor':
                successors.append(self._read_condition(workflow, node))
            elif node.nodeName == 'define':
                key, value = self._read_property(workflow, node)
                kwargs['defines'][key] = value
            elif node.nodeName == 'property':
                key, value = self._read_property(workflow, node)
                kwargs['properties'][key] = value
            elif node.nodeName == 'pre-assign':
                kwargs['pre_assign'].append(self._read_assign(workflow, node))
            elif node.nodeName == 'post-assign':
                kwargs['post_assign'].append(self._read_assign(workflow, node))
            elif node.nodeName == 'in':
                kwargs['in_assign'] = self._read_assign_list(workflow, node)
            elif node.nodeName == 'out':
                kwargs['out_assign'] = self._read_assign_list(workflow, node)
            elif node.nodeName == 'cancel':
                if node.firstChild is None:
                    self._raise('Empty %s tag' % node.nodeName)
                if context == '':
                    context = []
                elif type(context) != type([]):
                    context = [context]
                context.append(node.firstChild.nodeValue)
            elif node.nodeName == 'lock':
                if node.firstChild is None:
                    self._raise('Empty %s tag' % node.nodeName)
                kwargs['lock'].append(node.firstChild.nodeValue)
            elif node.nodeName == 'pick':
                if node.firstChild is None:
                    self._raise('Empty %s tag' % node.nodeName)
                kwargs['choice'].append(node.firstChild.nodeValue)
            else:
                self._raise('Unknown node: %s' % node.nodeName)

        # Create a new instance of the task.
        module = self.task_map[nodetype]
        if nodetype == 'start-task':
            task = module(workflow, **kwargs)
        elif nodetype == 'multi-instance' or nodetype == 'thread-split':
            if times == '' and times_field == '':
                self._raise('Missing "times" or "times-field" in "%s"' % name)
            elif times != '' and times_field != '':
                self._raise('Both "times" and "times-field" in "%s"' % name)
            task = module(workflow, name, **kwargs)
        elif context == '':
            task = module(workflow, name, **kwargs)
        else:
            task = module(workflow, name, context, **kwargs)

        self.read_tasks[name] = (task, successors)


    def _read_workflow(self, start_node, filename = None):
        """
        Reads the workflow from the given workflow node and returns a
        workflow object.

        start_node -- the xml structure (xml.dom.minidom.Node)
        """
        name = start_node.getAttribute('name')
        if name == '':
            self._raise('%s without a name attribute' % start_node.nodeName)

        # Read all tasks and create a list of successors.
        workflow        = SpiffWorkflow.Workflow(name, filename)
        self.read_tasks = {'end': (SpiffWorkflow.Tasks.Task(workflow, 'End'), [])}
        for node in start_node.childNodes:
            if node.nodeType != minidom.Node.ELEMENT_NODE:
                continue
            if node.nodeName == 'description':
                workflow.description = node.firstChild.nodeValue
            elif self.task_map.has_key(node.nodeName.lower()):
                self.read_task(workflow, node)
            else:
                self._raise('Unknown node: %s' % node.nodeName)

        # Replace the default start-task of the workflow.
        workflow.start = self.read_tasks['start'][0]

        # Connect all tasks.
        for name in self.read_tasks:
            task, successors = self.read_tasks[name]
            for condition, successor_name in successors:
                if not self.read_tasks.has_key(successor_name):
                    self._raise('Unknown successor: "%s"' % successor_name)
                successor, foo = self.read_tasks[successor_name]
                if condition is None:
                    task.connect(successor)
                else:
                    task.connect_if(condition, successor)
        return workflow


    def read(self, xml, filename = None):
        """
        Reads all workflows from the given XML structure and returns a
        list of workflow objects.

        xml -- the xml structure (xml.dom.minidom.Node)
        """
        workflows = []
        for node in xml.getElementsByTagName('process-definition'):
            workflows.append(self._read_workflow(node, filename))
        return workflows


    def parse_string(self, string):
        """
        Reads the workflow XML from the given string and returns a list
        of workflow objects.

        string -- the XML document (string)
        """
        return self.read(minidom.parseString(string))


    def parse_file(self, filename):
        """
        Reads the workflow XML from the given file and returns a list
        of workflow objects.

        filename -- the name of the file (string)
        """
        return self.read(minidom.parse(filename), filename)
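The `task_map` construction above derives an XML tag name from each task class name with a single regex: a hyphen is inserted before every interior capital letter, then the result is lowercased. A standalone sketch of that conversion:

```python
import re

def tag_name(class_name):
    # Insert a hyphen between any character and a following capital,
    # then lowercase: 'MultiInstance' -> 'multi-instance'.
    return re.sub(r'(.)([A-Z])', r'\1-\2', class_name).lower()

print(tag_name('MultiInstance'))  # multi-instance
print(tag_name('Task'))           # task
```

A leading capital is untouched because the pattern requires a preceding character, so single-word class names map cleanly to single-word tags.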

@ -0,0 +1,6 @@
from OpenWfeXmlReader import OpenWfeXmlReader
from XmlReader        import XmlReader

import inspect
__all__ = [name for name, obj in locals().items()
           if not (name.startswith('_') or inspect.ismodule(obj))]
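Both package `__init__` files compute `__all__` the same way: take every name bound in the module, and keep it unless it is private (leading underscore) or refers to an imported module. A sketch of that filter over a hypothetical namespace dict:

```python
import inspect

def public_names(namespace):
    # Mirror the __all__ computation above: drop private names and
    # anything that is itself a module (e.g. the `inspect` import).
    return sorted(name for name, obj in namespace.items()
                  if not (name.startswith('_') or inspect.ismodule(obj)))

class XmlReader(object):  # stand-in for the class the package re-exports
    pass

namespace = {'XmlReader': XmlReader, 'inspect': inspect, '_hidden': 1}
print(public_names(namespace))  # ['XmlReader']
```

This keeps `from package import *` limited to the re-exported classes without maintaining the list by hand.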
|
|
@ -0,0 +1,553 @@
|
|||
# Copyright (C) 2007 Samuel Abels
|
||||
#
|
||||
# This library is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU Lesser General Public
|
||||
# License as published by the Free Software Foundation; either
|
||||
# version 2.1 of the License, or (at your option) any later version.
|
||||
#
|
||||
# This library is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# Lesser General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU Lesser General Public
|
||||
# License along with this library; if not, write to the Free Software
|
||||
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
|
||||
import time
|
||||
from Exception import WorkflowException
|
||||
|
||||
class TaskInstance(object):
|
||||
"""
|
||||
This class implements a node for composing a tree that represents the
|
||||
taken/not yet taken path within the workflow.
|
||||
"""
|
||||
FUTURE = 1
|
||||
LIKELY = 2
|
||||
MAYBE = 4
|
||||
WAITING = 8
|
||||
READY = 16
|
||||
CANCELLED = 32
|
||||
COMPLETED = 64
|
||||
TRIGGERED = 128
|
||||
|
||||
FINISHED_MASK = CANCELLED | COMPLETED
|
||||
DEFINITE_MASK = FUTURE | WAITING | READY | FINISHED_MASK
|
||||
PREDICTED_MASK = FUTURE | LIKELY | MAYBE
|
||||
NOT_FINISHED_MASK = PREDICTED_MASK | WAITING | READY
|
||||
ANY_MASK = FINISHED_MASK | NOT_FINISHED_MASK
|
||||
|
||||
state_names = {FUTURE: 'FUTURE',
|
||||
WAITING: 'WAITING',
|
||||
READY: 'READY',
|
||||
CANCELLED: 'CANCELLED',
|
||||
COMPLETED: 'COMPLETED',
|
||||
LIKELY: 'LIKELY',
|
||||
MAYBE: 'MAYBE',
|
||||
TRIGGERED: 'TRIGGERED'}
|
||||
|
||||
    class Iterator(object):
        """
        This is a tree iterator that supports filtering such that a client
        may walk through all nodes that have a specific state.
        """
        def __init__(self, current, filter = None):
            """
            Constructor.
            """
            self.filter = filter
            self.path = [current]


        def __iter__(self):
            return self


        def _next(self):
            # Make sure that the end is not yet reached.
            if len(self.path) == 0:
                raise StopIteration()

            # If the current node has children, the first child is the next item.
            # If the current node is LIKELY, and predicted nodes are not
            # specifically searched, we can ignore the children, because
            # predicted nodes should only have predicted children.
            current = self.path[-1]
            ignore_node = False
            if self.filter is not None:
                search_predicted = self.filter & TaskInstance.LIKELY != 0
                is_predicted = current.state & TaskInstance.LIKELY != 0
                ignore_node = is_predicted and not search_predicted
            if len(current.children) > 0 and not ignore_node:
                self.path.append(current.children[0])
                if self.filter is not None and current.state & self.filter == 0:
                    return None
                return current

            # Ending up here, this node has no children. Crop the path until we
            # reach a node that has unvisited children, or until we hit the end.
            while True:
                old_child = self.path.pop(-1)
                if len(self.path) == 0:
                    break

                # If this node has a sibling, choose it.
                parent = self.path[-1]
                pos = parent.children.index(old_child)
                if len(parent.children) > pos + 1:
                    self.path.append(parent.children[pos + 1])
                    break
            if self.filter is not None and current.state & self.filter == 0:
                return None
            return current


        def next(self):
            # By using this loop we avoid an (expensive) recursive call.
            while True:
                next = self._next()
                if next is not None:
                    return next

    # Pool for assigning a unique id to every new TaskInstance.
    id_pool = 0
    thread_id_pool = 0

    def __init__(self, job, task, parent = None):
        """
        Constructor.
        """
        assert job is not None
        assert task is not None
        self.__class__.id_pool += 1
        self.job = job
        self.parent = parent
        self.children = []
        self.state = TaskInstance.FUTURE
        self.task = task
        self.id = self.__class__.id_pool
        self.thread_id = self.__class__.thread_id_pool
        self.last_state_change = time.time()
        self.attributes = {}
        self.internal_attributes = {}
        if parent is not None:
            self.parent._child_added_notify(self)

    def __iter__(self):
        return TaskInstance.Iterator(self)


    def __setstate__(self, dict):
        self.__dict__.update(dict)
        # If unpickled in the same Python process in which a workflow
        # (TaskInstance) is built through the API, we need to make sure
        # that there will not be any ID collisions.
        if dict['id'] >= self.__class__.id_pool:
            self.__class__.id_pool = dict['id']
        if dict['thread_id'] >= self.__class__.thread_id_pool:
            self.__class__.thread_id_pool = dict['thread_id']

    def _get_root(self):
        """
        Returns the top level parent.
        """
        if self.parent is None:
            return self
        return self.parent._get_root()


    def _get_depth(self):
        depth = 0
        node = self.parent
        while node is not None:
            depth += 1
            node = node.parent
        return depth

    def _child_added_notify(self, child):
        """
        Called by another TaskInstance to let us know that a child was added.
        """
        assert child is not None
        self.children.append(child)


    def _drop_children(self):
        drop = []
        for child in self.children:
            if not child._is_finished():
                drop.append(child)
            else:
                child._drop_children()
        for node in drop:
            self.children.remove(node)

    def _set_state(self, state):
        self.state = state
        self.last_state_change = time.time()


    def _has_state(self, state):
        """
        Returns True if the TaskInstance has the given state flag set.
        """
        return (self.state & state) != 0


    def _is_finished(self):
        return self.state & self.FINISHED_MASK != 0


    def _is_predicted(self):
        return self.state & self.PREDICTED_MASK != 0


    def _is_definite(self):
        return self.state & self.DEFINITE_MASK != 0

    def _add_child(self, task, state = FUTURE):
        """
        Adds a new child node and assigns the given task to the new node.

        task -- the task that is assigned to the new node.
        state -- the initial node state
        """
        if task is None:
            raise WorkflowException(self, '_add_child() requires a task.')
        if self._is_predicted() and state & self.PREDICTED_MASK == 0:
            msg = 'Attempt to add non-predicted child to predicted node'
            raise WorkflowException(self, msg)
        node = TaskInstance(self.job, task, self)
        node.thread_id = self.thread_id
        if state == self.READY:
            node._ready()
        else:
            node.state = state
        return node

    def _assign_new_thread_id(self, recursive = True):
        """
        Assigns a new thread id to the node.
        Returns the new id.
        """
        self.__class__.thread_id_pool += 1
        self.thread_id = self.__class__.thread_id_pool
        if not recursive:
            return self.thread_id
        for node in self:
            node.thread_id = self.thread_id
        return self.thread_id

    def _update_children(self, tasks, state = None):
        """
        This method adds one child for each given task, unless that
        child already exists.
        The state of COMPLETED tasks is never changed.

        If this method is passed a state:
          - The state of TRIGGERED tasks is not changed.
          - The state for all children is set to the given value.

        If this method is not passed a state:
          The state for all children is updated by calling the child's
          _update_state() method.

        If the node currently has a child that is not given in the tasks,
        the child is removed.
        It is an error if the node has a non-LIKELY child that is
        not given in the tasks.

        tasks -- the list of tasks that may become children.
        state -- the state for newly added children
        """
        if tasks is None:
            raise WorkflowException(self, '"tasks" argument is None.')
        if type(tasks) != type([]):
            tasks = [tasks]

        # Create a list of all children that are no longer needed, and
        # set the state of all others.
        add = tasks[:]
        remove = []
        for child in self.children:
            # Must not be TRIGGERED or COMPLETED.
            if child._has_state(TaskInstance.TRIGGERED):
                if state is None:
                    child.task._update_state(child)
                continue
            if child._is_finished():
                add.remove(child.task)
                continue

            # Check whether the item needs to be added or removed.
            if child.task not in add:
                if not self._is_definite():
                    msg = 'Attempt to remove non-predicted %s' % child.get_name()
                    raise WorkflowException(self, msg)
                remove.append(child)
                continue
            add.remove(child.task)

            # Update the state.
            if state is not None:
                child.state = state
            else:
                child.task._update_state(child)

        # Remove all children that are no longer specified.
        for child in remove:
            self.children.remove(child)

        # Add a new child for each of the remaining tasks.
        for task in add:
            if task.cancelled:
                continue
            if state is not None:
                self._add_child(task, state)
            else:
                node = self._add_child(task, self.LIKELY)
                task._update_state(node)

    def _set_likely_task(self, tasks):
        if type(tasks) != type([]):
            tasks = [tasks]
        for task in tasks:
            for child in self.children:
                if child.task != task:
                    continue
                if child._is_definite():
                    continue
                child._set_state(self.LIKELY)
                return

    def _is_descendant_of(self, parent):
        """
        Returns True if parent is in the list of ancestors, returns False
        otherwise.

        parent -- the parent that is searched in the ancestors.
        """
        if self.parent is None:
            return False
        if self.parent == parent:
            return True
        return self.parent._is_descendant_of(parent)


    def _find_child_of(self, parent_task):
        """
        Returns the ancestor that has a TaskInstance with the given Task
        as a parent.
        If no such ancestor was found, the root node is returned.

        parent_task -- the wanted parent Task
        """
        if self.parent is None:
            return self
        if self.parent.task == parent_task:
            return self
        return self.parent._find_child_of(parent_task)

    def _find_any(self, task):
        """
        Returns any descendants that have the given task assigned.

        task -- the wanted task
        """
        instances = []
        if self.task == task:
            instances.append(self)
        for node in self:
            if node.task != task:
                continue
            instances.append(node)
        return instances

    def _find_ancestor(self, task):
        """
        Returns the ancestor that has the given task assigned.
        If no such ancestor was found, the root node is returned.

        task -- the wanted task
        """
        if self.parent is None:
            return self
        if self.parent.task == task:
            return self.parent
        return self.parent._find_ancestor(task)

    def _find_ancestor_from_name(self, name):
        """
        Returns the ancestor that has a task with the given name assigned.
        Returns None if no such ancestor was found.

        name -- the name of the wanted task (string)
        """
        if self.parent is None:
            return None
        if self.parent.get_name() == name:
            return self.parent
        return self.parent._find_ancestor_from_name(name)

    def _ready(self):
        """
        Marks the node as ready for execution.
        """
        if self.state & self.COMPLETED != 0:
            return
        if self.state & self.CANCELLED != 0:
            return
        self._set_state(self.READY | (self.state & self.TRIGGERED))
        return self.task._on_ready(self)

    def get_name(self):
        return str(self.task.name)


    def get_description(self):
        return str(self.task.description)


    def get_state(self):
        """
        Returns this TaskInstance's state.
        """
        return self.state


    def get_state_name(self):
        """
        Returns a textual representation of this TaskInstance's state.
        """
        state_name = []
        for key, name in self.state_names.iteritems():
            if self.state & key != 0:
                state_name.append(name)
        return '|'.join(state_name)

    def get_property(self, name, default = None):
        """
        Returns the value of the property with the given name, or the given
        default value if the property does not exist.

        name -- a property name (string)
        default -- the default value that is returned if the property does
                   not exist.
        """
        return self.task.get_property(name, default)


    def get_properties(self):
        """
        Returns a dictionary containing all properties.
        """
        return self.task.properties

    def _set_internal_attribute(self, **kwargs):
        """
        Defines the given attribute/value pairs.
        """
        self.internal_attributes.update(kwargs)


    def _get_internal_attribute(self, name, default = None):
        return self.internal_attributes.get(name, default)


    def set_attribute(self, **kwargs):
        """
        Defines the given attribute/value pairs.
        """
        self.attributes.update(kwargs)

    def _inherit_attributes(self):
        """
        Inherits the attributes from the parent.
        """
        self.set_attribute(**self.parent.attributes)


    def get_attribute(self, name, default = None):
        """
        Returns the value of the attribute with the given name, or the given
        default value if the attribute does not exist.

        name -- an attribute name (string)
        default -- the default value that is returned if the attribute does
                   not exist.
        """
        return self.attributes.get(name, default)


    def get_attributes(self):
        return self.attributes

    def cancel(self):
        """
        Cancels the item if it was not yet completed, and removes
        any children that are LIKELY.
        """
        if self._is_finished():
            for child in self.children:
                child.cancel()
            return
        self._set_state(self.CANCELLED | (self.state & self.TRIGGERED))
        self._drop_children()
        return self.task._on_cancel(self)

    def complete(self):
        """
        Called by the associated task to let us know that its state
        has changed (e.g. from FUTURE to COMPLETED.)
        """
        self._set_state(self.COMPLETED | (self.state & self.TRIGGERED))
        return self.task._on_complete(self)


    def trigger(self, *args):
        """
        Passes the given arguments on to the trigger handler of the
        associated task.
        """
        self.task._on_trigger(self, *args)

    def get_dump(self, indent = 0, recursive = True):
        """
        Returns the subtree as a string for debugging.
        """
        dbg = (' ' * indent * 2)
        dbg += '%s/' % self.id
        dbg += '%s:' % self.thread_id
        dbg += ' TaskInstance of %s' % self.get_name()
        dbg += ' State: %s' % self.get_state_name()
        dbg += ' Children: %s' % len(self.children)
        if recursive:
            for child in self.children:
                dbg += '\n' + child.get_dump(indent + 1)
        return dbg


    def dump(self, indent = 0):
        """
        Prints the subtree as a string for debugging.
        """
        print self.get_dump()
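The class above leans on a bitmask state scheme: every state is a distinct power of two, composite masks are bitwise ORs of flags, and membership is a single bitwise AND. The sketch below reproduces that idea in isolation; it is a standalone illustration with the constant values copied from above, not SpiffWorkflow's API:

```python
# Standalone sketch of TaskInstance's flag scheme: each state is a power of
# two, so a composite mask is a bitwise OR and a membership test is one AND.
FUTURE, LIKELY, MAYBE, WAITING, READY = 1, 2, 4, 8, 16
CANCELLED, COMPLETED, TRIGGERED = 32, 64, 128

FINISHED_MASK = CANCELLED | COMPLETED
PREDICTED_MASK = FUTURE | LIKELY | MAYBE

def has_state(state, mask):
    # True if any flag of `mask` is set in `state`.
    return (state & mask) != 0

# A node can carry COMPLETED and TRIGGERED at the same time; the TRIGGERED
# bit survives state changes such as _set_state(COMPLETED | TRIGGERED).
state = COMPLETED | TRIGGERED
assert has_state(state, FINISHED_MASK)
assert not has_state(state, PREDICTED_MASK)
```

This is why `_is_finished()` and friends can be one-liners: a single AND against a precomputed mask covers several states at once.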
@ -0,0 +1,58 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task

class AcquireMutex(Task):
    """
    This class implements a task that acquires a mutex (lock), protecting
    a section of the workflow from being accessed by other sections.
    If more than one input is connected, the task performs an implicit
    multi merge.
    If more than one output is connected, the task performs an implicit
    parallel split.
    """

    def __init__(self, parent, name, mutex, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the task (string)
        mutex -- the mutex that should be acquired
        """
        assert mutex is not None
        Task.__init__(self, parent, name, **kwargs)
        self.mutex = mutex


    def _update_state_hook(self, instance):
        mutex = instance.job.get_mutex(self.mutex)
        if mutex.testandset():
            return True
        instance._set_state(TaskInstance.WAITING)
        return False


    def _on_complete_hook(self, task_instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.

        task_instance -- the task_instance in which this method is executed
        """
        return Task._on_complete_hook(self, task_instance)
@ -0,0 +1,61 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.Exception import WorkflowException
from Task import Task

class CancelJob(Task):
    """
    This class implements a trigger that cancels the complete job
    (the entire workflow).
    If more than one input is connected, the task performs an implicit
    multi merge.
    """

    def __init__(self, parent, name, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the task (string)
        kwargs -- may contain the following keys:
                  lock -- a list of locks that is acquired on entry of
                          execute() and released on leave of execute().
                  pre_assign -- a list of attribute name/value pairs
                  post_assign -- a list of attribute name/value pairs
                  success -- whether the job is considered successful
                             when it is cancelled (boolean, default False)
        """
        Task.__init__(self, parent, name, **kwargs)
        self.cancel_successfully = kwargs.get('success', False)


    def test(self):
        """
        Checks whether all required attributes are set. Throws an exception
        if an error was detected.
        """
        Task.test(self)
        if len(self.outputs) > 0:
            raise WorkflowException(self, 'CancelJob with an output.')


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.

        instance -- the instance in which this method is executed
        """
        instance.job.cancel(self.cancel_successfully)
        return Task._on_complete_hook(self, instance)
@ -0,0 +1,41 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task
from Trigger import Trigger

class CancelTask(Trigger):
    """
    This class implements a trigger that cancels another task (branch).
    If more than one input is connected, the task performs an implicit
    multi merge.
    If more than one output is connected, the task performs an implicit
    parallel split.
    """

    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.

        instance -- the instance in which this method is executed
        """
        for task_name in self.context:
            task = instance.job.get_task_from_name(task_name)
            for node in instance._get_root()._find_any(task):
                node.cancel()
        return Task._on_complete_hook(self, instance)
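The cancellation that CancelTask triggers ultimately runs TaskInstance.cancel(): finished nodes only forward the cancel to their children, while unfinished nodes are marked cancelled and drop their unfinished children. The sketch below illustrates that recursion on a hypothetical minimal Node class (not SpiffWorkflow's API):

```python
# Standalone sketch of the recursive cancel used by TaskInstance.cancel().
class Node:
    def __init__(self, finished=False, children=None):
        self.finished = finished
        self.cancelled = False
        self.children = children or []

    def cancel(self):
        # A finished node is left alone, but its children are still visited.
        if self.finished:
            for child in self.children:
                child.cancel()
            return
        # An unfinished node is cancelled and drops its unfinished children,
        # mirroring _set_state(CANCELLED | ...) followed by _drop_children().
        self.cancelled = True
        self.children = [c for c in self.children if c.finished]

leaf = Node()
root = Node(finished=True, children=[Node(children=[leaf])])
root.cancel()
assert root.children[0].cancelled
assert leaf not in root.children[0].children
```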
@ -0,0 +1,63 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task
from Trigger import Trigger

class Choose(Trigger):
    """
    This class implements a task that causes an associated MultiChoice
    task to select the tasks with the specified name.
    If more than one input is connected, the task performs an implicit
    multi merge.
    If more than one output is connected, the task performs an implicit
    parallel split.
    """

    def __init__(self, parent, name, context, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the task (string)
        context -- the name of the MultiChoice task that is instructed to
                   select the specified outputs.
        kwargs -- may contain the following keys:
                  choice -- the list of tasks that is selected.
        """
        assert parent is not None
        assert name is not None
        assert context is not None
        Task.__init__(self, parent, name, **kwargs)
        self.context = context
        self.choice = kwargs.get('choice', [])


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.

        instance -- the instance in which this method is executed
        """
        context = instance.job.get_task_from_name(self.context)
        for node in instance.job.task_tree:
            if node.thread_id != instance.thread_id:
                continue
            if node.task == context:
                node.trigger(self.choice)
        return Task._on_complete_hook(self, instance)
@ -0,0 +1,79 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import re
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from MultiChoice import MultiChoice

class ExclusiveChoice(MultiChoice):
    """
    This class represents an exclusive choice (an if condition) task
    where precisely one outgoing instance is selected. If none of the
    given conditions matches, a default task is selected.
    It has one or more inputs and two or more outputs.
    """
    def __init__(self, parent, name, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the pattern (string)
        """
        MultiChoice.__init__(self, parent, name, **kwargs)
        self.default_task = None


    def connect(self, task):
        """
        Connects the task that is executed if no other condition matches.

        task -- the following task
        """
        assert self.default_task is None
        self.outputs.append(task)
        self.default_task = task
        task._connect_notify(self)


    def test(self):
        """
        Checks whether all required attributes are set. Throws an exception
        if an error was detected.
        """
        MultiChoice.test(self)
        if self.default_task is None:
            raise WorkflowException(self, 'A default output is required.')


    def _predict_hook(self, instance):
        instance._update_children(self.outputs, TaskInstance.MAYBE)
        instance._set_likely_task(self.default_task)


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.
        """
        # Find the first matching condition.
        output = self.default_task
        for condition, task in self.cond_tasks:
            if condition is None or condition._matches(instance):
                output = task
                break

        instance._update_children(output)
        return True
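The selection loop above is a first-match-wins dispatch with a default fallback. The following standalone sketch (plain callables stand in for SpiffWorkflow's condition operators; the names are illustrative, not the library's API) shows the same control flow:

```python
# Standalone sketch of the exclusive-choice selection: walk the
# (condition, task) pairs in order and fall back to the default when no
# condition matches. `None` acts as an always-true condition.
def select_output(cond_tasks, default, attributes):
    for condition, task in cond_tasks:
        if condition is None or condition(attributes):
            return task
    return default

pairs = [(lambda a: a.get('x', 0) > 3, 'big'),
         (lambda a: a.get('x', 0) <= 3, 'small')]
assert select_output(pairs, 'default', {'x': 5}) == 'big'
assert select_output([], 'default', {'x': 5}) == 'default'
```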
@ -0,0 +1,65 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task

class Gate(Task):
    """
    This class implements a task that can only execute when another
    specified task is completed.
    If more than one input is connected, the task performs an implicit
    multi merge.
    If more than one output is connected, the task performs an implicit
    parallel split.
    """

    def __init__(self, parent, name, context, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the task (string)
        context -- the name of the task that needs to complete before this
                   task can execute.
        """
        assert parent is not None
        assert name is not None
        assert context is not None
        Task.__init__(self, parent, name, **kwargs)
        self.context = context


    def _update_state_hook(self, instance):
        task = instance.job.get_task_from_name(self.context)
        root_node = instance.job.task_tree
        for node in root_node._find_any(task):
            if node.thread_id != instance.thread_id:
                continue
            if not node._has_state(TaskInstance.COMPLETED):
                instance._set_state(TaskInstance.WAITING)
                return False
        return Task._update_state_hook(self, instance)


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.

        instance -- the instance in which this method is executed
        """
        return Task._on_complete_hook(self, instance)
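The `_update_state_hook` above parks the gate in WAITING until every node of the context task in the same thread has completed. A standalone sketch of that check (plain dicts stand in for task instances; constant values copied from TaskInstance, not the library's API):

```python
# Standalone sketch of the Gate check: proceed only when all context nodes
# in the same thread carry the COMPLETED flag, otherwise wait.
WAITING, READY, COMPLETED = 8, 16, 64

def update_gate(gate_node, context_nodes):
    for node in context_nodes:
        # Only nodes in the same thread block the gate.
        if node['thread_id'] != gate_node['thread_id']:
            continue
        if not node['state'] & COMPLETED:
            # Mirrors instance._set_state(TaskInstance.WAITING).
            gate_node['state'] = WAITING
            return False
    gate_node['state'] = READY
    return True

gate = {'thread_id': 1, 'state': 0}
assert not update_gate(gate, [{'thread_id': 1, 'state': 0}])
assert gate['state'] == WAITING
assert update_gate(gate, [{'thread_id': 1, 'state': COMPLETED}])
```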
@ -0,0 +1,245 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from SpiffWorkflow.Operators import valueof
from Task import Task

class Join(Task):
    """
    This class represents a task for synchronizing instances that were
    previously split using a conditional task, such as MultiChoice.
    It has two or more incoming branches and one or more outputs.
    """

    def __init__(self, parent, name, split_task = None, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the pattern (string)
        split_task -- the task that was previously used to split the
                      instance
        kwargs -- may contain the following keys:
                  threshold -- an integer that specifies how many incoming
                  branches need to complete before the task triggers.
                  When the limit is reached, the task fires but still
                  expects all other branches to complete.
                  May also be an attribute name, in which case the value is
                  read from the attribute with the given name at runtime.
                  cancel -- when set to True, remaining incoming branches
                  are cancelled as soon as the discriminator is activated.
        """
        Task.__init__(self, parent, name, **kwargs)
        self.split_task = split_task
        self.threshold = kwargs.get('threshold', None)
        self.cancel_remaining = kwargs.get('cancel', False)


    def _branch_is_complete(self, instance):
        # Determine whether that branch is now completed by checking whether
        # it has any waiting items other than myself in it.
        skip = None
        for node in TaskInstance.Iterator(instance, instance.NOT_FINISHED_MASK):
            # If the current node is a child of myself, ignore it.
            if skip is not None and node._is_descendant_of(skip):
                continue
            if node.task == self:
                skip = node
                continue
            return False
        return True


    def _branch_may_merge_at(self, instance):
        for node in instance:
            # Ignore nodes that were created by a trigger.
            if node._has_state(TaskInstance.TRIGGERED):
                continue
            # Merge found.
            if node.task == self:
                return True
            # If the node is not definite and has fewer children than its
            # task has outputs, the prediction may be incomplete (for
            # example, because a prediction is not yet possible at this time).
            if not node._is_definite() \
                and len(node.task.outputs) > len(node.children):
                return True
        return False


    def _fire(self, instance, waiting_nodes):
        """
        Fire, and cancel remaining tasks, if so requested.
        """
        # If this is a cancelling join, cancel all incoming branches,
        # except for the one that just completed.
        if self.cancel_remaining:
            for node in waiting_nodes:
                node.cancel()


    def _try_fire_unstructured(self, instance, force = False):
        # If the threshold was already reached, there is nothing else to do.
        if instance._has_state(TaskInstance.COMPLETED):
            return False
        if instance._has_state(TaskInstance.READY):
            return True

        # The default threshold is the number of inputs.
        threshold = valueof(instance, self.threshold)
        if threshold is None:
            threshold = len(self.inputs)

        # Look at the tree to find all places where this task is used.
        nodes = []
        for task in self.inputs:
            for node in instance.job.task_tree:
                if node.thread_id != instance.thread_id:
                    continue
                if node.task != task:
                    continue
                nodes.append(node)

        # Look up which instances have already completed.
        waiting_nodes = []
        completed = 0
        for node in nodes:
|
||||
if node.parent is None or node._has_state(TaskInstance.COMPLETED):
|
||||
completed += 1
|
||||
else:
|
||||
waiting_nodes.append(node)
|
||||
|
||||
# If the threshold was reached, get ready to fire.
|
||||
if force or completed >= threshold:
|
||||
self._fire(instance, waiting_nodes)
|
||||
return True
|
||||
|
||||
# We do NOT set the instance state to COMPLETED, because in
|
||||
# case all other incoming tasks get cancelled (or never reach
|
||||
# the Join for other reasons, such as reaching a stub branch), we
|
||||
# we need to revisit it.
|
||||
return False
|
||||
|
||||
|
||||
def _try_fire_structured(self, instance, force = False):
|
||||
# If the threshold was already reached, there is nothing else to do.
|
||||
if instance._has_state(TaskInstance.READY):
|
||||
return True
|
||||
if instance._has_state(TaskInstance.COMPLETED):
|
||||
return False
|
||||
|
||||
# Retrieve a list of all activated instances from the associated
|
||||
# task that did the conditional parallel split.
|
||||
split_node = instance._find_ancestor_from_name(self.split_task)
|
||||
if split_node is None:
|
||||
msg = 'Join with %s, which was not reached' % self.split_task
|
||||
raise WorkflowException(self, msg)
|
||||
nodes = split_node.task._get_activated_instances(split_node, instance)
|
||||
|
||||
# The default threshold is the number of branches that were started.
|
||||
threshold = valueof(instance, self.threshold)
|
||||
if threshold is None:
|
||||
threshold = len(nodes)
|
||||
|
||||
# Look up which instances have already completed.
|
||||
waiting_nodes = []
|
||||
completed = 0
|
||||
for node in nodes:
|
||||
# Refresh path prediction.
|
||||
node.task._predict(node)
|
||||
|
||||
if not self._branch_may_merge_at(node):
|
||||
completed += 1
|
||||
elif self._branch_is_complete(node):
|
||||
completed += 1
|
||||
else:
|
||||
waiting_nodes.append(node)
|
||||
|
||||
# If the threshold was reached, get ready to fire.
|
||||
if force or completed >= threshold:
|
||||
self._fire(instance, waiting_nodes)
|
||||
return True
|
||||
|
||||
# We do NOT set the instance state to COMPLETED, because in
|
||||
# case all other incoming tasks get cancelled (or never reach
|
||||
# the Join for other reasons, such as reaching a stub branch), we
|
||||
# need to revisit it.
|
||||
return False
|
||||
|
||||
|
||||
def try_fire(self, instance, force = False):
|
||||
if self.split_task is None:
|
||||
return self._try_fire_unstructured(instance, force)
|
||||
return self._try_fire_structured(instance, force)
|
||||
|
||||
|
||||
def _do_join(self, instance):
|
||||
if self.split_task:
|
||||
split_task = instance.job.get_task_from_name(self.split_task)
|
||||
split_node = instance._find_ancestor(split_task)
|
||||
else:
|
||||
split_node = instance.job.task_tree
|
||||
|
||||
# Find the inbound node that was completed last.
|
||||
last_changed = None
|
||||
thread_nodes = []
|
||||
for node in split_node._find_any(self):
|
||||
if node.thread_id != instance.thread_id:
|
||||
continue
|
||||
if self.split_task and node._is_descendant_of(instance):
|
||||
continue
|
||||
changed = node.parent.last_state_change
|
||||
if last_changed is None \
|
||||
or changed > last_changed.parent.last_state_change:
|
||||
last_changed = node
|
||||
thread_nodes.append(node)
|
||||
|
||||
# Mark all nodes in this thread that reference this task as
|
||||
# completed, except for the first one, which should be READY.
|
||||
for node in thread_nodes:
|
||||
if node == last_changed:
|
||||
self.signal_emit('entered', instance.job, instance)
|
||||
node._ready()
|
||||
else:
|
||||
node.state = TaskInstance.COMPLETED
|
||||
node._drop_children()
|
||||
return False
|
||||
|
||||
|
||||
def _on_trigger(self, instance):
|
||||
"""
|
||||
May be called to fire the Join before the incoming branches are
|
||||
completed.
|
||||
"""
|
||||
for node in instance.job.task_tree._find_any(self):
|
||||
if node.thread_id != instance.thread_id:
|
||||
continue
|
||||
return self._do_join(node)
|
||||
|
||||
|
||||
def _update_state_hook(self, instance):
|
||||
if not self.try_fire(instance):
|
||||
instance.state = TaskInstance.WAITING
|
||||
return False
|
||||
return self._do_join(instance)
|
||||
|
||||
|
||||
def _on_complete_hook(self, instance):
|
||||
"""
|
||||
Runs the task. Should not be called directly.
|
||||
Returns True if completed, False otherwise.
|
||||
"""
|
||||
return Task._on_complete_hook(self, instance)
|
|
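The threshold/cancel behaviour described in the Join docstring above is the classic "discriminator" pattern. A minimal, library-independent sketch of that rule (the `ThresholdJoin` class and its branch-name API are illustrative, not part of SpiffWorkflow):

```python
# Illustrative only: a minimal threshold join (discriminator). Branch
# states are tracked by name; the join fires once the number of completed
# branches reaches the threshold, and may cancel the branches that are
# still waiting -- mirroring Join's 'threshold' and 'cancel' kwargs.

class ThresholdJoin(object):
    def __init__(self, branches, threshold=None, cancel=False):
        self.states = dict((name, 'waiting') for name in branches)
        # Default threshold: all branches must complete.
        self.threshold = threshold if threshold is not None else len(branches)
        self.cancel = cancel
        self.fired = False

    def complete(self, name):
        """Mark one branch completed; fire when the threshold is reached."""
        if self.states[name] == 'waiting':
            self.states[name] = 'completed'
        done = sum(1 for s in self.states.values() if s == 'completed')
        if not self.fired and done >= self.threshold:
            self.fired = True
            if self.cancel:
                for k, s in self.states.items():
                    if s == 'waiting':
                        self.states[k] = 'cancelled'
        return self.fired

join = ThresholdJoin(['a', 'b', 'c'], threshold=2, cancel=True)
join.complete('a')          # threshold not reached yet
fired = join.complete('b')  # threshold reached; 'c' is cancelled
```

Note that, like the real Join, the sketch never "un-fires": once the threshold is reached, later completions are ignored.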
@ -0,0 +1,105 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import re
from SpiffWorkflow.Operators import *
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task

class MultiChoice(Task):
    """
    This class represents an if condition where multiple conditions may match
    at the same time, creating multiple instances.
    This task has one or more incoming branches and one or more outputs.
    """

    def __init__(self, parent, name, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the pattern (string)
        """
        Task.__init__(self, parent, name, **kwargs)
        self.cond_tasks = []
        self.choice     = None


    def connect(self, task):
        """
        Convenience wrapper around connect_if() where condition is set to None.
        """
        return self.connect_if(None, task)


    def connect_if(self, condition, task):
        """
        Connects a task that is executed if the condition DOES match.

        condition -- a condition (Condition)
        task -- the conditional task
        """
        assert task is not None
        self.outputs.append(task)
        self.cond_tasks.append((condition, task))
        task._connect_notify(self)


    def test(self):
        """
        Checks whether all required attributes are set. Throws an exception
        if an error was detected.
        """
        Task.test(self)
        if len(self.cond_tasks) < 1:
            raise WorkflowException(self, 'At least one output required.')
        for condition, task in self.cond_tasks:
            if task is None:
                raise WorkflowException(self, 'Condition with no task.')
            # A condition of None is valid; it marks an unconditional branch.
            if condition is None:
                continue


    def _on_trigger(self, instance, choice):
        """
        Lets a caller narrow down the choice by using a Choose trigger.
        """
        self.choice = choice


    def _predict_hook(self, instance):
        instance._update_children(self.outputs, TaskInstance.MAYBE)


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.
        """
        # Find all matching conditions.
        outputs = []
        for condition, output in self.cond_tasks:
            if condition is not None and not condition._matches(instance):
                continue
            if self.choice is not None and output.name not in self.choice:
                continue
            outputs.append(output)

        instance._update_children(outputs)
        return True
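The branch-selection rule in `_on_complete_hook` above (every matching condition fires, optionally narrowed by a `choice` set) can be sketched independently of SpiffWorkflow. The `multi_choice` helper, the lambda conditions, and the branch names here are illustrative assumptions, not library API:

```python
# Illustrative only: the branch-selection rule of a multi-choice split.
# Every branch whose condition matches is activated; an optional 'choice'
# set narrows the result further; a None condition always matches.

def multi_choice(cond_tasks, attributes, choice=None):
    """Return the names of all branches whose condition matches."""
    selected = []
    for condition, name in cond_tasks:
        # A condition of None always matches (unconditional branch).
        if condition is not None and not condition(attributes):
            continue
        if choice is not None and name not in choice:
            continue
        selected.append(name)
    return selected

branches = [
    (lambda a: a['amount'] > 100, 'approval'),
    (lambda a: a['amount'] > 0,   'billing'),
    (None,                        'audit'),
]
chosen = multi_choice(branches, {'amount': 250})
```

Unlike an exclusive choice, several branches may be selected at once; here `chosen` contains all three branch names.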
@ -0,0 +1,106 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from SpiffWorkflow.Operators import valueof
from Task import Task

class MultiInstance(Task):
    """
    When executed, this task performs a split on the current instance.
    The number of outgoing instances depends on the runtime value of a
    specified attribute.
    If more than one input is connected, the task performs an implicit
    multi merge.

    This task has one or more inputs and may have any number of outputs.
    """

    def __init__(self, parent, name, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the pattern (string)
        kwargs -- must contain one of the following:
                  times -- the number of instances to create.
        """
        assert 'times' in kwargs
        Task.__init__(self, parent, name, **kwargs)
        self.times = kwargs.get('times', None)


    def _find_my_instance(self, instance):
        for node in instance.job.task_tree:
            if node.thread_id != instance.thread_id:
                continue
            if node.task == self:
                return node
        return None


    def _on_trigger(self, instance):
        """
        May be called after execute() was already completed to create an
        additional outbound instance.
        """
        # Find a TaskInstance for this task.
        my_instance = self._find_my_instance(instance)
        for output in self.outputs:
            if my_instance._has_state(TaskInstance.COMPLETED):
                state = TaskInstance.READY | TaskInstance.TRIGGERED
            else:
                state = TaskInstance.FUTURE | TaskInstance.TRIGGERED
            node = my_instance._add_child(output, state)
            output._predict(node)


    def _get_predicted_outputs(self, instance):
        split_n = instance._get_internal_attribute('splits', 1)

        # Predict the outputs.
        outputs = []
        for i in range(split_n):
            outputs += self.outputs
        return outputs


    def _predict_hook(self, instance):
        split_n = valueof(instance, self.times)
        if split_n is None:
            return
        instance._set_internal_attribute(splits = split_n)

        # Create the outgoing nodes.
        outputs = []
        for i in range(split_n):
            outputs += self.outputs

        if instance._has_state(TaskInstance.LIKELY):
            child_state = TaskInstance.LIKELY
        else:
            child_state = TaskInstance.FUTURE
        instance._update_children(outputs, child_state)


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.
        """
        outputs = self._get_predicted_outputs(instance)
        instance._update_children(outputs)
        return True
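The core of `_predict_hook` and `_get_predicted_outputs` above is simply that the child list is the output list repeated once per requested instance. A standalone sketch of that expansion (the helper name is illustrative):

```python
# Illustrative only: how a multi-instance split expands its outputs.
# The resulting child list is the output list repeated 'times' times,
# where 'times' is resolved at runtime (e.g. from an attribute).

def expand_outputs(outputs, times):
    """Repeat the output list once per requested instance."""
    expanded = []
    for _ in range(times):
        expanded += outputs
    return expanded

children = expand_outputs(['work_item'], 3)
```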
@ -0,0 +1,52 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task

class ReleaseMutex(Task):
    """
    This class implements a task that releases a mutex (lock), protecting
    a section of the workflow from being accessed by other sections.
    If more than one input is connected, the task performs an implicit
    multi merge.
    If more than one output is connected, the task performs an implicit
    parallel split.
    """

    def __init__(self, parent, name, mutex, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the task (string)
        mutex -- the mutex that should be released
        """
        assert mutex is not None
        Task.__init__(self, parent, name, **kwargs)
        self.mutex = mutex


    def _on_complete_hook(self, task_instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.

        task_instance -- the task_instance in which this method is executed
        """
        mutex = task_instance.job.get_mutex(self.mutex)
        mutex.unlock()
        return Task._on_complete_hook(self, task_instance)
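The tasks in this commit acquire locks via `mutex.testandset()` and release them with `mutex.unlock()` (see `Task._on_ready` and `ReleaseMutex` above). A minimal non-blocking mutex with that interface could look like the following sketch (the `SimpleMutex` class is an illustrative assumption, not the library's actual mutex implementation):

```python
# Illustrative only: a minimal non-blocking mutex with the
# testandset()/unlock() interface used by the tasks above.
# testandset() returns True only if the lock was free and is now held.

class SimpleMutex(object):
    def __init__(self):
        self.locked = False

    def testandset(self):
        """Acquire the lock if it is free; never block."""
        if self.locked:
            return False
        self.locked = True
        return True

    def unlock(self):
        self.locked = False

m = SimpleMutex()
first  = m.testandset()   # acquires the lock
second = m.testandset()   # fails: already held
m.unlock()                # what ReleaseMutex does on completion
```

The non-blocking semantics matter: `_on_ready` returns False when `testandset()` fails, so the task simply stays un-ready and is retried later instead of blocking the engine.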
@ -0,0 +1,59 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task

class StartTask(Task):
    """
    This class implements the task that is placed at the beginning
    of each workflow. The task has no inputs and at least one output.
    If more than one output is connected, the task does an implicit
    parallel split.
    """

    def __init__(self, parent, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        """
        Task.__init__(self, parent, 'Start', **kwargs)


    def _connect_notify(self, task):
        """
        Called by the previous task to let us know that it exists.
        """
        raise WorkflowException(self, 'StartTask can not have any inputs.')


    def _update_state(self, instance):
        if not self._update_state_hook(instance):
            return
        self.signal_emit('entered', instance.job, instance)
        instance._ready()


    def test(self):
        """
        Checks whether all required attributes are set. Throws an exception
        if an error was detected.
        """
        if len(self.inputs) != 0:
            raise WorkflowException(self, 'StartTask with an input.')
        elif len(self.outputs) < 1:
            raise WorkflowException(self, 'No output task connected.')
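StartTask.test above enforces a purely structural rule. Sketched against plain lists (the `validate_start` helper is illustrative only):

```python
# Illustrative only: the structural rule StartTask.test enforces.
# A start task must have no inputs and at least one output; the
# function returns the error message, or None if the shape is valid.

def validate_start(inputs, outputs):
    if len(inputs) != 0:
        return 'StartTask with an input.'
    if len(outputs) < 1:
        return 'No output task connected.'
    return None

error = validate_start([], ['first_step'])
```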
@ -0,0 +1,133 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import os.path
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from SpiffWorkflow.Operators import valueof
from SpiffWorkflow.Storage import XmlReader
from Task import Task
import SpiffWorkflow.Job

class SubWorkflow(Task):
    """
    A SubWorkflow is a task that wraps a Workflow, such that you can re-use it
    in multiple places as if it were a task.
    If more than one input is connected, the task performs an implicit
    multi merge.
    If more than one output is connected, the task performs an implicit
    parallel split.
    """

    def __init__(self, parent, name, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the task (string)
        kwargs -- may contain the following keys:
                  file -- the name of a file containing the workflow
                      that is started when this task runs
        """
        assert parent is not None
        assert name is not None
        assert 'file' in kwargs
        Task.__init__(self, parent, name, **kwargs)
        self.file       = kwargs.get('file',       None)
        self.in_assign  = kwargs.get('in_assign',  [])
        self.out_assign = kwargs.get('out_assign', [])
        if 'file' in kwargs:
            dirname   = os.path.dirname(parent.file)
            self.file = os.path.join(dirname, kwargs['file'])


    def test(self):
        Task.test(self)
        if self.file is not None and not os.path.exists(self.file):
            raise WorkflowException(self, 'File does not exist: %s' % self.file)


    def _predict_hook(self, instance):
        outputs = [node.task for node in instance.children]
        for output in self.outputs:
            if output not in outputs:
                outputs.insert(0, output)
        if instance._has_state(TaskInstance.LIKELY):
            instance._update_children(outputs, TaskInstance.LIKELY)
        else:
            instance._update_children(outputs, TaskInstance.FUTURE)


    def _on_ready_before_hook(self, instance):
        file = valueof(instance, self.file)
        xml_reader = XmlReader()
        workflow_list = xml_reader.parse_file(file)
        workflow = workflow_list[0]
        outer_job = instance.job.outer_job
        subjob = SpiffWorkflow.Job(workflow, parent = outer_job)
        subjob.signal_connect('completed', self._on_subjob_completed, instance)

        # Integrate the tree of the subjob into the tree of this job.
        instance._update_children(self.outputs, TaskInstance.FUTURE)
        for child in instance.children:
            child._inherit_attributes()
        for child in subjob.task_tree.children:
            instance.children.insert(0, child)
            child.parent = instance

        instance._set_internal_attribute(subjob = subjob)
        return True


    def _on_ready_hook(self, instance):
        # Assign variables, if so requested.
        subjob = instance._get_internal_attribute('subjob')
        for child in subjob.task_tree.children:
            for assignment in self.in_assign:
                assignment.assign(instance, child)

        self._predict(instance)
        for child in subjob.task_tree.children:
            child.task._update_state(child)
        return True


    def _on_subjob_completed(self, subjob, instance):
        # Assign variables, if so requested.
        for child in instance.children:
            if child.task in self.outputs:
                for assignment in self.out_assign:
                    assignment.assign(subjob, child)

                # Alright, abusing that hook and sending the signal is
                # just evil, but it works.
                if not child.task._update_state_hook(child):
                    return
                child.task.signal_emit('entered', child.job, child)
                child._ready()


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.

        instance -- the instance in which this method is executed
        """
        for child in instance.children:
            if child.task in self.outputs:
                continue
            child.task._update_state(child)
        return True
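SubWorkflow's `in_assign` and `out_assign` above map attributes into the subworkflow on entry and back out on completion. The copy-in/copy-out idea, sketched with plain dicts (the `apply_assignments` helper and pair format are illustrative assumptions, not the library's Assign API):

```python
# Illustrative only: the attribute-mapping idea behind in_assign and
# out_assign. On entry, selected attributes are copied from the outer
# scope into the subworkflow; on completion, results are copied back.

def apply_assignments(source, target, assignments):
    """Copy attributes per (target_name, source_name) pairs."""
    for left, right in assignments:
        target[left] = source[right]
    return target

outer = {'order_id': 42, 'customer': 'acme'}
# Copy-in: the subworkflow sees 'id', not the outer attribute name.
inner = apply_assignments(outer, {}, [('id', 'order_id')])
```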
@ -0,0 +1,358 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.Trackable import Trackable
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from SpiffWorkflow.Operators import valueof

class Assign(object):
    def __init__(self, left_attribute, **kwargs):
        """
        Constructor.

        kwargs -- must contain one of right_attribute/right.
        """
        assert left_attribute is not None
        assert 'right_attribute' in kwargs or 'right' in kwargs
        self.left_attribute  = left_attribute
        self.right_attribute = kwargs.get('right_attribute', None)
        self.right           = kwargs.get('right',           None)

    def assign(self, from_obj, to_obj):
        # Fetch the value of the right expression.
        if self.right is not None:
            right = self.right
        else:
            right = from_obj.get_attribute(self.right_attribute)
        to_obj.set_attribute(**{str(self.left_attribute): right})
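Assign above copies either a constant (`right`) or another attribute's value (`right_attribute`) into the left attribute. The same rule over a plain dict (the `assign` function here is an illustrative sketch, not the library's Assign):

```python
# Illustrative only: Assign's rule as a dict operation. The value comes
# from a constant ('right') if given, otherwise from another attribute
# ('right_attribute'); it is stored under the left attribute name.

def assign(left, attributes, right=None, right_attribute=None):
    """Set attributes[left] from a constant or from another attribute."""
    if right is not None:
        value = right
    else:
        value = attributes[right_attribute]
    attributes[left] = value
    return attributes

attrs = assign('status', {'code': 200}, right_attribute='code')
attrs = assign('retries', attrs, right=3)
```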


class Task(Trackable):
    """
    This class implements a task with one or more inputs and
    any number of outputs.
    If more than one input is connected, the task performs an implicit
    multi merge.
    If more than one output is connected, the task performs an implicit
    parallel split.

    Tasks provide the following signals:
      - *entered*: called when the state changes to READY or WAITING, at a
        time where properties are not yet initialized.
      - *reached*: called when the state changes to READY or WAITING, at a
        time where properties are already initialized using property_assign
        and pre-assign.
      - *ready*: called when the state changes to READY, at a time where
        properties are already initialized using property_assign and
        pre-assign.
      - *completed*: called when the state changes to COMPLETED, at a time
        before the post-assign variables are assigned.
      - *cancelled*: called when the state changes to CANCELLED, at a time
        before the post-assign variables are assigned.
      - *finished*: called when the state changes to COMPLETED or CANCELLED,
        at the last possible time and after the post-assign variables are
        assigned.
    """

    def __init__(self, parent, name, **kwargs):
        """
        Constructor. May also have properties/attributes passed.

        The difference between the assignment of a property using
        property_assign versus pre_assign and post_assign is that
        changes made using property_assign are task-local, i.e. they are
        not visible to other tasks.
        Similarly, "defines" are properties that, once defined, can no
        longer be modified.

        parent -- a reference to the parent (Task)
        name -- a name for the task (string)
        kwargs -- may contain the following keys:
                  lock -- a list of locks that is acquired on entry of
                      execute() and released on leave of execute().
                  property_assign -- a list of attribute name/value pairs
                  pre_assign -- a list of attribute name/value pairs
                  post_assign -- a list of attribute name/value pairs
        """
        assert parent is not None
        assert name is not None
        Trackable.__init__(self)
        self._parent     = parent
        self.id          = None
        self.name        = str(name)
        self.description = kwargs.get('description', '')
        self.inputs      = []
        self.outputs     = []
        self.manual      = False
        self.internal    = False  # Only for easing debugging.
        self.cancelled   = False
        self.properties  = kwargs.get('properties',  {})
        self.defines     = kwargs.get('defines',     {})
        self.pre_assign  = kwargs.get('pre_assign',  [])
        self.post_assign = kwargs.get('post_assign', [])
        self.locks       = kwargs.get('lock',        [])
        self.lookahead   = 2  # Maximum number of MAYBE predictions.
        self._parent._add_notify(self)
        self.properties.update(self.defines)
        assert self.id is not None


    def _connect_notify(self, task):
        """
        Called by the previous task to let us know that it exists.

        task -- the task by which this method is executed
        """
        self.inputs.append(task)


    def _get_activated_instances(self, instance, destination):
        """
        Returns the list of instances that were activated in the previous
        call of execute(). Only returns instances that point towards the
        destination node, i.e. those which have destination as a
        descendant.

        instance -- the instance of this task
        destination -- the child instance
        """
        return instance.children


    def _get_activated_threads(self, instance):
        """
        Returns the list of threads that were activated in the previous
        call of execute().

        instance -- the instance of this task
        """
        return instance.children


    def set_property(self, **kwargs):
        """
        Defines the given property name/value pairs.
        """
        for key in kwargs:
            if key in self.defines:
                msg = "Property %s can not be modified" % key
                raise WorkflowException(self, msg)
        self.properties.update(kwargs)


    def get_property(self, name, default = None):
        """
        Returns the value of the property with the given name, or the given
        default value if the property does not exist.

        name -- a property name (string)
        default -- the default value that is returned if the property does
                   not exist.
        """
        return self.properties.get(name, default)


    def connect(self, task):
        """
        Connect the *following* task to this one. In other words, the
        given task is added as an output task.

        task -- the task to connect to.
        """
        self.outputs.append(task)
        task._connect_notify(self)


    def test(self):
        """
        Checks whether all required attributes are set. Throws an exception
        if an error was detected.
        """
        if self.id is None:
            raise WorkflowException(self, 'Task is not yet instantiated.')
        if len(self.inputs) < 1:
            raise WorkflowException(self, 'No input task connected.')


    def _predict(self, instance, seen = None, looked_ahead = 0):
        """
        Updates the branch such that all possible future routes are added
        with the LIKELY flag.

        Should NOT be overwritten! Instead, overwrite the hook (_predict_hook).
        """
        if seen is None:
            seen = []
        elif self in seen:
            return
        if not instance._is_definite():
            seen.append(self)
        if instance._has_state(TaskInstance.MAYBE):
            looked_ahead += 1
            if looked_ahead >= self.lookahead:
                return
        if not instance._is_finished():
            self._predict_hook(instance)
        for node in instance.children:
            node.task._predict(node, seen[:], looked_ahead)


    def _predict_hook(self, instance):
        if instance._is_definite():
            child_state = TaskInstance.FUTURE
        else:
            child_state = TaskInstance.LIKELY
        instance._update_children(self.outputs, child_state)


    def _update_state(self, instance):
        instance._inherit_attributes()
        if not self._update_state_hook(instance):
            return
        self.signal_emit('entered', instance.job, instance)
        instance._ready()


    def _update_state_hook(self, instance):
        was_predicted = instance._is_predicted()
        if not instance.parent._is_finished():
            instance.state = TaskInstance.FUTURE
        if was_predicted:
            self._predict(instance)
        if instance.parent._is_finished():
            return True
        return False


    def _on_ready(self, instance):
        """
        Return True on success, False otherwise.

        instance -- the instance in which this method is executed
        """
        assert instance is not None
        assert not self.cancelled
        self.test()

        # Acquire locks, if any.
        for lock in self.locks:
            mutex = instance.job.get_mutex(lock)
            if not mutex.testandset():
                return False

        # Assign variables, if so requested.
        for assignment in self.pre_assign:
            assignment.assign(instance, instance)

        # Run task-specific code.
        result = self._on_ready_before_hook(instance)
        self.signal_emit('reached', instance.job, instance)
        if result:
            result = self._on_ready_hook(instance)

        # Run user code, if any.
        if result:
            result = self.signal_emit('ready', instance.job, instance)

        if result:
            # Assign variables, if so requested.
            for assignment in self.post_assign:
                assignment.assign(instance, instance)

            # Release locks, if any.
            for lock in self.locks:
                mutex = instance.job.get_mutex(lock)
                mutex.unlock()
        return result


    def _on_ready_before_hook(self, instance):
        """
        A hook into _on_ready() that does the task specific work.

        instance -- the instance in which this method is executed
|
||||
"""
|
||||
return True
|
||||
|
||||
|
||||
def _on_ready_hook(self, instance):
|
||||
"""
|
||||
A hook into _on_ready() that does the task specific work.
|
||||
|
||||
instance -- the instance in which this method is executed
|
||||
"""
|
||||
return True
|
||||
|
||||
|
||||
def _on_cancel(self, instance):
|
||||
"""
|
||||
May be called by another task to cancel the operation before it was
|
||||
completed.
|
||||
|
||||
Return True on success, False otherwise.
|
||||
|
||||
instance -- the instance in which this method is executed
|
||||
"""
|
||||
return True
|
||||
|
||||
|
||||
def _on_trigger(self, instance):
|
||||
"""
|
||||
May be called by another task to trigger a task-specific
|
||||
event.
|
||||
|
||||
Return True on success, False otherwise.
|
||||
|
||||
instance -- the instance in which this method is executed
|
||||
"""
|
||||
raise NotImplementedError("Trigger not supported by this task.")
|
||||
|
||||
|
||||
def _on_complete(self, instance):
|
||||
"""
|
||||
Return True on success, False otherwise. Should not be overwritten,
|
||||
overwrite _on_complete_hook() instead.
|
||||
|
||||
instance -- the instance in which this method is executed
|
||||
"""
|
||||
assert instance is not None
|
||||
assert not self.cancelled
|
||||
|
||||
if instance.job.debug:
|
||||
print "Executing node:", instance.get_name()
|
||||
|
||||
if not self._on_complete_hook(instance):
|
||||
return False
|
||||
|
||||
# Notify the Job.
|
||||
instance.job._instance_completed_notify(instance)
|
||||
|
||||
if instance.job.debug:
|
||||
instance.job.outer_job.task_tree.dump()
|
||||
|
||||
self.signal_emit('completed', instance.job, instance)
|
||||
return True
|
||||
|
||||
|
||||
def _on_complete_hook(self, instance):
|
||||
"""
|
||||
A hook into _on_complete() that does the task specific work.
|
||||
|
||||
instance -- the instance in which this method is executed
|
||||
"""
|
||||
# If we have more than one output, implicitly split.
|
||||
instance._update_children(self.outputs)
|
||||
return True
|
|
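The connect() / _connect_notify() pair above wires tasks into a directed graph: the caller records the new output, and the connected task records the caller as an input. A minimal, self-contained sketch of that wiring (the `Node` class is a hypothetical stand-in for Task, keeping only the linking logic):

```python
# Sketch of the connect()/_connect_notify() pattern; `Node` is illustrative,
# not the actual Task API.
class Node:
    def __init__(self, name):
        self.name = name
        self.inputs = []
        self.outputs = []

    def connect(self, task):
        # Add the *following* task as an output...
        self.outputs.append(task)
        # ...and let it record this node as one of its inputs.
        task._connect_notify(self)

    def _connect_notify(self, parent):
        self.inputs.append(parent)

a, b = Node('a'), Node('b')
a.connect(b)
# b is now an output of a, and a an input of b.
```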
@ -0,0 +1,134 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from SpiffWorkflow.Operators import valueof
from Task import Task
from Join import Join

class ThreadMerge(Join):
    """
    This class represents a task for synchronizing instances that were
    previously split using a ThreadSplit.
    It has two or more incoming branches and one or more outputs.
    """

    def __init__(self, parent, name, split_task, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the pattern (string)
        split_task -- the name of the task that was previously used to split
                      the instance
        kwargs -- may contain the following keys:
                  threshold -- an integer that specifies how many incoming
                               branches need to complete before the task
                               triggers. When the limit is reached, the task
                               fires but still expects all other branches to
                               complete.
                  cancel -- when set to True, remaining incoming branches
                            are cancelled as soon as the discriminator is
                            activated.
        """
        assert split_task is not None
        Join.__init__(self, parent, name, split_task, **kwargs)


    def try_fire(self, instance):
        # If the threshold was already reached, there is nothing else to do.
        if instance._has_state(TaskInstance.COMPLETED):
            return False
        if instance._has_state(TaskInstance.READY):
            return True

        # Retrieve a list of all activated instances from the associated
        # task that did the conditional parallel split.
        split_node = instance._find_ancestor_from_name(self.split_task)
        if split_node is None:
            msg = 'Join with %s, which was not reached' % self.split_task
            raise WorkflowException(self, msg)
        nodes = split_node.task._get_activated_threads(split_node)

        # The default threshold is the number of threads that were started.
        threshold = valueof(instance, self.threshold)
        if threshold is None:
            threshold = len(nodes)

        # Look up which instances have already completed.
        waiting_nodes = []
        completed = 0
        for node in nodes:
            # Refresh path prediction.
            node.task._predict(node)

            if self._branch_is_complete(node):
                completed += 1
            else:
                waiting_nodes.append(node)

        # If the threshold was reached, get ready to fire.
        if completed >= threshold:
            # If this is a cancelling join, cancel all incoming branches,
            # except for the one that just completed.
            if self.cancel_remaining:
                for node in waiting_nodes:
                    node.cancel()
            return True

        # We do NOT set the instance state to COMPLETED, because in
        # case all other incoming tasks get cancelled (or never reach
        # the ThreadMerge for other reasons, such as reaching a stub branch),
        # we need to revisit it.
        return False


    def _update_state_hook(self, instance):
        if not self.try_fire(instance):
            instance._set_state(TaskInstance.WAITING)
            return False

        split_task = instance.job.get_task_from_name(self.split_task)
        split_node = instance._find_ancestor(split_task)

        # Find the inbound node that was completed last.
        last_changed = None
        nodes = []
        for node in split_node._find_any(self):
            if self.split_task and node._is_descendant_of(instance):
                continue
            changed = node.parent.last_state_change
            if last_changed is None \
              or changed > last_changed.parent.last_state_change:
                last_changed = node
            nodes.append(node)

        # Mark all nodes in this thread that reference this task as
        # completed, except for the last-changed one, which should be READY.
        for node in nodes:
            if node == last_changed:
                self.signal_emit('entered', instance.job, instance)
                node._ready()
            else:
                node.state = TaskInstance.COMPLETED
                node._drop_children()
        return False


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.
        """
        return Task._on_complete_hook(self, instance)
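The threshold check in try_fire() above reduces to a simple count comparison: fire once the number of completed branches reaches the threshold, which defaults to the number of started threads. A hedged sketch of just that decision (the helper name `should_fire` is illustrative, not part of the library):

```python
def should_fire(completed, n_threads, threshold=None):
    # The default threshold is the number of threads that were started.
    if threshold is None:
        threshold = n_threads
    return completed >= threshold

# With three started threads, the default join waits for all of them,
# while an explicit threshold fires earlier (a "discriminator").
```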
@ -0,0 +1,140 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task
from ThreadStart import ThreadStart

class ThreadSplit(Task):
    """
    When executed, this task performs a split on the current instance.
    The number of outgoing instances depends on the runtime value of a
    specified attribute.
    If more than one input is connected, the task performs an implicit
    multi merge.

    This task has one or more inputs and may have any number of outputs.
    """

    def __init__(self, parent, name, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the pattern (string)
        kwargs -- must contain one of the following:
                  times -- the number of instances to create.
                  times_attribute -- the name of the attribute that
                                     specifies the number of outgoing
                                     instances.
        """
        assert kwargs.has_key('times_attribute') or kwargs.has_key('times')
        Task.__init__(self, parent, name, **kwargs)
        self.times_attribute = kwargs.get('times_attribute', None)
        self.times = kwargs.get('times', None)
        self.thread_starter = ThreadStart(parent, **kwargs)
        self.outputs.append(self.thread_starter)
        self.thread_starter._connect_notify(self)


    def connect(self, task):
        """
        Connect the *following* task to this one. In other words, the
        given task is added as an output task.

        task -- the task to connect to.
        """
        self.thread_starter.outputs.append(task)
        task._connect_notify(self.thread_starter)


    def _find_my_instance(self, job):
        for node in job.task_tree:
            if node.task == self:
                return node
        return None


    def _get_activated_instances(self, instance, destination):
        """
        Returns the list of instances that were activated in the previous
        call of execute(). Only returns instances that point towards the
        destination node, i.e. those which have destination as a
        descendant.

        instance -- the instance of this task
        destination -- the child instance
        """
        node = destination._find_ancestor(self.thread_starter)
        return self.thread_starter._get_activated_instances(node, destination)


    def _get_activated_threads(self, instance):
        """
        Returns the list of threads that were activated in the previous
        call of execute().

        instance -- the instance of this task
        """
        return instance.children


    def _on_trigger(self, instance):
        """
        May be called after execute() was already completed to create an
        additional outbound instance.
        """
        # Find a TaskInstance for this task.
        my_instance = self._find_my_instance(instance.job)
        for output in self.outputs:
            state = TaskInstance.READY | TaskInstance.TRIGGERED
            new_instance = my_instance.add_child(output, state)


    def _predict_hook(self, instance):
        split_n = instance.get_attribute('split_n', self.times)
        if split_n is None:
            split_n = instance.get_attribute(self.times_attribute, 1)

        # Predict the outputs.
        outputs = []
        for i in range(split_n):
            outputs.append(self.thread_starter)
        if instance._is_definite():
            child_state = TaskInstance.FUTURE
        else:
            child_state = TaskInstance.LIKELY
        instance._update_children(outputs, child_state)


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.
        """
        # Split, and remember the number of splits in the context data.
        split_n = self.times
        if split_n is None:
            split_n = instance.get_attribute(self.times_attribute)

        # Create the outgoing nodes.
        outputs = []
        for i in range(split_n):
            outputs.append(self.thread_starter)
        instance._update_children(outputs)
        return True
@ -0,0 +1,46 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task

class ThreadStart(Task):
    """
    This class implements the task that is placed at the beginning
    of each thread. It is NOT supposed to be used in the API; it is
    used internally only (by the ThreadSplit task).
    The task has no inputs and at least one output.
    If more than one output is connected, the task does an implicit
    parallel split.
    """

    def __init__(self, parent, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        """
        Task.__init__(self, parent, 'ThreadStart', **kwargs)
        self.internal = True


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.
        """
        instance._assign_new_thread_id()
        return Task._on_complete_hook(self, instance)
@ -0,0 +1,76 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from SpiffWorkflow.TaskInstance import TaskInstance
from SpiffWorkflow.Exception import WorkflowException
from Task import Task

class Trigger(Task):
    """
    This class implements a task that triggers an event on another
    task.
    If more than one input is connected, the task performs an implicit
    multi merge.
    If more than one output is connected, the task performs an implicit
    parallel split.
    """

    def __init__(self, parent, name, context, **kwargs):
        """
        Constructor.

        parent -- a reference to the parent (Task)
        name -- a name for the task (string)
        context -- a list of the names of tasks that are to be triggered
        """
        assert parent is not None
        assert name is not None
        assert context is not None
        assert type(context) == type([])
        Task.__init__(self, parent, name, **kwargs)
        self.context = context
        self.times = kwargs.get('times', 1)
        self.queued = 0


    def _on_trigger(self, instance):
        """
        Enqueue a trigger, such that this task triggers multiple times later
        when _on_complete() is called.
        """
        self.queued += 1
        # All instances that have already completed need to be put into
        # READY again.
        for node in instance.job.task_tree:
            if node.thread_id != instance.thread_id:
                continue
            if node.task == self and node._has_state(TaskInstance.COMPLETED):
                node.state = TaskInstance.FUTURE
                node._ready()


    def _on_complete_hook(self, instance):
        """
        Runs the task. Should not be called directly.
        Returns True if completed, False otherwise.

        instance -- the instance in which this method is executed
        """
        for i in range(self.times + self.queued):
            for task_name in self.context:
                task = instance.job.get_task_from_name(task_name)
                task._on_trigger(instance)
        self.queued = 0
        return Task._on_complete_hook(self, instance)
@ -0,0 +1,20 @@
from AcquireMutex import AcquireMutex
from CancelJob import CancelJob
from CancelTask import CancelTask
from Choose import Choose
from ExclusiveChoice import ExclusiveChoice
from Gate import Gate
from Join import Join
from MultiChoice import MultiChoice
from MultiInstance import MultiInstance
from ReleaseMutex import ReleaseMutex
from StartTask import StartTask
from SubWorkflow import SubWorkflow
from Task import Task, Assign
from ThreadMerge import ThreadMerge
from ThreadSplit import ThreadSplit
from Trigger import Trigger

import inspect
__all__ = [name for name, obj in locals().items()
           if not (name.startswith('_') or inspect.ismodule(obj))]
@ -0,0 +1,69 @@
# Copyright (C) 2007 Samuel Abels, http://debain.org
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License version 2, as
# published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA

class Slot(object):
    def __init__(self):
        self.subscribers = []

    def subscribe(self, user_func, *user_args):
        self.subscribers.append((user_func, user_args))

    def is_subscribed(self, user_func):
        return user_func in [pair[0] for pair in self.subscribers]

    def unsubscribe(self, user_func):
        remove = []
        for i, (func, user_args) in enumerate(self.subscribers):
            if func == user_func:
                remove.append(i)
        # Delete in reverse order so earlier indices remain valid.
        for i in reversed(remove):
            del self.subscribers[i]

    def n_subscribers(self):
        return len(self.subscribers)

    def signal_emit(self, *args, **kwargs):
        for func, user_args in self.subscribers:
            func(*args + user_args, **kwargs)


class Trackable(object):
    def __init__(self):
        self.slots = {}

    def signal_connect(self, name, func, *args):
        if not self.slots.has_key(name):
            self.slots[name] = Slot()
        self.slots[name].subscribe(func, *args)

    def signal_is_connected(self, name, func):
        if not self.slots.has_key(name):
            return False
        return self.slots[name].is_subscribed(func)

    def signal_disconnect(self, name, func):
        if not self.slots.has_key(name):
            return
        self.slots[name].unsubscribe(func)

    def signal_subscribers(self, name):
        if not self.slots.has_key(name):
            return 0
        return self.slots[name].n_subscribers()

    def signal_emit(self, name, *args, **kwargs):
        if not self.slots.has_key(name):
            return
        self.slots[name].signal_emit(*args, **kwargs)
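The Slot/Trackable pair above is a small publish/subscribe mechanism: extra arguments supplied at subscribe time are appended to the arguments supplied at emit time. A standalone sketch of that behavior (a simplified re-implementation for illustration, not an import of the class above):

```python
class MiniSlot:
    """Simplified version of the Slot idea: user args stored at
    subscribe time are appended to the emitted arguments."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, func, *user_args):
        self.subscribers.append((func, user_args))

    def emit(self, *args):
        for func, user_args in self.subscribers:
            func(*(args + user_args))

seen = []
slot = MiniSlot()
# 'extra' plays the role of the taken_path argument in the tests below.
slot.subscribe(lambda job, tag: seen.append((job, tag)), 'extra')
slot.emit('job1')  # the subscriber receives ('job1', 'extra')
```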
@ -0,0 +1,39 @@
# Copyright (C) 2007 Samuel Abels
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from Tasks import StartTask

class Workflow(object):
    """
    This class represents an entire workflow.
    """

    def __init__(self, name = '', filename = None):
        """
        Constructor.
        """
        self.name = name
        self.description = ''
        self.file = filename
        self.tasks = {}
        self.start = StartTask(self)


    def _add_notify(self, task):
        """
        Called by a task when it was added into the workflow.
        """
        self.tasks[task.name] = task
        task.id = len(self.tasks)
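_add_notify() above registers each task by name and assigns it a 1-based id equal to the insertion count. A tiny sketch of that scheme (the `Registry` class is a hypothetical stand-in for Workflow):

```python
class Registry:
    """Illustrative stand-in for Workflow's task registration."""
    def __init__(self):
        self.tasks = {}

    def add(self, name):
        # Register under the name; the id is the 1-based insertion count.
        self.tasks[name] = object()
        return len(self.tasks)

r = Registry()
# Successive registrations receive ids 1, 2, ...
```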
@ -0,0 +1,8 @@
from Job import Job
from Workflow import Workflow
from Exception import WorkflowException
from TaskInstance import TaskInstance

import inspect
__all__ = [name for name, obj in locals().items()
           if not (name.startswith('_') or inspect.ismodule(obj))]
@ -0,0 +1,5 @@
import SpiffWorkflow

import inspect
__all__ = [name for name, obj in locals().items()
           if not (name.startswith('_') or inspect.ismodule(obj))]
@ -0,0 +1,87 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', 'src'))

def suite():
    tests = ['testParseString', 'testParseFile', 'testRunWorkflow']
    return unittest.TestSuite(map(OpenWfeXmlReaderTest, tests))

from WorkflowTest import on_reached_cb, \
                         on_complete_cb, \
                         assert_same_path
from SpiffWorkflow import Job
from SpiffWorkflow.Storage import OpenWfeXmlReader
from xml.parsers.expat import ExpatError

class OpenWfeXmlReaderTest(unittest.TestCase):
    def setUp(self):
        self.reader = OpenWfeXmlReader()
        self.taken_path = []


    def on_reached_cb(self, job, instance):
        on_reached_cb(job, instance, [])
        instance.set_attribute(test_attribute1 = 'false')
        instance.set_attribute(test_attribute2 = 'true')
        return True


    def testParseString(self):
        self.assertRaises(ExpatError,
                          self.reader.parse_string,
                          '')
        self.reader.parse_string('<xml></xml>')


    def testParseFile(self):
        # File not found.
        self.assertRaises(IOError,
                          self.reader.parse_file,
                          'foo')

        # 0 byte sized file.
        self.assertRaises(ExpatError,
                          self.reader.parse_file,
                          os.path.join(os.path.dirname(__file__), 'xml/empty1.xml'))

        # File containing only "<xml></xml>".
        self.reader.parse_file(os.path.join(os.path.dirname(__file__), 'xml/empty2.xml'))

        # Read a complete workflow.
        self.reader.parse_file(os.path.join(os.path.dirname(__file__), 'xml/openwfe/workflow1.xml'))


    def testRunWorkflow(self):
        wf = self.reader.parse_file(os.path.join(os.path.dirname(__file__), 'xml/openwfe/workflow1.xml'))

        for name in wf[0].tasks:
            wf[0].tasks[name].signal_connect('reached', self.on_reached_cb)
            wf[0].tasks[name].signal_connect('completed', on_complete_cb, self.taken_path)

        job = Job(wf[0])
        try:
            job.complete_all()
        except:
            job.dump()
            raise

        path = [( 1, 'Start'),
                ( 2, 'concurrence_1'),
                ( 3, 'task_a1'),
                ( 4, 'task_a2'),
                ( 5, 'if_condition_1'),
                ( 6, 'task_a3'),
                ( 7, 'if_condition_1_end'),
                ( 8, 'if_condition_2'),
                ( 9, 'task_a5'),
                (10, 'if_condition_2_end'),
                ( 3, 'task_b1'),
                ( 4, 'task_b2'),
                ( 5, 'concurrence_1_end'),
                ( 6, 'task_c1'),
                ( 7, 'task_c2'),
                ( 8, 'End')]

        assert_same_path(self, path, self.taken_path)

if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,158 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', 'src'))

def suite():
    tests = ['testPattern']
    return unittest.TestSuite(map(PatternTest, tests))

from SpiffWorkflow.Tasks import *
from SpiffWorkflow import Workflow, Job, TaskInstance
from SpiffWorkflow.Storage import XmlReader
from xml.parsers.expat import ExpatError


def on_reached_cb(job, task, taken_path):
    reached_key = "%s_reached" % str(task.get_name())
    n_reached = task.get_attribute(reached_key, 0) + 1
    task.set_attribute(**{reached_key: n_reached,
                          'two': 2,
                          'three': 3,
                          'test_attribute1': 'false',
                          'test_attribute2': 'true'})

    # Collect a list of all attributes.
    atts = []
    for key, value in task.get_attributes().iteritems():
        if key in ['data',
                   'two',
                   'three',
                   'test_attribute1',
                   'test_attribute2']:
            continue
        if key.endswith('reached'):
            continue
        atts.append('='.join((key, str(value))))

    # Collect a list of all task properties.
    props = []
    for key, value in task.get_properties().iteritems():
        props.append('='.join((key, str(value))))
    #print "REACHED:", task.get_name(), atts, props

    # Store the list of attributes and properties in the job.
    atts = ';'.join(atts)
    props = ';'.join(props)
    old = task.get_attribute('data', '')
    data = task.get_name() + ': ' + atts + '/' + props + '\n'
    task.set_attribute(data = old + data)
    #print task.get_attributes()

    # In workflows that load a subworkflow, the newly loaded children
    # will not have on_reached_cb() assigned. By using this function, we
    # re-assign the function in every step, thus making sure that new
    # children also call on_reached_cb().
    for child in task.children:
        if not child.task.signal_is_connected('reached', on_reached_cb):
            child.task.signal_connect('reached', on_reached_cb, taken_path)
        if not child.task.signal_is_connected('completed', on_complete_cb):
            child.task.signal_connect('completed', on_complete_cb, taken_path)
    return True


def on_complete_cb(job, task, taken_path):
    # Record the path in an attribute.
    indent = ' ' * (task._get_depth() - 1)
    taken_path.append('%s%s' % (indent, task.get_name()))
    #print "COMPLETED:", task.get_name(), task.get_attributes()
    return True


class PatternTest(unittest.TestCase):
    def setUp(self):
        self.xml_path = ['xml/spiff/control-flow/',
                         'xml/spiff/data/',
                         'xml/spiff/resource/']
        self.reader = XmlReader()
        self.wf = None


    def testPattern(self):
        for dirname in self.xml_path:
            dirname = os.path.join(os.path.dirname(__file__), dirname)
            for filename in os.listdir(dirname):
                if not filename.endswith('.xml'):
                    continue
                self.testFile(os.path.join(dirname, filename))


    def testFile(self, xml_filename):
        try:
            #print '\n%s: ok' % xml_filename,
            workflow_list = self.reader.parse_file(xml_filename)
            self.testWorkflow(workflow_list[0], xml_filename)
        except:
            print '%s:' % xml_filename
            raise


    def testWorkflow(self, wf, xml_filename):
        taken_path = []
        for name in wf.tasks:
            wf.tasks[name].signal_connect('reached', on_reached_cb, taken_path)
            wf.tasks[name].signal_connect('completed', on_complete_cb, taken_path)

        # Execute all tasks within the Job.
        job = Job(wf)
        self.assert_(not job.is_completed(), 'Job is complete before start')
        try:
            job.complete_all(False)
        except:
            job.task_tree.dump()
            raise

        #job.task_tree.dump()
        self.assert_(job.is_completed(),
                     'complete_all() returned, but job is not complete\n'
                     + job.task_tree.get_dump())

        # Make sure that there are no waiting tasks left in the tree.
        for node in TaskInstance.Iterator(job.task_tree, TaskInstance.READY):
            job.task_tree.dump()
            raise Exception('Node with state READY: %s' % node.name)

        # Check whether the correct route was taken.
        filename = xml_filename + '.path'
        if os.path.exists(filename):
            file = open(filename, 'r')
            expected = file.read()
            file.close()
            taken_path = '\n'.join(taken_path) + '\n'
            error = '%s:\n' % xml_filename
            error += 'Expected:\n'
            error += '%s\n' % expected
            error += 'but got:\n'
            error += '%s\n' % taken_path
            self.assert_(taken_path == expected, error)

        # Check attribute availability.
        filename = xml_filename + '.data'
        if os.path.exists(filename):
            file = open(filename, 'r')
            expected = file.read()
            file.close()
            result = job.get_attribute('data', '')
            error = '%s:\n' % xml_filename
            error += 'Expected:\n'
            error += '%s\n' % expected
            error += 'but got:\n'
            error += '%s\n' % result
            self.assert_(result == expected, error)


if __name__ == '__main__':
    if len(sys.argv) == 2:
        test = PatternTest('testFile')
        test.setUp()
        test.testFile(sys.argv[1])
        sys.exit(0)
    unittest.TextTestRunner(verbosity = 1).run(suite())
@ -0,0 +1,89 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', 'src'))

def suite():
    tests = ['testPickle']
    return unittest.TestSuite(map(PersistenceTest, tests))

import pickle, pprint
from random import randint
from WorkflowTest import WorkflowTest, \
                         on_reached_cb, \
                         on_complete_cb, \
                         assert_same_path
from SpiffWorkflow import Job
from SpiffWorkflow.Storage import XmlReader

class PersistenceTest(WorkflowTest):
    def setUp(self):
        WorkflowTest.setUp(self)
        self.reader = XmlReader()
        self.data_file = 'data.pkl'
        self.taken_path = None


    def testPickleSingle(self, workflow, job):
        self.taken_path = {'reached': [],
                           'completed': []}
        for name, task in workflow.tasks.iteritems():
            task.signal_connect('reached',
                                on_reached_cb,
                                self.taken_path['reached'])
            task.signal_connect('completed',
                                on_complete_cb,
                                self.taken_path['completed'])

        # Execute a random number of steps.
        for i in xrange(randint(0, len(workflow.tasks))):
            job.complete_next()

        # Store the workflow instance in a file.
        output = open(self.data_file, 'wb')
        pickle.dump(job, output, -1)
        output.close()
        before = job.get_dump()

        # Load the workflow instance from a file and delete the file.
        input = open(self.data_file, 'rb')
        job = pickle.load(input)
        input.close()
        os.remove(self.data_file)
        after = job.get_dump()

        # Make sure that the state of the job did not change.
        self.assert_(before == after, 'Before:\n' + before + '\n' \
                                    + 'After:\n' + after + '\n')

        # Re-connect signals, because the pickle dump now only contains a
        # copy of self.taken_path.
        for name, task in job.workflow.tasks.iteritems():
            task.signal_disconnect('reached', on_reached_cb)
            task.signal_disconnect('completed', on_complete_cb)
            task.signal_connect('reached',
                                on_reached_cb,
                                self.taken_path['reached'])
            task.signal_connect('completed',
                                on_complete_cb,
                                self.taken_path['completed'])

        # Run the rest of the workflow.
        job.complete_all()
        after = job.get_dump()
        self.assert_(job.is_completed(), 'Job done, but not complete:' + after)
        assert_same_path(self,
                         self.expected_path,
                         self.taken_path['completed'])


    def testPickle(self):
        # Read a complete workflow.
        file = os.path.join(os.path.dirname(__file__), 'xml/spiff/workflow1.xml')

        for i in xrange(5):
            workflow_list = self.reader.parse_file(file)
            wf = workflow_list[0]
            job = Job(wf)
            self.testPickleSingle(wf, job)

if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,79 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', '..', 'src'))

def suite():
    tests = ['testInstall',
             'testJob',
             'testTask',
             'testWorkflow']
    return unittest.TestSuite(map(DBTest, tests))

import MySQLdb
from ConfigParser import RawConfigParser
from sqlalchemy import *
from sqlalchemy.orm import clear_mappers
from SpiffWorkflow.Server import WorkflowInfo, JobInfo, TaskInfo
import SpiffWorkflow.Server

class DBTest(unittest.TestCase):
    def connectDB(self):
        # Read config.
        cfg = RawConfigParser()
        cfg.read(os.path.join(os.path.dirname(__file__), 'unit_test.cfg'))
        host = cfg.get('database', 'host')
        db_name = cfg.get('database', 'db_name')
        user = cfg.get('database', 'user')
        password = cfg.get('database', 'password')

        # Connect to MySQL.
        auth = user + ':' + password
        dbn = 'mysql://' + auth + '@' + host + '/' + db_name
        self.engine = create_engine(dbn)
        clear_mappers()


    def setUp(self):
        self.connectDB()
        self.db = SpiffWorkflow.Server.DB(self.engine)


    def testInstall(self):
        self.assert_(self.db.uninstall())
        self.assert_(self.db.install())
        self.assert_(self.db.clear_database())
        self.assert_(self.db.uninstall())
        self.assert_(self.db.install())


    def testWorkflow(self):
        self.assert_(len(self.db.get_workflow_info(id = 1)) == 0)
        obj = WorkflowInfo('my/handle')
        self.db.save(obj)
        assert obj.id >= 0
        self.assert_(len(self.db.get_workflow_info(id = obj.id)) == 1)
        self.db.delete(obj)
        self.assert_(len(self.db.get_workflow_info(id = obj.id)) == 0)


    def testJob(self):
        self.assert_(len(self.db.get_job_info(id = 1)) == 0)
        obj = JobInfo()
        self.db.save(obj)
        assert obj.id >= 0
        self.assert_(len(self.db.get_job_info(id = obj.id)) == 1)
        self.db.delete(obj)
        self.assert_(len(self.db.get_job_info(id = obj.id)) == 0)


    def testTask(self):
        self.assert_(len(self.db.get_task_info(id = 1)) == 0)
        obj = TaskInfo()
        self.db.save(obj)
        assert obj.id >= 0
        self.assert_(len(self.db.get_task_info(id = obj.id)) == 1)
        self.db.delete(obj)
        self.assert_(len(self.db.get_task_info(id = obj.id)) == 0)


if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,56 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', '..', 'src'))

def suite():
    tests = ['testInstall', 'testDriver']
    return unittest.TestSuite(map(DriverTest, tests))

from DBTest import DBTest
from SpiffWorkflow.Server import Driver, WorkflowInfo, TaskInfo
from SpiffWorkflow.Server.Exceptions import WorkflowServerException

class DriverTest(DBTest):
    def setUp(self):
        self.connectDB()
        self.driver = Driver(self.engine)


    def testInstall(self):
        self.assert_(self.driver.uninstall())
        self.assert_(self.driver.install())
        self.assert_(self.driver.uninstall())
        self.assert_(self.driver.install())


    def testDriver(self):
        self.assert_(self.driver is not None)

        # Create a workflow.
        file = os.path.join(os.path.dirname(__file__), 'parallel_split.xml')
        workflow_info = WorkflowInfo('my/workflow', file = file)
        self.assertRaises(WorkflowServerException,
                          self.driver.create_job,
                          workflow_info)
        self.driver.save_workflow_info(workflow_info)
        self.assert_(workflow_info.id >= 0)

        # Instantiate the workflow.
        job_info = self.driver.create_job(workflow_info)
        self.assert_(job_info.id >= 0)

        # Retrieve a list of tasks.
        task_info_list = self.driver.get_task_info(job_id = job_info.id)
        self.assert_(len(task_info_list) == 10)
        task_info_list = self.driver.get_task_info(job_id = job_info.id,
                                                   status = TaskInfo.WAITING)
        self.assert_(len(task_info_list) == 1)

        # Execute a few tasks.
        self.driver.execute_task(task_info_list[0])
        task_info_list = self.driver.get_task_info(job_id = job_info.id,
                                                   status = TaskInfo.WAITING)
        self.assert_(len(task_info_list) == 1)


if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,15 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', '..', 'src'))

def suite():
    tests = ['testJobInfo']
    return unittest.TestSuite(map(JobInfoTest, tests))

from SpiffWorkflow.Server import JobInfo

class JobInfoTest(unittest.TestCase):
    def testJobInfo(self):
        info = JobInfo()

if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,15 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', '..', 'src'))

def suite():
    tests = ['testTaskInfo']
    return unittest.TestSuite(map(TaskInfoTest, tests))

from SpiffWorkflow.Server import TaskInfo

class TaskInfoTest(unittest.TestCase):
    def testTaskInfo(self):
        info = TaskInfo()

if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,16 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', '..', 'src'))

def suite():
    tests = ['testWorkflowInfo']
    return unittest.TestSuite(map(WorkflowInfoTest, tests))

from SpiffWorkflow.Server import WorkflowInfo

class WorkflowInfoTest(unittest.TestCase):
    def testWorkflowInfo(self):
        info = WorkflowInfo('my/handle')
        self.assert_(info.handle == 'my/handle')

if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,30 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 2 (Parallel Split)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <successor>first</successor>
  </start-task>

  <task name="first">
    <successor>task_f1</successor>
    <successor>task_f2</successor>
    <successor>task_f3</successor>
  </task>

  <task name="task_f1">
    <successor>last</successor>
  </task>
  <task name="task_f2">
    <successor>last</successor>
  </task>
  <task name="task_f3">
    <successor>last</successor>
  </task>

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,28 @@
#!/usr/bin/python
import os, sys, unittest

modules = ['DBTest',
           'DriverTest',
           'JobInfoTest',
           'TaskInfoTest',
           'WorkflowInfoTest']

# Parse CLI options.
if len(sys.argv) == 1:
    verbosity = 2
elif len(sys.argv) == 2:
    verbosity = int(sys.argv[1])
else:
    print 'Syntax:', sys.argv[0], '[verbosity]'
    print 'Default verbosity is 2'
    sys.exit(1)

# Load all test suites.
all_suites = []
for name in modules:
    module = __import__(name, globals(), locals(), [''])
    all_suites.append(module.suite())

# Run.
suite = unittest.TestSuite(all_suites)
unittest.TextTestRunner(verbosity = verbosity).run(suite)
@ -0,0 +1,5 @@
[database]
host: localhost
db_name: my_db_name
user: my_db_user
password: my_pwd
@ -0,0 +1,71 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', 'src'))

def suite():
    tests = ['testTree']
    return unittest.TestSuite(map(TaskInstanceTest, tests))

from SpiffWorkflow import Workflow, TaskInstance
from SpiffWorkflow.Tasks import Task
from SpiffWorkflow.Exception import WorkflowException

class TaskInstanceTest(unittest.TestCase):
    def setUp(self):
        pass


    def testTree(self):
        # Build a tree.
        wf = Workflow()
        task1 = Task(wf, 'Task 1')
        task2 = Task(wf, 'Task 2')
        task3 = Task(wf, 'Task 3')
        task4 = Task(wf, 'Task 4')
        task5 = Task(wf, 'Task 5')
        task6 = Task(wf, 'Task 6')
        task7 = Task(wf, 'Task 7')
        task8 = Task(wf, 'Task 8')
        task9 = Task(wf, 'Task 9')
        root = TaskInstance(object, task1)
        c1 = root._add_child(task2)
        c11 = c1._add_child(task3)
        c111 = c11._add_child(task4)
        c1111 = TaskInstance(object, task5, c111)
        c112 = TaskInstance(object, task6, c11)
        c12 = TaskInstance(object, task7, c1)
        c2 = TaskInstance(object, task8, root)
        c3 = TaskInstance(object, task9, root)
        c3.state = TaskInstance.COMPLETED

        # Check whether the tree is built properly.
        expected = """1/0: TaskInstance of Task 1 State: FUTURE Children: 3
2/0: TaskInstance of Task 2 State: FUTURE Children: 2
3/0: TaskInstance of Task 3 State: FUTURE Children: 2
4/0: TaskInstance of Task 4 State: FUTURE Children: 1
5/0: TaskInstance of Task 5 State: FUTURE Children: 0
6/0: TaskInstance of Task 6 State: FUTURE Children: 0
7/0: TaskInstance of Task 7 State: FUTURE Children: 0
8/0: TaskInstance of Task 8 State: FUTURE Children: 0
9/0: TaskInstance of Task 9 State: COMPLETED Children: 0"""
        self.assert_(expected == root.get_dump(),
                     'Expected:\n' + repr(expected) + '\n' + \
                     'but got:\n' + repr(root.get_dump()))

        # Now remove one line from the expected output for testing the
        # filtered iterator.
        expected2 = ''
        for line in expected.split('\n'):
            if line.find('Task 9') >= 0:
                continue
            expected2 += line.lstrip() + '\n'

        # Run the iterator test.
        result = ''
        for node in TaskInstance.Iterator(root, TaskInstance.FUTURE):
            result += node.get_dump(0, False) + '\n'
        self.assert_(expected2 == result,
                     'Expected:\n' + expected2 + '\n' + \
                     'but got:\n' + result)

if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,264 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', 'src'))

def suite():
    tests = ['testWorkflow']
    return unittest.TestSuite(map(WorkflowTest, tests))

from SpiffWorkflow import Workflow, Job
from SpiffWorkflow.Tasks import *
from SpiffWorkflow.Operators import *


def append_step(path, task, signal_name):
    path.append((task._get_depth(), task.get_name()))
    #print task._get_depth(), '.', signal_name, task.get_name(), len(path)


def on_reached_cb(job, task, taken_path):
    append_step(taken_path, task, 'reached')
    reached_key = '%s_reached' % task.get_name()
    n_reached = task.get_attribute(reached_key, 0) + 1
    step = task.get_attribute('step', 1) + 1
    task.set_attribute(**{reached_key: n_reached})
    task.set_attribute(two = 2)
    task.set_attribute(three = 3)
    task.set_attribute(step = step)
    task.set_attribute(test_attribute1 = 'false')
    task.set_attribute(test_attribute2 = 'true')


def on_complete_cb(job, task, taken_path):
    append_step(taken_path, task, 'completed')
    return True


def format_path(path):
    '''
    Format a path for printing.

    path -- list containing tuples.
    '''
    string = ''
    for i, (depth, name) in enumerate(path):
        string += '%2s.%sBranch %s: %s\n' % (i + 1, ' '*depth, depth, name)
    return string


def assert_same_path(test, expected_path, taken_path):
    expected = format_path(expected_path)
    taken = format_path(taken_path)
    error = 'Expected:\n'
    error += '%s\n' % expected
    error += 'but got:\n'
    error += '%s\n' % taken

    # Check whether the correct route was taken.
    for i, (depth, name) in enumerate(expected_path):
        test.assert_(i < len(taken_path), error)
        msg = 'At step %s:' % (i + 1)
        test.assert_(name == taken_path[i][1], msg + '\n' + error)

    test.assert_(expected == taken, error)


class WorkflowTest(unittest.TestCase):
    '''
    WARNING: Make sure to keep this test in sync with XmlReaderTest! Any
    change will break both tests!
    '''
    def setUp(self):
        self.wf = Workflow()
        self.expected_path = [( 1, 'Start'),
                              ( 2, 'task_a1'),
                              ( 3, 'task_a2'),
                              ( 2, 'task_b1'),
                              ( 3, 'task_b2'),
                              ( 4, 'synch_1'),
                              ( 5, 'excl_choice_1'),
                              ( 6, 'task_c1'),
                              ( 7, 'excl_choice_2'),
                              ( 8, 'task_d3'),
                              ( 9, 'multi_choice_1'),
                              (10, 'task_e1'),
                              (10, 'task_e3'),
                              (11, 'struct_synch_merge_1'),
                              (12, 'task_f1'),
                              (13, 'struct_discriminator_1'),
                              (14, 'excl_choice_3'),
                              (15, 'excl_choice_1'),
                              (16, 'task_c1'),
                              (17, 'excl_choice_2'),
                              (18, 'task_d3'),
                              (19, 'multi_choice_1'),
                              (20, 'task_e1'),
                              (20, 'task_e3'),
                              (21, 'struct_synch_merge_1'),
                              (22, 'task_f1'),
                              (23, 'struct_discriminator_1'),
                              (24, 'excl_choice_3'),
                              (25, 'multi_instance_1'),
                              (26, 'task_g1'),
                              (26, 'task_g2'),
                              (26, 'task_g1'),
                              (26, 'task_g2'),
                              (26, 'task_g1'),
                              (26, 'task_g2'),
                              (27, 'struct_synch_merge_2'),
                              (28, 'last'),
                              (29, 'End'),
                              (22, 'task_f2'),
                              (22, 'task_f3'),
                              (12, 'task_f2'),
                              (12, 'task_f3')]


    def testWorkflow(self):
        # Build one branch.
        a1 = Task(self.wf, 'task_a1')
        self.wf.start.connect(a1)

        a2 = Task(self.wf, 'task_a2')
        a1.connect(a2)

        # Build another branch.
        b1 = Task(self.wf, 'task_b1')
        self.wf.start.connect(b1)

        b2 = Task(self.wf, 'task_b2')
        b1.connect(b2)

        # Merge both branches (synchronized).
        synch_1 = Join(self.wf, 'synch_1')
        a2.connect(synch_1)
        b2.connect(synch_1)

        # If-condition that does not match.
        excl_choice_1 = ExclusiveChoice(self.wf, 'excl_choice_1')
        synch_1.connect(excl_choice_1)

        c1 = Task(self.wf, 'task_c1')
        excl_choice_1.connect(c1)

        c2 = Task(self.wf, 'task_c2')
        cond = Equal(Attrib('test_attribute1'), Attrib('test_attribute2'))
        excl_choice_1.connect_if(cond, c2)

        c3 = Task(self.wf, 'task_c3')
        excl_choice_1.connect_if(cond, c3)

        # If-condition that matches.
        excl_choice_2 = ExclusiveChoice(self.wf, 'excl_choice_2')
        c1.connect(excl_choice_2)
        c2.connect(excl_choice_2)
        c3.connect(excl_choice_2)

        d1 = Task(self.wf, 'task_d1')
        excl_choice_2.connect(d1)

        d2 = Task(self.wf, 'task_d2')
        excl_choice_2.connect_if(cond, d2)

        d3 = Task(self.wf, 'task_d3')
        cond = Equal(Attrib('test_attribute1'), Attrib('test_attribute1'))
        excl_choice_2.connect_if(cond, d3)

        # If-condition that does not match.
        multichoice = MultiChoice(self.wf, 'multi_choice_1')
        d1.connect(multichoice)
        d2.connect(multichoice)
        d3.connect(multichoice)

        e1 = Task(self.wf, 'task_e1')
        multichoice.connect_if(cond, e1)

        e2 = Task(self.wf, 'task_e2')
        cond = Equal(Attrib('test_attribute1'), Attrib('test_attribute2'))
        multichoice.connect_if(cond, e2)

        e3 = Task(self.wf, 'task_e3')
        cond = Equal(Attrib('test_attribute2'), Attrib('test_attribute2'))
        multichoice.connect_if(cond, e3)

        # StructuredSynchronizingMerge
        syncmerge = Join(self.wf, 'struct_synch_merge_1', 'multi_choice_1')
        e1.connect(syncmerge)
        e2.connect(syncmerge)
        e3.connect(syncmerge)

        # Implicit parallel split.
        f1 = Task(self.wf, 'task_f1')
        syncmerge.connect(f1)

        f2 = Task(self.wf, 'task_f2')
        syncmerge.connect(f2)

        f3 = Task(self.wf, 'task_f3')
        syncmerge.connect(f3)

        # Discriminator
        discrim_1 = Join(self.wf,
                         'struct_discriminator_1',
                         'struct_synch_merge_1',
                         threshold = 1)
        f1.connect(discrim_1)
        f2.connect(discrim_1)
        f3.connect(discrim_1)

        # Loop back to the first exclusive choice.
        excl_choice_3 = ExclusiveChoice(self.wf, 'excl_choice_3')
        discrim_1.connect(excl_choice_3)
        cond = NotEqual(Attrib('excl_choice_3_reached'), Attrib('two'))
        excl_choice_3.connect_if(cond, excl_choice_1)

        # Split into 3 branches, and implicitly split twice in addition.
        multi_instance_1 = MultiInstance(self.wf, 'multi_instance_1', times = 3)
        excl_choice_3.connect(multi_instance_1)

        # Parallel tasks.
        g1 = Task(self.wf, 'task_g1')
        g2 = Task(self.wf, 'task_g2')
        multi_instance_1.connect(g1)
        multi_instance_1.connect(g2)

        # StructuredSynchronizingMerge
        syncmerge2 = Join(self.wf, 'struct_synch_merge_2', 'multi_instance_1')
        g1.connect(syncmerge2)
        g2.connect(syncmerge2)

        # Add a final task.
        last = Task(self.wf, 'last')
        syncmerge2.connect(last)

        # Add another final task :-).
        end = Task(self.wf, 'End')
        last.connect(end)

        self.runWorkflow(self.wf)


    def runWorkflow(self, wf):
        taken_path = {'reached': [],
                      'completed': []}
        for name, task in wf.tasks.iteritems():
            task.signal_connect('reached', on_reached_cb, taken_path['reached'])
            task.signal_connect('completed', on_complete_cb, taken_path['completed'])

        # Execute all tasks within the Job.
        job = Job(wf)
        self.assert_(not job.is_completed(), 'Job is complete before start')
        try:
            job.complete_all()
        except:
            job.dump()
            raise

        self.assert_(job.is_completed(),
                     'complete_all() returned, but job is not complete\n'
                     + job.task_tree.get_dump())
        #job.task_tree.dump()

        assert_same_path(self, self.expected_path, taken_path['completed'])

if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,52 @@
import sys, unittest, re, os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', '..', 'src'))

def suite():
    tests = ['testParseString', 'testParseFile', 'testRunWorkflow']
    return unittest.TestSuite(map(XmlReaderTest, tests))

from WorkflowTest import WorkflowTest
from SpiffWorkflow.Storage import XmlReader
from xml.parsers.expat import ExpatError

class XmlReaderTest(WorkflowTest):
    def setUp(self):
        WorkflowTest.setUp(self)
        self.reader = XmlReader()


    def testParseString(self):
        self.assertRaises(ExpatError,
                          self.reader.parse_string,
                          '')
        self.reader.parse_string('<xml></xml>')


    def testParseFile(self):
        # File not found.
        self.assertRaises(IOError,
                          self.reader.parse_file,
                          'foo')

        # 0 byte sized file.
        file = os.path.join(os.path.dirname(__file__), 'xml', 'empty1.xml')
        self.assertRaises(ExpatError, self.reader.parse_file, file)

        # File containing only "<xml></xml>".
        file = os.path.join(os.path.dirname(__file__), 'xml', 'empty2.xml')
        self.reader.parse_file(file)

        # Read a complete workflow.
        file = os.path.join(os.path.dirname(__file__), 'xml', 'spiff', 'workflow1.xml')
        self.reader.parse_file(file)


    def testRunWorkflow(self):
        file = os.path.join(os.path.dirname(__file__), 'xml', 'spiff', 'workflow1.xml')
        workflow_list = self.reader.parse_file(file)
        for wf in workflow_list:
            self.runWorkflow(wf)


if __name__ == '__main__':
    unittest.TextTestRunner(verbosity = 2).run(suite())
@ -0,0 +1,29 @@
#!/usr/bin/python
import os, sys, unittest

modules = ['TaskInstanceTest',
           'OpenWfeXmlReaderTest',
           'PatternTest',
           'PersistenceTest',
           'WorkflowTest',
           'XmlReaderTest']

# Parse CLI options.
if len(sys.argv) == 1:
    verbosity = 2
elif len(sys.argv) == 2:
    verbosity = int(sys.argv[1])
else:
    print 'Syntax:', sys.argv[0], '[verbosity]'
    print 'Default verbosity is 2'
    sys.exit(1)

# Load all test suites.
all_suites = []
for name in modules:
    module = __import__(name, globals(), locals(), [''])
    all_suites.append(module.suite())

# Run.
suite = unittest.TestSuite(all_suites)
unittest.TextTestRunner(verbosity = verbosity).run(suite)
@ -0,0 +1 @@
<xml></xml>
@ -0,0 +1,27 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.6">
  <description>
    A test workflow that contains all possible tasks.
  </description>
  <concurrence name="concurrence_1">
    <sequence name="sequence_1">
      <task name="task_a1" />
      <task name="task_a2" />
      <if name="if_condition_1">
        <equals field-value="test_attribute1" other-value="test_attribute1" />
        <task name="task_a3" />
      </if>
      <if name="if_condition_2">
        <equals field-value="test_attribute1" other-value="test_attribute2" />
        <task name="task_a4" />
        <task name="task_a5" />
      </if>
    </sequence>
    <sequence name="sequence_2">
      <task name="task_b1" />
      <task name="task_b2" />
    </sequence>
  </concurrence>
  <task name="task_c1" />
  <task name="task_c2" />
</process-definition>
@ -0,0 +1,49 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 37 (Acyclic Synchronizing Merge)</description>

  <start-task>
    <successor>first</successor>
  </start-task>

  <!-- Start with an implicit simple split. -->
  <task name="first">
    <successor>task_f1</successor>
    <successor>task_f2</successor>
    <successor>task_f3</successor>
  </task>

  <!-- Implicit split. -->
  <task name="task_f1">
    <successor>join</successor>
  </task>
  <task name="task_f2">
    <successor>join</successor>
  </task>
  <task name="task_f3">
    <successor>excl_choice_1</successor>
  </task>

  <!-- Choose a path to a stub task. -->
  <exclusive-choice name="excl_choice_1">
    <default-successor>task_g1</default-successor>
    <conditional-successor>
      <equals left-value="1" right-value="1" />
      <successor>task_g2</successor>
    </conditional-successor>
  </exclusive-choice>

  <task name="task_g1">
    <successor>join</successor>
  </task>

  <task name="task_g2">
    <successor>foo</successor>
  </task>
  <task name="foo" />

  <!-- Structured synchronizing merge. -->
  <join name="join" context="first">
    <successor>end</successor>
  </join>
</process-definition>
@ -0,0 +1,10 @@
Start
first
task_f1
task_f2
task_f3
excl_choice_1
join
End
task_g2
foo
@ -0,0 +1,38 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.6">
  <description>Pattern 10 (Arbitrary Cycles)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <pre-assign name="repeat" value="1" />
    <successor>first</successor>
  </start-task>

  <!-- Merge branches -->
  <task name="first">
    <successor>excl_choice_1</successor>
  </task>

  <!-- Add an if-condition that matches once. -->
  <exclusive-choice name="excl_choice_1">
    <default-successor>task_c1</default-successor>
    <conditional-successor>
      <equals left-field="repeat" right-value="1" />
      <successor>go_to_repetition</successor>
    </conditional-successor>
  </exclusive-choice>

  <!-- Conditional tasks. -->
  <task name="go_to_repetition">
    <pre-assign name="repeat" value="0" />
    <successor>first</successor>
  </task>
  <task name="task_c1">
    <successor>last</successor>
  </task>

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,9 @@
Start
first
excl_choice_1
go_to_repetition
first
excl_choice_1
task_c1
last
End
@ -0,0 +1,45 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 28 (Blocking Discriminator)</description>

  <start-task>
    <successor>first</successor>
  </start-task>

  <!-- Start with an implicit simple split. -->
  <task name="first">
    <successor>task_f1</successor>
    <successor>task_f2</successor>
    <successor>task_f3</successor>
  </task>

  <!-- Implicit split. -->
  <task name="task_f1">
    <successor>struct_discriminator_1</successor>
  </task>
  <task name="task_f2">
    <successor>struct_discriminator_1</successor>
  </task>
  <task name="task_f3">
    <successor>struct_discriminator_1</successor>
  </task>

  <!-- Structured discriminator. -->
  <join name="struct_discriminator_1" context="first" threshold="1">
    <successor>excl_choice_1</successor>
  </join>

  <!-- Loop back to the start (once). -->
  <exclusive-choice name="excl_choice_1">
    <default-successor>last</default-successor>
    <conditional-successor>
      <not-equals left-field="excl_choice_1_reached" right-value="2" />
      <successor>first</successor>
    </conditional-successor>
  </exclusive-choice>

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,15 @@
Start
first
task_f1
struct_discriminator_1
excl_choice_1
first
task_f1
struct_discriminator_1
excl_choice_1
last
End
task_f2
task_f3
task_f2
task_f3
@ -0,0 +1,61 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 31 (Blocking Partial Join)</description>

  <start-task>
    <successor>multi_choice_1</successor>
  </start-task>

  <!-- Split branches using a multi-choice. This creates 3 branches. -->
  <multi-choice name="multi_choice_1">
    <conditional-successor>
      <equals left-field="test_attribute1" right-field="test_attribute1" />
      <successor>task_e1</successor>
    </conditional-successor>
    <conditional-successor>
      <equals left-field="test_attribute1" right-field="test_attribute2" />
      <successor>task_e2</successor>
    </conditional-successor>
    <conditional-successor>
      <equals left-field="test_attribute2" right-field="test_attribute2" />
      <successor>task_e3</successor>
    </conditional-successor>
    <conditional-successor>
      <equals left-field="test_attribute2" right-field="test_attribute2" />
      <successor>task_e4</successor>
    </conditional-successor>
  </multi-choice>

  <!-- Conditional branches. -->
  <task name="task_e1">
    <successor>struct_synch_merge_1</successor>
  </task>
  <task name="task_e2">
    <successor>struct_synch_merge_1</successor>
  </task>
  <task name="task_e3">
    <successor>struct_synch_merge_1</successor>
  </task>
  <task name="task_e4">
    <successor>struct_synch_merge_1</successor>
  </task>

  <!-- Structured synchronizing merge. -->
  <join name="struct_synch_merge_1" context="multi_choice_1" threshold="2">
    <successor>excl_choice_1</successor>
  </join>

  <!-- Loop back to the start (once). -->
  <exclusive-choice name="excl_choice_1">
    <default-successor>last</default-successor>
    <conditional-successor>
      <not-equals left-field="excl_choice_1_reached" right-value="2" />
      <successor>multi_choice_1</successor>
    </conditional-successor>
  </exclusive-choice>

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,15 @@
Start
multi_choice_1
task_e1
task_e3
struct_synch_merge_1
excl_choice_1
multi_choice_1
task_e1
task_e3
struct_synch_merge_1
excl_choice_1
last
End
task_e4
task_e4
@ -0,0 +1,41 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 20 (Cancel Job)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <successor>one1</successor>
    <successor>two1</successor>
  </start-task>

  <!-- Branch 1 -->
  <task name="one1">
    <successor>one2</successor>
  </task>
  <task name="one2">
    <successor>cancel</successor>
  </task>
  <cancel-job name="cancel" />
  <!-- End branch 1 -->

  <!-- Branch 2 -->
  <task name="two1">
    <successor>two2a</successor>
    <successor>two2b</successor>
  </task>
  <task name="two2a">
    <successor>two3</successor>
  </task>
  <task name="two2b">
    <successor>two3</successor>
  </task>
  <task name="two3">
    <successor>last</successor>
  </task>
  <!-- End branch 2 -->

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,4 @@
Start
one1
one2
cancel
@ -0,0 +1,59 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 15 (Multiple Instances without a priori Run-Time Knowledge)</description>

  <start-task>
    <successor>first</successor>
  </start-task>

  <!-- Start with an implicit simple split. -->
  <task name="first">
    <successor>add_instance_1</successor>
    <successor>multi_instance_1</successor>
    <successor>cancel_multi_instance_1</successor>
  </task>

  <!-- Branch 1 -->
  <trigger name="add_instance_1" context="multi_instance_1">
    <successor>join</successor>
  </trigger>
  <!-- End branch 1 -->

  <!-- Branch 2 -->
  <!-- Split into 2 branches, and implicitly split twice in addition. -->
  <multi-instance name="multi_instance_1" times="2">
    <successor>task_g1</successor>
    <successor>task_g2</successor>
  </multi-instance>

  <!-- Parallel tasks. -->
  <task name="task_g1">
    <successor>struct_synch_merge_1</successor>
  </task>
  <task name="task_g2">
    <successor>struct_synch_merge_1</successor>
  </task>

  <!-- Structured synchronizing merge. -->
  <join name="struct_synch_merge_1" context="multi_instance_1" threshold="8">
    <successor>join</successor>
  </join>
  <!-- End branch 2 -->

  <!-- Branch 3 -->
  <cancel-task name="cancel_multi_instance_1">
    <cancel>multi_instance_1</cancel>
    <successor>join</successor>
  </cancel-task>
  <!-- End branch 3 -->

  <!-- Join all branches. -->
  <join name="join" context="first">
    <successor>last</successor>
  </join>

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,14 @@
Start
first
add_instance_1
multi_instance_1
task_g1
task_g2
task_g1
task_g2
task_g1
task_g2
cancel_multi_instance_1
join
last
End
@ -0,0 +1,48 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 19 (Cancel Task)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <successor>one1</successor>
    <successor>two1</successor>
  </start-task>

  <!-- Branch 1 -->
  <task name="one1">
    <successor>one2</successor>
  </task>
  <task name="one2">
    <successor>cancel</successor>
  </task>
  <cancel-task name="cancel">
    <cancel>two2a</cancel>
    <cancel>two4</cancel>
    <successor>last</successor>
  </cancel-task>
  <!-- End branch 1 -->

  <!-- Branch 2 -->
  <task name="two1">
    <successor>two2a</successor>
    <successor>two2b</successor>
  </task>
  <task name="two2a">
    <successor>two3</successor>
  </task>
  <task name="two2b">
    <successor>two3</successor>
  </task>
  <task name="two3">
    <successor>two4</successor>
  </task>
  <task name="two4">
    <successor>last</successor>
  </task>
  <!-- End branch 2 -->

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,9 @@
Start
one1
one2
cancel
last
End
two1
two2b
two3
@ -0,0 +1,44 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 25 (Cancel Region)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <successor>one1</successor>
    <successor>two1</successor>
  </start-task>

  <!-- Branch 1 -->
  <task name="one1">
    <successor>one2</successor>
  </task>
  <task name="one2">
    <successor>cancel</successor>
  </task>
  <cancel-task name="cancel">
    <cancel>two2a</cancel>
    <successor>last</successor>
  </cancel-task>
  <!-- End branch 1 -->

  <!-- Branch 2 -->
  <task name="two1">
    <successor>two2a</successor>
    <successor>two2b</successor>
  </task>
  <task name="two2a">
    <successor>two3</successor>
  </task>
  <task name="two2b">
    <successor>two3</successor>
  </task>
  <task name="two3">
    <successor>last</successor>
  </task>
  <!-- End branch 2 -->

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,11 @@
Start
one1
one2
cancel
last
End
two1
two2b
two3
last
End
@ -0,0 +1,45 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 29 (Cancelling Discriminator)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <successor>first</successor>
  </start-task>

  <task name="first">
    <successor>task_f1</successor>
    <successor>task_f2</successor>
    <successor>task_f3</successor>
  </task>

  <!-- Implicit split. -->
  <task name="task_f1">
    <successor>struct_discriminator_1</successor>
  </task>
  <task name="task_f2">
    <successor>struct_discriminator_1</successor>
  </task>
  <task name="task_f3">
    <successor>struct_discriminator_1</successor>
  </task>

  <!-- Structured discriminator. -->
  <join name="struct_discriminator_1" context="first" cancel="1" threshold="1">
    <successor>excl_choice_1</successor>
  </join>

  <!-- Loop back to the start (once). -->
  <exclusive-choice name="excl_choice_1">
    <default-successor>last</default-successor>
    <conditional-successor>
      <not-equals left-field="excl_choice_1_reached" right-value="2" />
      <successor>first</successor>
    </conditional-successor>
  </exclusive-choice>

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,11 @@
Start
first
task_f1
struct_discriminator_1
excl_choice_1
first
task_f1
struct_discriminator_1
excl_choice_1
last
End
@ -0,0 +1,61 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 32 (Cancelling Partial Join)</description>

  <start-task>
    <successor>multi_choice_1</successor>
  </start-task>

  <!-- Split branches using a multi-choice. This creates 3 branches. -->
  <multi-choice name="multi_choice_1">
    <conditional-successor>
      <equals left-field="test_attribute1" right-field="test_attribute1" />
      <successor>task_e1</successor>
    </conditional-successor>
    <conditional-successor>
      <equals left-field="test_attribute1" right-field="test_attribute2" />
      <successor>task_e2</successor>
    </conditional-successor>
    <conditional-successor>
      <equals left-field="test_attribute2" right-field="test_attribute2" />
      <successor>task_e3</successor>
    </conditional-successor>
    <conditional-successor>
      <equals left-field="test_attribute2" right-field="test_attribute2" />
      <successor>task_e4</successor>
    </conditional-successor>
  </multi-choice>

  <!-- Conditional branches. -->
  <task name="task_e1">
    <successor>struct_synch_merge_1</successor>
  </task>
  <task name="task_e2">
    <successor>struct_synch_merge_1</successor>
  </task>
  <task name="task_e3">
    <successor>struct_synch_merge_1</successor>
  </task>
  <task name="task_e4">
    <successor>struct_synch_merge_1</successor>
  </task>

  <!-- Structured synchronizing merge. -->
  <join name="struct_synch_merge_1" context="multi_choice_1" threshold="2" cancel="1">
    <successor>excl_choice_1</successor>
  </join>

  <!-- Loop back to the start (once). -->
  <exclusive-choice name="excl_choice_1">
    <default-successor>last</default-successor>
    <conditional-successor>
      <not-equals left-field="excl_choice_1_reached" right-value="2" />
      <successor>multi_choice_1</successor>
    </conditional-successor>
  </exclusive-choice>

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,13 @@
Start
multi_choice_1
task_e1
task_e3
struct_synch_merge_1
excl_choice_1
multi_choice_1
task_e1
task_e3
struct_synch_merge_1
excl_choice_1
last
End
@ -0,0 +1,47 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 35 (Cancelling Partial Join for Multiple Instances)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <successor>add_instance_1</successor>
    <successor>multi_instance_1</successor>
  </start-task>

  <!-- Branch 2 -->
  <!-- Split into 2 branches, and implicitly split twice in addition. -->
  <multi-instance name="multi_instance_1" times="2">
    <successor>task_g1</successor>
    <successor>task_g2</successor>
  </multi-instance>

  <!-- Parallel tasks. -->
  <task name="task_g1">
    <successor>join_1</successor>
  </task>
  <task name="task_g2">
    <successor>join_1</successor>
  </task>

  <!-- Merge instances. -->
  <join name="join_1" context="multi_instance_1" threshold="4" cancel="1">
    <successor>join_2</successor>
  </join>
  <!-- End branch 2 -->

  <!-- Branch 1 -->
  <trigger name="add_instance_1" context="multi_instance_1">
    <successor>join_2</successor>
  </trigger>
  <!-- End branch 1 -->

  <!-- Join all branches. -->
  <join name="join_2">
    <successor>last</successor>
  </join>

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,11 @@
Start
add_instance_1
multi_instance_1
task_g1
task_g2
task_g1
task_g2
join_1
join_2
last
End
@ -0,0 +1,38 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 27 (Complete Multiple Instance Task)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <successor>multi_instance_1</successor>
    <successor>trigger_join</successor>
  </start-task>

  <!-- Branch 1 -->
  <!-- Create 3 branches, such that the join can never complete. -->
  <multi-instance name="multi_instance_1" times="3">
    <successor>task_g1</successor>
  </multi-instance>

  <task name="task_g1">
    <successor>join_1</successor>
  </task>

  <join name="join_1" context="multi_instance_1" threshold="5">
    <successor>join_2</successor>
  </join>
  <!-- End branch 1 -->

  <!-- Branch 2 -->
  <trigger name="trigger_join" context="join_1">
    <successor>join_2</successor>
  </trigger>
  <!-- End branch 2 -->

  <join name="join_2">
    <successor>last</successor>
  </join>
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>
@ -0,0 +1,10 @@
Start
multi_instance_1
task_g1
task_g1
task_g1
trigger_join
join_1
join_2
last
End
@ -0,0 +1,50 @@
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.0">
  <description>Pattern 39 (Critical Section)</description>

  <!-- Start with an implicit simple split. -->
  <start-task>
    <successor>one_1</successor>
    <successor>two_1</successor>
  </start-task>

  <!-- Branch 1 -->
  <task name="one_1">
    <lock>lock_one</lock>
    <successor>one_2</successor>
  </task>
  <acquire-mutex name="one_2" mutex="my_global_mutex">
    <lock>lock_two</lock>
    <successor>one_3</successor>
  </acquire-mutex>
  <task name="one_3">
    <lock>lock_three</lock>
    <successor>two_2</successor>
    <successor>one_4</successor>
  </task>
  <release-mutex name="one_4" mutex="my_global_mutex">
    <lock>lock_four</lock>
    <successor>last</successor>
  </release-mutex>
  <!-- End branch 1 -->

  <!-- Branch 2 -->
  <task name="two_1">
    <successor>two_2</successor>
  </task>
  <acquire-mutex name="two_2" mutex="my_global_mutex">
    <successor>two_3</successor>
  </acquire-mutex>
  <task name="two_3">
    <successor>two_4</successor>
  </task>
  <release-mutex name="two_4" mutex="my_global_mutex">
    <successor>last</successor>
  </release-mutex>
  <!-- End branch 2 -->

  <!-- A final task. -->
  <task name="last">
    <successor>end</successor>
  </task>
</process-definition>