Server UI has changed since CloverDX 5.0
This section of the guide was built with CloverETL 4.1, which looks different from current versions.
As of version 5.0, released in October 2018, CloverETL was renamed to CloverDX and the Server module has changed significantly. Please bear with us as we update this guide to reflect the UI of the new Server. You can still use this lesson, as most of the concepts taught here have not changed; we just ask you to be patient when getting familiar with the new Server look & feel.
In this lesson you will
- Build a job that will wait for the arrival of transaction files
- Process each file with the graph developed in lesson 4
- Categorize and archive the processed files
- Keep a status log for each file
Prerequisites
- Access to CloverDX Server (ask for a trial)
What is a jobflow? 0:44
CloverDX jobflows let you design a layer of logic above the record-level processing done by transformation graphs. In a jobflow, executing a graph is just one of several steps that you can arrange in a sequence or in branches based on conditions. You can add file operations, dynamically choose which graphs to execute, and track and record progress.
Example jobflow
In the example below, the jobflow first gets the filename of a freshly arrived file (we’ll talk about how later), then loads the file contents into a database using a graph (Load to DB), then moves the file to an archive folder, and finally collects execution status results into a text file.
Detour: Introducing Parameters 3:17
Parameters let you define placeholders instead of hard-coding actual values. You can use parameters almost anywhere: in any configuration attribute of a component, inside CTL code, and so on.
In our example, we’ll create a parameter and use it as the File URL attribute of a UniversalDataReader.
Once you have created the parameter INPUTFILE, you can reference it in any component attribute. In our case, we set
File URL to ${INPUTFILE}
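If you later need the same value inside CTL code, you can read it with the getParamValue() function instead. A minimal sketch (the transform and the sourceFile field are hypothetical, just for illustration):

    // CTL2: resolve the INPUTFILE parameter inside a transformation
    function integer transform() {
        $out.0.sourceFile = getParamValue("INPUTFILE"); // same value as ${INPUTFILE}
        return ALL;
    }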
Creating a Jobflow 4:23
Jobflows run on CloverDX Server; however, you use CloverDX Designer to design them and to test them against the Server.
Jobflows look similar to graphs but perform completely different tasks.
Jobflow steps are connected with edges, just like in transformation graphs, but these pass tokens rather than actual data records. A token represents the execution status of a particular step.
Getting runtime parameters (GetJobInput) 6:04
Many times, a graph or a jobflow will be called by some parent (either another jobflow or the Server itself).
In our case, we’ll be setting up a File Event Listener to invoke our jobflow later on.
A parent might want to pass some configuration information to its child using parameters. The GetJobInput component used by the child can read those parameters and pass them into the jobflow as a token.
Mapping in GetJobInput
We’re expecting the calling parent to pass the EVENT_FILE_NAME parameter to our jobflow. Notice we’re using the getParamValue() function to get the parameter value and assign it to the filename field of the token.
Note that InputFileInfo is a token that we’ve created manually.
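As a sketch, the complete GetJobInput mapping could look like this in CTL2, assuming InputFileInfo has a single string field called filename:

    // CTL2 mapping in GetJobInput: read the parameter passed by the
    // parent and store it in the outgoing InputFileInfo token
    function integer transform() {
        $out.0.filename = getParamValue("EVENT_FILE_NAME");
        return ALL;
    }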
ExecuteGraph component 7:55
ExecuteGraph runs a transformation graph as a step in a jobflow.
InputMapping in ExecuteGraph
Remember when we created the INPUTFILE parameter in our graph and made it public? It is now visible in ExecuteGraph, and we can use the value from the incoming token to dynamically specify the file our graph should read.
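A minimal sketch of such an input mapping, assuming the public INPUTFILE parameter is exposed as a field of the mapping’s output record:

    // CTL2 input mapping in ExecuteGraph (Load to DB): hand the file name
    // carried by the incoming token over to the graph's INPUTFILE parameter
    function integer transform() {
        $out.0.INPUTFILE = $in.0.filename;
        return ALL;
    }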
OutputMapping in ExecuteGraph
ExecuteGraph provides information and statistics about each run on its output ports. It sends successful executions to its first output port (via Output Mapping) and failed executions to its second port (Error Mapping).
Jobflow components can pass tokens between steps. You can map fields from the incoming token onto the output of the current step.
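For example, an output mapping can carry the incoming token’s fields forward and attach run statistics to them. A sketch, assuming the run status record arrives on the mapping’s second input and the output token has a status field for it:

    // CTL2 output mapping in ExecuteGraph: keep the incoming token's
    // fields and add the graph's execution status
    function integer transform() {
        $out.0.* = $in.0.*;            // fields of the incoming token
        $out.0.status = $in.1.status;  // assumed field of the run status record
        return ALL;
    }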
Performing File Operations in Jobflows 9:20
You can Copy, Move, Delete, Create and List files in jobflows using dedicated file operation components.
There are two ways that you can set the source and target for a file operation:
Option 1: Explicitly set Source and Target file URLs
You can specify the Source and Target file URLs directly, or supply them via a parameter.
However, this configuration makes the paths static for the whole run of the job.
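For illustration, a static setup might look like this (both paths are hypothetical):

    Source file URL:  C:\incoming\transactions.csv
    Target file URL:  C:\archive\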
Option 2: Dynamically set Source and Target file URLs
Using Input Mapping and Output Mapping is another way of setting the Source and Target URLs. This approach lets you set the paths dynamically for each token passing through the file operation component, which is useful, for example, when the jobflow loops over a list of files.
Input Mapping for file operations
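A sketch of such an input mapping, assuming the file operation component exposes sourceURL and targetURL fields in its mapping dialog (the archive path is hypothetical):

    // CTL2 input mapping for MoveFiles: take the source path from the
    // incoming token and move the file into an archive folder
    function integer transform() {
        $out.0.sourceURL = $in.0.filename;
        $out.0.targetURL = "C:/archive/"; // hypothetical target folder
        return ALL;
    }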
Logging tokens to a file 11:30
This section demonstrates a very basic way of logging jobflow tokens into a file.
In order for the MoveFiles (Archive Processed File) component to pass tokens to UniversalDataWriter (Update Execution Log), we need to set its Output Mapping.
To log not only the result of the move operation but also the result of ExecuteGraph (Load to DB), we want to add all the fields of the ExecuteGraph_RunStatus token to the output token as well.
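As a sketch, the output mapping could combine both, assuming the move result arrives on the mapping’s second input as a result field (the moveResult output field is hypothetical):

    // CTL2 output mapping in MoveFiles (Archive Processed File): keep the
    // run-status fields carried by the token and add the move's own outcome
    function integer transform() {
        $out.0.* = $in.0.*;               // incl. ExecuteGraph_RunStatus fields
        $out.0.moveResult = $in.1.result; // assumed field of the Result record
        return ALL;
    }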
The optional second output port of ExecuteGraph contains execution status tokens for failed graph runs.
The second MoveFiles (Archive Failed File) component takes results from the second (error) output port of ExecuteGraph. Since we’ve configured the first MoveFiles component to take its attributes dynamically (see Option 2 in Performing File Operations in Jobflows above), we can simply copy it using Ctrl+C, Ctrl+V and connect the edges.
In order for ExecuteGraph to produce error tokens on the second output port instead of failing the whole jobflow with an error, we need to set Error Mapping on Load to DB (ExecuteGraph).
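A minimal error-mapping sketch, assuming the run status record provides an errMessage field and the output token has an errorMessage field to hold it:

    // CTL2 error mapping in ExecuteGraph (Load to DB): turn a failed run
    // into an error token instead of failing the whole jobflow
    function integer transform() {
        $out.0.* = $in.0.*;                      // pass the original token through
        $out.0.errorMessage = $in.1.errMessage;  // assumed run status field
        return ALL;
    }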
One last thing: now we have two branches (success and error), and we need to gather tokens from both into a single stream that we can write to our log file (in CloverDX, a TokenGather component can merge token streams in this way).
Watching for new incoming files (File event listeners) 15:31
CloverDX Server can listen for events such as files appearing in a watched folder, consume messages from message queues, hook actions onto job events, or even trigger custom events.
In our example, we’ll configure a new file event listener to watch the folder C:\incoming. When a new file arrives, the listener will execute our jobflow and pass the file’s name in the EVENT_FILE_NAME parameter, which our GetJobInput step reads.