We’ve nearly completed a legacy platform conversion project — by this I mean creating a new application using K2 to replace an older one.
One of the final steps in completing this project will be importing the legacy data into a relational database and updating each record with data provided by a workflow process instance.
My challenge was to find an efficient way to run all of these records through the workflow.
Since this was a one-time conversion, I started thinking about how I could “cheaply” invoke the K2 workflow API from something like a desktop application, something I could throw together to get the job done. I thought about building a .NET C# app to handle the job, but then I found that building a PowerShell script took almost no time at all.
I built the importer using two scripts: one to read each record and pass a formatted command to the other script which creates the K2 process instance.
The first script depends on the data being in a comma-delimited format inside of a text file; that’s as easy as running a query in SQL Server Management Studio and right-clicking on the result set to export the data. In the example below, the data is in test-data.txt. The file contains records organized into three columns, with column headers at the top of the file, like this (the values here are made up for illustration):
First_Column,Second_Column,Third_Column
1001,Alpha,Active
1002,Bravo,Inactive
1003,Charlie,Active
You get the idea. Here’s the code:
$DB = Import-Csv test-data.txt
foreach ($Data in $DB)
{
    $First = $Data.First_Column
    $Second = $Data.Second_Column
    $Third = $Data.Third_Column
. . .
The code is creating variables out of the data in each of the columns. Notice how the column names from the file’s header row are referenced as properties, e.g., $Data.First_Column.
The rest of the loop is a call to the second script, passing the variables in the order of the parameters the script expects:
. . .
    .\init.ps1 -first_param $First -second_param $Second -third_param $Third
}
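Assembled into one piece, the first script looks like this (I’m assuming it’s saved as import-data.ps1; that file name is my own, not part of the original setup):

```powershell
# import-data.ps1 -- read the CSV export and hand each record
# to init.ps1, which creates the K2 process instance
$DB = Import-Csv test-data.txt

foreach ($Data in $DB)
{
    $First  = $Data.First_Column
    $Second = $Data.Second_Column
    $Third  = $Data.Third_Column

    # pass the values in the order init.ps1 expects its parameters
    .\init.ps1 -first_param $First -second_param $Second -third_param $Third
}
```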
The second script does all the heavy lifting: it opens the connection, creates the process instance, adds the data into the data fields, kicks off the workflow and closes the connection:
### 1. create some vars
#-- data (as parameters)
Param(
    [Parameter(Mandatory=$True)] [string] $first_param,
    [Parameter(Mandatory=$True)] [string] $second_param,
    [Parameter(Mandatory=$True)] [string] $third_param
)
#-- process instance data - not parameters
$k2processName = "MyWorkflows\WorkflowName"
$k2folioName = $first_param + "_ps1"
Replace the value of the k2processName variable with the actual name of your process. If it’s inside of a folder, include the folder name separated by a backslash, as shown above.
I’m adding “_ps1” to the first parameter data to create my folio name. I’m doing this so I can tell the difference between data already in the system and data I’m introducing via this method. Flavor to taste.
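To make the folio naming concrete, here’s a quick sketch (the sample value 1001 is mine, not from the real data):

```powershell
# hypothetical sample value for the first parameter
$first_param = "1001"
$k2folioName = $first_param + "_ps1"
$k2folioName   # -> 1001_ps1
```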
### 2. add the assembly
#-- the Version and PublicKeyToken must match the K2 client assembly installed on your server
Add-Type -AssemblyName ("SourceCode.Workflow.Client, Version=18.104.22.168, Culture=neutral, PublicKeyToken=16a2c5aaaa1b130d")
### 3. create the connection using the assembly
$conn = New-Object -TypeName SourceCode.Workflow.Client.Connection
### 4. open the connection
$conn.Open("localhost")
All of these steps are fixed; use them as they appear. Notice I’m opening the connection on “localhost”: that’s because the scripts run on the K2 server.
### 5. create the process instance
$pi = $conn.CreateProcessInstance($k2processName)
$pi.Folio = $k2folioName
### 6. add our data into the datafields
$pi.DataFields["Data Field 1"].Value = $first_param
$pi.DataFields["Data Field 2"].Value = $second_param
$pi.DataFields["Data Field 3"].Value = $third_param
### 7. start the process instance
$conn.StartProcessInstance($pi)
### 8. tidy up
$conn.Close()
As you may have guessed, use the actual names of your data fields under step 6.
That’s it! Just like Alanis told you — eight easy steps.
You’ve got a text file with comma-delimited data and two script files: one that parses the data and one that instantiates the workflow.
I don’t yet know how much abuse those scripts can take: whether they could handle 100,000 records, how long that would take, and so forth. But I’ll follow up once I find out!
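When I do, a simple way to get the timing will be something like this (import-data.ps1 is my assumed name for the first script):

```powershell
# time a full run of the importer; the script name is an assumption
$elapsed = Measure-Command { .\import-data.ps1 }
"Processed all records in $($elapsed.TotalSeconds) seconds"
```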