Rule your Vault


We just released a new version of vaultRuler, a powerful tool that applies category rules to all the files in your Vault. The new version additionally allows you to force the lifecycle definition and property assignment for files that are already in the right category. You can also re-apply the rules to all files in your Vault.

You may have just moved from vanilla Vault to Vault Workgroup or Professional, have a huge number of files, all in the category Base, and want to bring them into the appropriate category. Or you already have Vault Workgroup or Professional, you changed some behaviors, and you want to bring existing files into the right shape. In both cases vaultRuler will be a huge help.


This new version allows you to apply the lifecycle definition and property assignment to files that are already in the right category as well. Or you can apply the rules to all files, regardless of category.

So, let’s suppose you’ve worked with Vault for a while and you realize you need a few more properties. No problem: just add the properties in the Vault configuration, assign them to the category you think is appropriate, and all new files will have those properties. But what happens to the existing files? Well, there the properties will only be added once you actually change their values.

The alternative is to use vaultRuler. It goes over all of your files and brings them back in line with the defined rules.

The applications for vaultRuler are manifold, but the usage is always the same: configure the Vault as it best suits you, and then let vaultRuler bring all your files into line with the new settings. The different options of vaultRuler allow you to selectively apply the configured rules. Download vaultRuler now and see for yourself.

Posted in Migration

Tracing the Vault API


We have talked about tracing the Vault API in several blog posts, and Doug Redmond also wrote a great post on how to use Fiddler to watch live which API calls Vault is making. We spent some time getting more familiar with Fiddler and came across FiddlerCore, a .NET assembly you can use in your Visual Studio projects to filter HTTP/HTTPS traffic the way you want. The outcome of our investigation is vapiTrace.

vapiTrace (Vault API trace) is a little application that, like Fiddler, listens to the HTTP traffic and filters out just the Vault calls, removing all the other HTTP noise. It captures each called API command together with its arguments and the response. The result is shown in a simple interface.

Just start vapiTrace, click around in your Vault, and you will see which API functions are called with which arguments and what the response from the Vault server is.

This will help you understand which commands are needed for a specific operation, in which sequence the commands should be called, how the arguments should be filled, and more. It’s a simple but great tool that helps you get more familiar with the Vault API.

Here is an example. Let’s say you’d like to create a folder and link some files to it. Start Vault and navigate to any folder. Start vapiTrace, navigate to the parent folder where you’d like to create your new folder, create the folder in Vault, and then drag & drop your file into the new folder.


You will notice that vapiTrace traces a lot of different commands. This is normal, as Vault is doing several things. If you look at the command list, you will notice two commands:

  • AddFolderWithCategory
  • AddLink

By clicking on a command in vapiTrace, you will see the corresponding arguments and the response from the server. Pretty cool, huh?

IMPORTANT! In order to allow vapiTrace (and the same goes for Fiddler) to track the HTTP traffic, you have to log in to Vault with a valid host name. Localhost does not work; it has to be the actual computer name or IP address.

You will also notice that vapiTrace only tracks the real API calls made from the client to the server. So basically, you’ll see the calls made through the WebserviceManager, but not the calls made with the VDF.

I hope this tool will help you get more familiar with the Vault API, or maybe help you trace some problems with your customization, or the like.

vapiTrace is free and you can download the executable here. If you’d like to see the code, we will share it shortly on GitHub, so revisit this post from time to time. If you have suggestions for improvements, just leave a comment here or contact us directly.

Posted in Uncategorized

Trigger jobs at specific time intervals

We looked for this feature for years: queuing jobs at regular time intervals. With powerJobs 2015, we introduce this capability. It’s now possible to place a job into the queue every day at 10 o’clock, twice a week, every 10 minutes, or at whatever time interval suits you best.

For each job you create, you can also configure a .settings file which defines when the job should be queued, the priority, and the message that shows up in the job queue.

So let’s suppose you have a folder in which your ERP system places text files. These text files contain document numbers, and the corresponding files should be copied into a specific folder that is accessible to ERP users.

The job looks quite simple (ScanFolderAndCopyFile.ps1):

$files = Get-ChildItem c:\Temp\erp -File
foreach ($file in $files) {
    $content = Get-Content $file.FullName
    foreach ($line in $content) {
        Get-VaultFile -Properties @{'File Name' = $line} -DownloadPath c:\Temp\ERP\download
    }
    Remove-Item $file.FullName
}
It gets the files from the specified folder, in my case c:\Temp\erp, and reads the content of each file, expecting every line to contain a desired file name. For each file in the folder, and for each line in each file, the matching file is searched in Vault and downloaded to the specified location.

The settings file looks like this (ScanFolderAndCopyFile.settings):

{
    "TimeBased": "0 0/1 * 1/1 * ? *",
    // a further property names the Vault on which the job is triggered
    // these two parameters are optional and self-explanatory:
    "Priority": 10,
    "Description": "look for files to be copied"
}

The key thing in this file is the TimeBased property. The syntax is cron (Unix-style), and for those who are not familiar with it, an online cron expression generator is very handy: just enter the schedule you like, and it generates the syntax which you can then use in this file.
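For reference, here are a few cron expressions in the seven-field format used above (seconds, minutes, hours, day-of-month, month, day-of-week, year); treat the exact schedules as illustrative examples, not values taken from the product documentation:

```
"TimeBased": "0 0/1 * 1/1 * ? *"     // every minute (the value used above)
"TimeBased": "0 0/10 * 1/1 * ? *"    // every 10 minutes
"TimeBased": "0 0 10 ? * * *"        // every day at 10:00
"TimeBased": "0 0 10 ? * MON,THU *"  // twice a week: Monday and Thursday at 10:00
```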

That’s it! You now have a job that gets queued every minute (or however often you configured it), scans a folder for data, and processes it. Of course this sample is very simple, but I hope you see the potential behind it. Now it’s up to you to think of all the jobs that you could create with this capability!

Have fun!

Posted in Uncategorized

Job-creation made simple

Let’s say you want email notification on lifecycle changes in Vault. Here is how the email might look: [screenshot: powerJobs email]. In order to achieve this, you want to extend the Vault JobProcessor with an email job. Have you ever tried to create a custom job for your Vault JobProcessor using the Vault API? Here is a short list of the tasks you have to do:

  • Install the Vault SDK (Software Development Kit)
  • Start Visual Studio and create a DLL project
  • Reference the Vault API assemblies from the SDK
  • Extend your class with the appropriate JobProcessor interface
  • Implement the interface functions
  • Write your code (requires coding skills and familiarity with the Vault API)
  • Copy the DLL bundle into the Extension folder
  • Edit the config file to inform the JobProcessor about the new job

The tasks above are not familiar to you? Oh, you are not a developer? But you still want to do more with your JobProcessor, right? Well, you are not alone! Now, let’s have a look at how this works with powerJobs:

  • Create a file with the extension .ps1 in the powerJobs\Jobs folder
  • Start coding using a much simpler scripting language

Can you see the difference? All right, if you are still with me, let’s do a practical example. Your interest might be in PDF creation from your CAD files; there is a standard job for this that comes with powerJobs, and there are tons of ways to tailor it. But for the purpose of this blog post, let’s see how to create email notifications with powerJobs.

This way we get notified by the JobProcessor every time a file gets released, rejected, reviewed, and so on. And here is how it works:

Start by creating a new file with the extension .ps1 in the powerJobs\Jobs folder. The easiest way to get there is to double-click the “powerJobs configuration” shortcut on your desktop. Let’s call the file sendEmail.ps1. Just pay attention when you create the file that the extension is really .ps1 and not .ps1.txt or similar. You should also see that the icon for your new file is identical to that of the other .ps1 files in the same folder. Edit the file with any text editor; even better, use the Windows PowerShell ISE (Integrated Scripting Environment) or a free editor such as PowerGUI or PowerShell Plus. This is the content of the script we are looking for:

$file = PrepareEnvironmentForFile # get the file object of the processed file

#region collect information on the lifecycle transition
$lfcTransId = $powerJobs.Job.Params["LifeCycleTransitionId"] # get the ID of the lifecycle transition the job has been triggered for
$lfcStateTransition = $vault.LifeCycleService.GetLifeCycleStateTransitionsByIds(@($lfcTransId)) # get the definition of that lifecycle transition
$lfcStates = $vault.LifeCycleService.GetLifeCycleStatesByIds(@($lfcStateTransition[0].FromID, $lfcStateTransition[0].ToID))
$fromState = $lfcStates[0]
$toState = $lfcStates[1]
Add-Log -Text "file $($file.Name) transitioning from '$($fromState.DispName)' to '$($toState.DispName)'"
#endregion

#region collect information about the involved users
$allUsers = $vault.AdminService.GetAllUsers()
$user = $allUsers | Where-Object { $_.Name -eq $file.Originator }
$causer = $allUsers | Where-Object { $_.Name -eq $file."Created By" }
Add-Log -Text "file originated by $($user.Name) and last modified by $($causer.Name)"
#endregion

#region build a hyperlink to the Vault file
$serverUrl = New-Object System.Uri($vault.AdminService.Url)
$filePath = $file."Full Path"
$filePathUrl = $filePath.Replace("$","%24").Replace("/","%2f").Replace(" ","+")
$fileUrl = $serverUrl.Scheme + "://" + $serverUrl.Host + "/AutodeskDM/Services/EntityDataCommandRequest.aspx?Vault=Vault&ObjectId=$($filePathUrl)&ObjectType=File&Command=Select"
Add-Log -Text "hyperlink: $fileUrl"
#endregion

[System.IO.File]::WriteAllBytes("c:\Temp\image.bmp", $file.Thumbnail.Image) # save the file thumbnail as a picture so that it can be attached to the email

#region prepare and send the email
$emailSubject = "The file $($file.Name) transitioned from '$($fromState.DispName)' to '$($toState.DispName)'"
$emailFrom = ""
$emailTo = $user.Email # ""
$emailBody = @"
Dear $($user.FirstName), the file <b>$($file.Name)</b> transitioned to state <b>$($toState.DispName)</b>, as <b>$($causer.FirstName)</b> changed the state.

<img src="cid:image.bmp" alt="" />
<a href="$fileUrl">$($filePath)</a>

If you like to contact $($causer.FirstName) for further explanation, send him an email at $($causer.Email)

sincerely, your humble powerJobs servant
"@
Add-Log -Text "Sending email to $emailTo, from $emailFrom, with subject $emailSubject"
Send-MailMessage -From $emailFrom -To $emailTo -Subject $emailSubject -Body $emailBody -BodyAsHtml -Attachments @("c:\Temp\image.bmp") -SmtpServer ""
#endregion

Add-Log -Text "Job completed"

Now, the really interesting line is the one towards the end where the Send-MailMessage cmdlet is called, with simple parameters that even non-developers can easily understand. The variables above (all starting with $) should also be quite clear. In order to send a message with some individual Vault-related content, such as file information or details about the lifecycle transition, we have to “talk” to the Vault server and gather some more details.

Now, the good news is that for certain typical activities, powerJobs provides simple-to-use cmdlets. Here is an example: we want to know the file name, who performed the last change, and in which folder the file is located. If we did this with the Vault API, the tasks would be:

  • getting the property definitions
  • getting the properties for the given file
  • getting the folder where the file resides

This would be about 5 to 10 lines of code. With powerJobs, the $file variable, set at the top of the script, already contains all system and user-defined properties as well as the folder location. So, one line does it all for you. Quite handy, huh?
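As a sketch of how that looks in practice (this assumes the powerJobs runtime, which provides PrepareEnvironmentForFile and populates $file; the property names follow the script above, and the snippet is not runnable outside a job):

```
$file = PrepareEnvironmentForFile      # powerJobs hands us the processed file
$name   = $file.Name                   # file name (system property)
$editor = $file."Created By"           # who performed the last change
$folder = $file."Full Path"            # folder location inside Vault
Add-Log -Text "$name in $folder, last changed by $editor"
```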

As we’d like to provide some good information in the email, we have to identify the lifecycle transition we are in. As powerJobs does not yet provide simple cmdlets for this task, we will use the Vault API. These are the lines:

#region collect information on the lifecycle transition
$lfcTransId = $powerJobs.Job.Params["LifeCycleTransitionId"]
$lfcStateTransition = $vault.LifeCycleService.GetLifeCycleStateTransitionsByIds(@($lfcTransId))
$lfcStates = $vault.LifeCycleService.GetLifeCycleStatesByIds(@($lfcStateTransition[0].FromID, $lfcStateTransition[0].ToID))
$fromState = $lfcStates[0]
$toState = $lfcStates[1]
Add-Log -Text "file $($file.Name) transitioning from '$($fromState.DispName)' to '$($toState.DispName)'"
#endregion

Additionally, we’d like to add a hyperlink to the email body that leads the user directly to the related file inside Vault. This hyperlink must be created dynamically; these are the relevant lines:

#region build a hyperlink to the Vault file
$serverUrl = New-Object System.Uri($vault.AdminService.Url)
$filePath = $file."Full Path"
$filePathUrl = $filePath.Replace("$","%24").Replace("/","%2f").Replace(" ","+")
$fileUrl = $serverUrl.Scheme + "://" + $serverUrl.Host + "/AutodeskDM/Services/EntityDataCommandRequest.aspx?Vault=Vault&ObjectId=$($filePathUrl)&ObjectType=File&Command=Select"
Add-Log -Text "hyperlink: $fileUrl"
#endregion

That’s it!

Now, in order to try it out, you have to configure automatic job queuing for the lifecycle transitions you are interested in. This can be done via the LifecycleEventEditor provided by Autodesk, one of the many little secrets of Vault. This tool can be found in the Vault SDK (C:\Program Files (x86)\Autodesk\Autodesk Vault 2015 SDK\util\LifecycleEventEditor). The setup for the SDK can be found in the ADMS server folder (C:\Program Files\Autodesk\ADMS Professional 2015\SDK) or in the Vault client folder (C:\Program Files\Autodesk\Vault Professional 2015\SDK). Run the setup and then start the LifecycleEventEditor. Select your preferred lifecycle definition, lifecycle state, and transition. Via the “Actions” menu you can add the job to that transition and then commit the changes. The job name is simply the name of your .ps1 file without the extension.


When you now transition your file from one state to another, this might be the email that shows up in your inbox:

powerJobs email

Doesn’t it look cool? So, now that you have the technique, you can extend this job to any other scenario.

Bottom line: with powerJobs it’s super simple to create a new job, and thanks to the new cmdlets, it’s also very simple to deal with the Vault API, even for less experienced developers! Have fun!


P.S.: if you’d like to test the script and don’t have an SMTP server at hand, you may want to use your Google account. In order to do that, you have to pass some more arguments to Send-MailMessage, such as your credentials and the appropriate SMTP port for Gmail.

Send-MailMessage ....... -SmtpServer "" -UseSsl -Credential $cred -Port 587

You see that the last arguments are -UseSsl for a secure connection, -Credential with the variable containing your user ID and password, and -Port 587. Now, $cred needs to contain your username and password, and the cmdlet Get-Credential does this for you.

$cred = Get-Credential

It will ask you interactively for your username and password. This is great; however, you don’t want to enter these every time the job gets executed, and since the job runs silently you wouldn’t have the ability to enter them anyway. For this purpose, I’d suggest executing the command once in a running PowerShell session and storing the information in an XML file that can be reused by the job. Here is the code:

$cred = Get-Credential
$cred | Export-Clixml c:\temp\cred.xml 

Now your credentials are stored in a local file. This file can be read at every job execution like this:

$cred = Import-Clixml c:\temp\cred.xml
Send-MailMessage ....... -SmtpServer "" -UseSsl -Credential $cred -Port 587

Once this is set up, you may still run into trouble with Gmail, as it does not allow every application to use its SMTP server. In order to change this setting, log in to your Google account, then follow this link and set “Access for less secure apps” in the “Account permissions” section to “Enabled”. Of course you should set it back once you’ve finished your testing/demo with this job.

So, now you should be able to use Gmail as your SMTP server.

Posted in powerJobs, PowerShell, Vault API

powerJobs – A new UI experience

We recently released a new version of powerJobs 2015 and talked about it in this post. In the past weeks, we ran webcasts where we showed the new features in action. We recorded the whole session, so in case you missed it, you can watch it here: Screencast – What’s new in powerJobs 15.1?

Today, I’d like to highlight one feature in particular, which is the extended JobProcessor UI.

This is how the JobProcessor looks when delivered by Autodesk:


And this is how the JobProcessor looks when started with powerJobs:

powerJobs UI

Underneath the JobProcessor window, you’ll notice an additional window that shows messages. It’s a log or trace window. It shows what is currently happening inside the job. It gives you transparency on the current operations, whether they are successful or facing an issue.

You can also save the logs to a file and then send it to your admin, your reseller, or to us for further analysis.

Looks good, right? But how does coolOrange do this? Are they changing the Autodesk JobProcessor code? Can I still rely on that thing? Will it change the behavior? What about compatibility?
All good questions, and the answers are simple: don’t worry!

The powerJobs executable is independent of the JobProcessor. powerJobs.exe either starts the JobProcessor or attaches to a running instance, and simply “takes the screen” of the JobProcessor and embeds it into the powerJobs UI. So, the two applications are still independent of each other; however, they look as if they were one. This way, no matter which improvements come to the JobProcessor, powerJobs will not interfere.

Of course, you can run the JobProcessor without the powerJobs UI and still benefit from the jobs that are delivered with powerJobs, such as PDF creation, or from your custom jobs created with PowerShell. However, you will not benefit from the transparency and logging of the trace window.

So, how do these messages get on the screen? With powerJobs 2015 we released a set of cmdlets: simple commands you can use within your PowerShell scripts. We spoke about this a few weeks ago in this post and will talk more in future posts. The cmdlet responsible for the logging is Add-Log. The syntax is super simple:

Add-Log -Text "Your text here..."

Now, you can configure how much information you’d like to see in the log window and/or in the log file. For instance, you can edit the coolOrange.powerJobs.CreatePdfAsAttachment.ps1 file and add some more log messages.

By default, the log file contains only warnings, errors, and fatal exceptions, but no info messages. Add-Log produces info messages, so by default such messages will not be reported in the log file. This is intentional: as long as there is nothing critical, the log file should remain almost empty, and in case of trouble it will contain the relevant information.

In case you’d like to see more, either in the log window or in the log file, you can change the default settings. Under C:\ProgramData\Autodesk\Vault 2015\Extensions\coolOrange.PowerJobs.Handler you’ll find the file coolOrange.powerJobs.dll.log4net. Just open this file with a text editor. You will find 3 appender sections:

<appender name="OutputDebugStringAppender" ...
<appender name="rollingFile" ...
<appender name="MsgAppender" ...

We are looking for the rollingFile appender, which is responsible for the log file, and the MsgAppender, which is responsible for the messages in the log window. Each appender has a <filter ...> element where <levelMin value="..."> defines the minimum level of information that gets reported. By default, the level for the rollingFile is set to WARN (warnings), so that INFO messages (more verbose but less relevant) are not reported to the log file. If you’d like to have the INFO messages in your log file as well, just change WARN to INFO, like this:

<filter type="log4net.Filter.LevelRangeFilter">
  <levelMin value="INFO" />
  <levelMax value="FATAL" />
</filter>

In a similar way, you can increase the verbosity of the log window by editing the levelMin for the MsgAppender filter. By default it is set to INFO, which means that your Add-Log messages, as well as WARN, ERROR, and FATAL messages, are reported. So this level is pretty good. If you’d like to see more, you may change it to DEBUG, which will fill up your screen with tons of messages coming from the source code. So, just for fun you may do a test, but you will probably switch back to INFO. Remember to restart powerJobs whenever you change the log4net file.

So, this is one new feature of powerJobs 2015. Next time I will talk about how to easily create new custom jobs.

Posted in powerJobs, PowerShell

Our free apps for Inventor – Part V – featureMigrator

The coolOrange featureMigrator allows Inventor users to create part features – like holes – from assembly features more easily.

For example: if you have two plates, you can create a hole through both plates at the assembly level, so you know that the hole fits perfectly. But when you open a single plate, you would like to see the hole there as well.

This is where the featureMigrator comes in: it migrates the hole into both plates.

Like the other coolOrange plug-ins, you can download it from the Inventor App Store. The installation is pretty straightforward: just run the .msi file and follow the instructions.

In the Tools menu of the ribbon in Inventor assemblies, you will find a new command Control.

This will open the featureMigrator browser. All features are listed there, and you can migrate assembly features by selecting one or several of them and then using the Send to Parts command.

At this point, a copy of the original part is made and the feature is moved into this copy. After this step, the original part is replaced with the copy.

Once the operation of migrating features is completed, the featureMigrator will display a dialog that provides information about the results:

This first dialog is a “high-level” report of the migration. It provides the opportunity to select an action for all migrated assembly features (Suppress if Succeeded, Suppress Always, Delete if Succeeded, Delete Always, or Nothing), and also for the part features that haven’t been migrated correctly (Suppress, Delete, or Nothing). Features created by the add-in can be in an invalid state in the parts for a number of reasons.


Associativity between part & assembly features

Once you have migrated a feature to a part, the featureMigrator still “remembers” where the feature came from. All existing features resulting from a migration by the tool are listed in the featureMigrator browser. If the original feature in the assembly has changed, you can update the feature in the part with the Update from Assembly command in the featureMigrator browser.

And that’s it!

Have fun with the coolOrange featureMigrator!

Posted in Uncategorized

Our free apps for Inventor – Part IV – linkParameters

linkParameters allows Inventor users to easily create dependencies between parameters in various parts and sub-assemblies in the context of the top-level assembly within which they reside.

Parameters can be visually selected from a source component and linked to a specific parameter in a target component, creating the dependency. The mechanism is based on the iLogic functionality: the tool automatically generates the iLogic code required to link the values of the parameters, without the need to manually write any iLogic instructions.

Like the other coolOrange plug-ins, you can download it from the Inventor App Store, and the installation is pretty straightforward: just run the .msi file and follow the instructions.

In the Manage menu of the ribbon in Inventor assemblies, you will find a new command, linkParameters, in the iLogic section.

For example: you want the diameter of one part to have the same value as the length of another part in your assembly. Select linkParameters and specify the name of the iLogic rule that will be created in the background. This first dialog also lets you select an existing rule, if any.

Validating this dialog will display the main dialog of the tool:

Select the part with the length as the source component and the part with the diameter as the target component. Notice that the top-level assembly can also be used as the source by checking the upper-left checkbox. Drag the source parameter Length onto the target parameter Diameter.
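Under the hood, the generated rule presumably boils down to a plain iLogic parameter assignment. A minimal hand-written equivalent for this Length/Diameter example might look like the following sketch (the component names SourcePart:1 and TargetPart:1 are made up for illustration; the actual code the tool generates may differ):

```
' iLogic rule in the top-level assembly:
' keep the Diameter of the target part in sync with the Length of the source part
Parameter("TargetPart:1", "Diameter") = Parameter("SourcePart:1", "Length")
' push the updated parameter values into the model
RuleParametersOutput()
InventorVb.DocumentUpdate()
```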

By right-clicking the target mappings, they can also be suppressed through a context menu. An option to automatically map source and target parameters with the same name is also provided in this context menu.

Enjoy coolOrange linkParameters!

Posted in Uncategorized