Release #20 ready for download

It is done! The ’20 release of the coolOrange products is finally ready for download. Like the previous releases, this version supports Autodesk products in versions 2020, 2019, and 2018, so both new and existing customers can move to the latest version of the coolOrange products. The new version contains all the enhancements made over the past year and also introduces a brand-new licensing engine.

Why a new licensing engine? Autodesk offers a Token Flex license model, and customers who use it need a coolOrange license model that supports Autodesk’s token model. So, the new licensing engine supports token-based licensing. We will provide more details about this very soon. What is important for now is that the new licensing engine supports network activation, so you can activate all your coolOrange clients over the network. More details can be found on our wiki in the licensing section.

Also brand new is jobWatcher, an extension for the Vault client that informs users when a job is queued and when one of their jobs runs into an error. This is very helpful for customers who rely on jobs and want to be informed in case of issues.

Also brand new is the Vault Fusion Lifecycle Connector, which, as the name says, connects your Vault with Fusion Lifecycle. This connector works out of the box and provides three scenarios that are very simple to configure. With this connector, you can sync your projects, items, and BOMs, and manage change orders. We will talk more about this in the coming weeks.

All the products are available for download, except for the bcpToolkit, which will be released in the upcoming weeks and will support the Vault 2020 BCP format.

As usual, the new versions are compatible with the previous versions, which makes upgrading very easy. In case you need assistance, feel free to reach out to our support.


Posted in Uncategorized

Analysing the PSP database for migration to Vault

There are still a lot of Autodesk Productstream Professional (PSP) environments that need to be migrated to Autodesk Vault. For analyzing the quality of the files, the Autodesk Data Export Utility (DEX) covers a lot of topics, like identifying unmanaged files or missing references for Inventor files.
But how do you analyze the quantity and quality of the metadata? In our projects, we are often asked to estimate the effort of migrating from PSP to Vault, and the information we get is just the number of files and the size of the file store. This determines the duration of the import, but not the effort to prepare it. Therefore, we have created SQL and C# scripts to get an overview of the data in use and to identify database customizations. These scripts can be run with a free tool called LINQPad.
The SQL script BasisAnalyse.linq analyses the PSP database and shows, for example, additional tables, element types, relationship types, and whether replication or characteristics are used. This is important because DEX only covers the PSP standard, and covering these additional objects may require extra effort. The script also gives an overview of how often entity types, document types, usernames, status keys, and fields are used. This helps to identify which objects have really been used and are worth transferring.
The script DocumentRevisionCheck.linq identifies documents that cannot be imported because they would cause a date conflict during the import, as Vault is very restrictive on this point. The script ItemRevisionCheck.linq does the same for items. This way, these conflicts can be identified at the beginning of the project, not at a late stage when the first test import has already been made.
With LINQPad it is easy to export the analysis results to a Word or Excel file. These reports can then be discussed with the customer. They are a good basis for defining the scope of the transfer and therefore for estimating the effort. The scripts are free and can be adapted to your needs. Of course, coolOrange cannot give any warranty for these scripts, and you use them at your own risk.
The scripts can be downloaded from the coolOrange Labs page. There you will also find further instructions.

Good luck with your PSP to Vault migrations!

Posted in Migration

Beta ’20 ready! What’s next?


Last week we published the beta versions of our products for Vault 2020. If you are a “new stuff junky” like us, then go on and download powerVault, powerJobs, powerEvents, powerGate, dataLoader, vaultRuler and jobWatcher. We look forward to your feedback, which you can post on our support forum, either in the category “product support” for any issue, or in “feature request” if you have further needs.

Brand new is the introduction of jobWatcher. It’s a Vault client extension that notifies users when a job gets into the queue and when a job runs into an error. For those who queue jobs silently (for instance on a lifecycle transition), where the job is part of a process, being notified when a job fails is quite helpful. Let’s face it: users (and admins) don’t look into the job queue to see if their jobs ran into an error, so being notified in case of an error helps you take action right away. jobWatcher requires Windows 10, as it uses the Windows notification center for sending the notifications. Have a look and let us know what you think!

As usual, the new releases run with the latest version of the Autodesk products, and also two versions back. So, the ’20 version will officially run with Vault 2020, 2019, and 2018. As of now it also runs with Vault 2017, although that version will run out of support in April, as soon as Autodesk officially releases the 2020 versions. This way, even if you are still on an older version of Vault, you can use the latest version of our products, which includes all enhancements and fixes.

We want your feedback: the products have the same features as the latest previous release, except that they run with Vault 2020. On our Feature Requests forum, we have posted all the feature requests that we received over time through our support, and now we need your voice to prioritize them. Take a few minutes, go to the Feature Requests forum, and like the ideas that you find most relevant. Under each post, you’ll find the “do you like this idea?” link, which adds your like. You see this only as a registered user; registration is free. We want to enhance the products with the features you need most, so use your voice to tell us what you need. We will implement the most wanted features over time and may get in touch with you to ask for more details. So, take the chance to drive the direction of our/your products.

Posted in powerEvents, powerGate, powerJobs, powerVault

Publish PDFs to Fusion Lifecycle

A few weeks ago, we published an article about using powerGate to connect Vault to Fusion Lifecycle. It looks like we hit an interesting topic, so today I’d like to show you how to publish your Vault CAD files to Fusion Lifecycle as PDF files. For this purpose, we will use powerJobs and powerGate.

This short animation shows you how this works. You’ll see the drawing get released in Vault, the Job Processor with powerJobs pick up the task, and finally, in Fusion Lifecycle, a new item created with the part number and title of our drawing, with the PDF of the drawing attached to the Fusion Lifecycle item.

In order to get this done, we need two things: a powerGate plugin for Fusion Lifecycle, and a job for powerJobs. At the end of this article, you will find the links to both.

We worked on the Fusion Lifecycle plugin for powerGate and implemented support for the V3 API. We also introduced the 3-legged Forge authentication, so you no longer have to save your username and password in the powerJobs script. Instead, there is a little login window, which has to be started once on the machine where the powerGate server is running. The dialog will ask you to log in with your Autodesk ID and then to grant access to the “powerGate FLC” app. powerGate can then talk with Fusion Lifecycle on your behalf. On connecting, powerGate receives a so-called refresh token, which allows it to reconnect after a restart. The token has a lifespan of 14 days and is refreshed on each use. This way, you only have to authorize the powerGate server once, and it handles the reconnect.

The job is pretty simple. It’s the standard PDF job, except that it creates a Fusion Lifecycle item and then uploads the PDF to that item. You just have to provide a workspace ID (at the top of the script) in which the item shall be created. In order to execute the job, just add a custom job type to your preferred lifecycle transition. Next time you change the state, the job will be queued, and you will find the PDF in Fusion Lifecycle – that simple!
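To give you an idea, the Fusion Lifecycle part of the job boils down to a few powerGate calls, shown in the samples of our earlier powerGate post. Here is a minimal sketch, not the shipped script: the workspace ID, the property names (NUMBER, TITLE), the PDF path, and the $file variable (which stands for the Vault file object the job receives) are illustrative assumptions that must match your tenant and setup.

```powershell
# Illustrative sketch of the Fusion Lifecycle portion of the job - not the shipped script.
# $WorkspaceId, the property names and $pdfPath are assumptions for this example.
$WorkspaceId = 93
$pdfPath = "C:\Temp\drawing.pdf"   # PDF created earlier in the job

# Build the item properties from the Vault file's part number and title
$properties = @()
$properties += New-ERPObject -EntityType "FusionLifecycleItemProperty" -Properties @{Name='NUMBER';Value=$file.'Part Number'}
$properties += New-ERPObject -EntityType "FusionLifecycleItemProperty" -Properties @{Name='TITLE';Value=$file.Title}

# Create the item in Fusion Lifecycle...
$newItem = New-ERPObject -EntityType "FusionLifecycleItem" -Properties @{WorkspaceId=$WorkspaceId;Properties=$properties}
$item = Add-ERPObject -EntitySet "FlItems" -Properties $newItem

# ...and attach the generated PDF to it
Add-ERPMedia -EntitySet "FlFile" -File $pdfPath -Properties @{WorkspaceId=$WorkspaceId;Id=$item.Id;FileName=(Split-Path $pdfPath -Leaf);Description="PDF published by powerJobs"}
```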

If you would like to try this out yourself, you need the powerGate server and client, powerJobs, and the powerGate server plugin for Fusion Lifecycle.

  1. Install the powerGate Server plugin: The plugin is a ZIP file, and its content must be copied into the powerGate Server plugin folder under C:\ProgramData\coolOrange\powerGateServer\Plugins. You will then have a folder called C:\ProgramData\coolOrange\powerGateServer\Plugins\FusionLifecycle which contains several DLLs and other files.
  2. Log in to Forge: Before you start the powerGate server, you need to grant powerGate permission to connect to Fusion Lifecycle on your behalf. In the plugin folder, you’ll find a subfolder called LoginManager (C:\ProgramData\coolOrange\powerGateServer\Plugins\FusionLifecycle\LoginManager). Start the ForgeLoginManager.exe and follow the instructions. This is how it might look:
  3. Restart the powerGate Server: Once this operation is completed, you can start/restart the powerGate server. You should find an icon in your tray (right-click, stop, start) or look for the powerGate System Tray in your Start menu.
  4. Install the powerJobs job: Take the job from the GitHub page and save it to the powerJobs jobs folder C:\ProgramData\coolOrange\powerJobs\Jobs.
  5. Configure the job on your preferred lifecycle transition: Go into the Vault settings and add the name of the job (PublishPdfAndUploadToFlc) to the transition. Next time a file runs through this transition, the job will be queued.
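The file-copy steps above can also be sketched in PowerShell; the ZIP file name and the job script file name are assumptions for this example, the destination paths are the ones from the steps above.

```powershell
# Illustrative sketch of steps 1, 2 and 4 - the source file names are assumptions
$plugins = "C:\ProgramData\coolOrange\powerGateServer\Plugins"

# Step 1: unpack the plugin ZIP into the powerGate Server plugin folder
Expand-Archive -Path ".\FusionLifecycle.zip" -DestinationPath $plugins

# Step 2: grant access once via the login manager (interactive dialog)
& "$plugins\FusionLifecycle\LoginManager\ForgeLoginManager.exe"

# Step 4: copy the job script into the powerJobs jobs folder
Copy-Item ".\PublishPdfAndUploadToFlc.ps1" "C:\ProgramData\coolOrange\powerJobs\Jobs"
```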

There are two small things that you now have to configure: the name of your Fusion Lifecycle tenant, and the workspace in which you would like to create the items and upload the PDF.

  1. Configure the FL tenant: In the folder C:\ProgramData\coolOrange\powerGateServer\Plugins\FusionLifecycle you’ll find the file fusionLifecycle.dll.config. Open it with a text editor. At the very end, you’ll find the entry <add key="FlcTenant" value="coolorange"/>. Change the value to the name of your tenant, save the file, and restart the powerGate server.
  2. Configure the workspace you would like to use: Open the powerJobs job located under C:\ProgramData\coolOrange\powerJobs\Jobs with an editor (e.g. the PowerShell ISE). At the top, you’ll find the variable $powerGateServer, which is the computer name of your powerGate server. If all is installed locally, then localhost will do.
     Then you’ll find a variable called $WorkspaceId. Set it to the workspace ID of your choice.

Now it’s time to try it out. Go into Vault, change the state of an Inventor drawing, let powerJobs run, and check whether you have the item and PDF in Fusion Lifecycle.

This is just the beginning. We will provide more in the coming weeks, like interacting with Vault events or integrating with Data Standard. So, have fun and stay tuned!

Posted in Forge, Fusion Lifecycle, powerGate, powerJobs

bcpToolkit #19

This year, we started a LOAD initiative. On our new website, launched recently with the release of the 2019 products, LOAD has become one of our three pillars (LOAD-ENHANCE-CONNECT). We want to make loading data into Vault much easier, more predictable, and repeatable.

Today, we proudly announce the release of our new bcpToolkit version 19, which supports the Vault 2019 BCP format. The new bcpToolkit contains the bcpViewer (formerly bcpChecker), the bcpDevkit, and new PowerShell cmdlets.

The bcpToolkit is the perfect toolkit when you have large and complex data-loading tasks against Vault. For instance, you have a huge set of files on your file system and want to bring them into Vault. You could try the regular tools like Autoloader or the Inventor Task Scheduler, but you will quickly run into limitations. With the bcpToolkit you can create your own custom loading tool (have a look at the bcpMaker) that applies the rules and behaviors that fit your situation. For instance, you can import Inventor assemblies even if the references are not 100% correct. With other tools, those files would remain outside of Vault. The references will not be fixed by the import, but you don’t have to leave anything behind. Or you may want to set a portion of your files to the state released or obsolete. Or you have an additional data set, like an export from your ERP, which you would like to combine with your files. With the bcpToolkit you can create a custom BCP package that contains exactly the settings you want and import it into Vault quickly and completely.

Another example is the migration from a competing or legacy system to Vault. In this case, you want to retain your history, and the data must match the target Vault configuration. With the bcpToolkit it’s possible to create a custom tool that pulls the data from your current system and brings it into shape for your new target Vault. Within the bcpToolkit you will also find tools like the bcpViewer, which allows you to preview the BCP import package before the actual import.

The bcpChecker is now the bcpViewer and focuses on letting you preview a BCP package before a 10+ hour import into Vault. You get a Vault-like UI where you can navigate through your folders, view your files, revisions and iterations, references, items, and BOMs, and so verify that everything is in the right place and the right shape. Opening a large BCP package takes only a little time: on opening, the package is translated into a local SQLite database. This is necessary in order to deal with very large datasets, and it also means that the next time you open the same package, it is even faster.

The new PowerShell cmdlets allow you to open (convert from XML to SQLite), export (SQLite to XML), and close a BCP package via the command line. This makes it possible to write a script that applies changes to your BCP package and to run that script over and over again, without manual interaction. Here is how such a script could work: a BCP package is loaded (transformed from XML to SQLite), then a user-defined property gets renamed (for all files) to match the new target Vault, and then the package is exported again to BCP. This script can now be executed over and over with the given BCP package. The new cmdlets open up new possibilities to quickly manipulate BCP packages.
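Such a script could look roughly like the following sketch. Note that the cmdlet names (Open-BcpPackage, Export-BcpPackage, Close-BcpPackage), the package object model, and the property names are purely illustrative assumptions; check the bcpToolkit documentation for the actual cmdlets.

```powershell
# Purely illustrative sketch - the cmdlet names and the package object model
# are assumptions, not the documented bcpToolkit API.
$package = Open-BcpPackage -Path "C:\Temp\MyPackage"       # XML -> SQLite

# Rename a user-defined property on every file to match the target Vault
foreach ($file in $package.Files) {
    $prop = $file.Properties | Where-Object { $_.Name -eq "Projekt" }
    if ($prop) { $prop.Name = "Project" }
}

Export-BcpPackage -Package $package -Path "C:\Temp\MyPackage"   # SQLite -> XML
Close-BcpPackage -Package $package
```

Because the whole transformation lives in one script, it can be re-run unchanged whenever the BCP package is regenerated.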

The bcpDevkit is now installed with the bcpToolkit. One setup – many tools inside. The setup installs the bcpDevkit into the GAC, so that when you start Visual Studio, you can reference the bcpDevkit right away and get started. Here is a walkthrough for creating your first custom BCP package.

Bottom line: the new bcpToolkit is simple to install, it contains powerful tools for BCP creation and review, and the new cmdlets make it easy to manipulate BCP packages. Check it out!

Posted in Vault BCP

Connecting Vault with Fusion Lifecycle

The short answer is YES, Vault can be connected with Fusion Lifecycle, and later in this post we describe how we do it. For the longer answer, there are some topics we need to cover. We believe that CAD data shall be managed locally in Vault, while enterprise-wide engineering processes shall be managed in Fusion Lifecycle. Presumably we could spend hours talking about where Vault ends and where Fusion Lifecycle starts, but we can probably agree that CAD files shall be managed locally.

Both systems provide a clean programming interface. Vault has a SOAP (web services) API, while Fusion Lifecycle has a REST API. Both are HTTP-based. Fusion Lifecycle can be customized via JavaScript, while Vault is customized with .NET. Both JavaScript and .NET can deal with SOAP and REST. So, could the two systems be connected directly? Basically yes, but there are some “buts”.

The first hurdle is to bring the two systems together. Vault is on the local network, behind the firewall, while Fusion Lifecycle sits in the cloud. The connection from the local network to the cloud is simple: port 80 (443 for HTTPS) is usually open for the internet connection, and if you can open the Fusion Lifecycle page with your web browser, it’s all good. For the way back, from Fusion Lifecycle to Vault, access to the Vault server (or IIS) must be opened up. Usually a port mapping is needed on the firewall. This is technically simple but involves risks: the entire Vault API becomes available on the internet.

The Vault API is very powerful and comprehensive. Fusion Lifecycle would have to consume the complex Vault API and consider Vault’s business logic, even for simple operations. In addition, the Vault API gets improved/enhanced with each Vault version and is therefore subject to changes. There is some compatibility, but still. The Fusion Lifecycle API is easier, but you still have to consider the business logic. Fusion Lifecycle currently has 3 APIs: V1 (official), V2 (deprecated), and V3 (technical preview).

In addition, many CAD offices have restricted or no internet access. Keep in mind that Fusion Lifecycle may talk directly to the Vault server (sort of server-to-server), while for connecting Vault workflows to Fusion Lifecycle, it’s the Vault client that must talk to Fusion Lifecycle (sort of client-to-server). So, each Vault client would need internet access to Fusion Lifecycle.

For these reasons, we take a different approach. We put the powerGate server in the middle, between the two systems. powerGate can be extended with plugins, and we already have a Vault plugin that exposes the Vault API as a simple REST API. This API is way simpler and Vault-version independent.

New is a powerGate server plugin for Fusion Lifecycle, which makes it simple to talk to Fusion Lifecycle. So, now we have two simplified, version-independent APIs, which allow bidirectional communication. The powerGate server sits on the local network and is thus reachable by all Vault clients, so the Vault clients do not need internet access. The powerGate server can be installed on any server, and only the preferred port (not 80 or similar) is exposed to the outside. Only the powerGate server is now reachable from outside. Now, both Vault and Fusion Lifecycle can talk to each other via the powerGate server, without knowing anything about each other.

The current powerGate server plugin for Fusion Lifecycle can query the workspaces, display all items (elements) of a given workspace, read the details of a specific item, and create new items. It is also possible to upload files and attach them to an item. The Fusion Lifecycle API can do more, and we will enhance the plugin in the coming weeks and months, but as of now, it can already do a lot. Since it is powerGate, the REST services can be consumed from PowerShell and .NET. This allows creating very cool workflows in Vault Data Standard, powerEvents, and powerJobs. For example, it is possible to check during a Vault release cycle whether the corresponding Fusion Lifecycle item is released too. Or submit a job that creates a PDF file and uploads it as an attachment to the given Fusion Lifecycle item. Or a change process started in Fusion Lifecycle creates a matching change order in Vault and links the appropriate Vault files and items.

Here’s an example of how the workspaces can be queried in PowerShell:

$workspaces = Get-ERPObjects -EntitySet "FlWorkspaces"

And here is how you can retrieve all elements of a workspace:

$items = Get-ERPObjects -EntitySet "FlItems" -Filter "WorkspaceId eq 93"

And this is how you get the details of an element:

$item = Get-ERPObject -EntitySet "FlItems" -Keys @{WorkspaceId=93;Id=7660} -Expand @('Properties','Relations','Attachments')

As you can see in this last example, the properties, relations, and attachments are also retrieved. If you are familiar with the Fusion Lifecycle API, you know that the attachments require a separate API call. But we do not want to deal with such complexity on the Vault client side; we simply want the desired data. The powerGate server plugin takes care of this and other complexities.

In order to create a Fusion Lifecycle item, the mandatory and/or editable properties must be set. Here’s an example:

$properties = @()
$properties += New-ERPObject -EntityType "FusionLifecycleItemProperty" -Properties @{Name='NUMBER';Value='01-000-0001'}
$properties += New-ERPObject -EntityType "FusionLifecycleItemProperty" -Properties @{Name='TITLE';Value='coolOrange powerGate test'}
$newItem = New-ERPObject -EntityType "FusionLifecycleItem" -Properties @{WorkspaceId=8;Properties=$properties}
Add-ERPObject -EntitySet "FlItems" -Properties $newItem

Files can be uploaded with this line:

Add-ERPMedia -EntitySet "FlFile" -File C:\temp\test.pdf -Properties @{WorkspaceId=93;Id=7660;FileName="test.pdf";Description="test upload from powerGate"}

As you can see, the path to the local file, the ID of the workspace, and the item to which the file is to be uploaded are specified. The complexity of making a matching HTTP call (cookies, content type, converting the file to the appropriate bytes, etc.) is handled by the powerGate server plugin.

A sample PowerShell script, the powerGate plugin, and the source code can be downloaded from GitHub. We hope you enjoy this new powerGate extension. We will post more about Vault and Fusion Lifecycle in the future.

Posted in Fusion Lifecycle, powerGate

Vote for AU Las Vegas classes

Whether you’re going to AU Las Vegas or not, you can and should vote for the classes. Here is the link. As you know, we are Data Management guys, so we would love it if you could especially have a look at the Data Management and Product Lifecycle Management topics, but also filter by Vault or Fusion Lifecycle. Most of the content will be made available after AU, so voting for your preferred classes is worthwhile even if you will not attend.

Despite the general invitation to vote for classes, we are obviously advertising our own cause as well. So, in case you would like to have coolOrange people on stage, vote up our classes. This year, we submitted several classes on several topics; here is the list:

  • What’s New in the Vault API with examples you can use
  • Custom Reporting in Vault 2019 – Dress up your Vault data to meet the world!
  • Connect Vault with ERP – items, bill of materials, files and more…
  • Connect Vault with Fusion Lifecycle – expand your engineering workflows
  • Enhance Vault workflows, with Vault client events
  • How to load data into your Vault – Data Transfer Utility

The first session is for coders, while the other sessions will have interesting insights for coders as well, but will also provide samples and insights for CAD admins, IT admins, and of course Vault resellers.

So, if you like these sessions, then please vote them up, but also have a look at the other available sessions and express your preference. The voting will close on July 13th, so you still have some weeks to go, but the best thing would be to take 10 minutes now, while you are reading this blog post.

Thanks for your engagement, and we will hopefully see you in Las Vegas!!

Posted in Uncategorized