bcpToolkit #19

This year, we started a LOAD initiative. On our new website, launched recently with the release of the 2019 products, LOAD has become one of our 3 pillars (LOAD-ENHANCE-CONNECT). We want to make loading data into Vault much easier, more predictable and more repeatable.

Today, we proudly announce the release of our new bcpToolkit version 19, which supports the Vault 2019 BCP format. The new bcpToolkit contains the bcpViewer (formerly bcpChecker), the bcpDevkit and new PowerShell cmdlets.

The bcpToolkit is the perfect toolkit for large and complex data-loading tasks against Vault. For instance, you have a huge set of files on your file system and want to bring them into Vault. You could try regular tools like Autoloader or the Inventor Task Scheduler, but you will quickly run into limitations. With the bcpToolkit you can create your own custom loading tool (have a look at the bcpMaker), which applies the rules and behaviors that fit your situation. For instance, you can import Inventor assemblies even if the references are not 100% correct. With other tools, those files would remain outside of Vault. The references will not be fixed by the import, but you don’t have to leave anything behind. Or you may want to set a portion of your files to the state Released or Obsolete. Or you have an additional data set, such as an export from your ERP, which you would like to combine with your files. With the bcpToolkit you can create a custom BCP package that contains exactly the settings you want and import it into Vault in a faster and more complete way.

Another example is the migration from a competitive or legacy system to Vault. In this case, you want to retain your history, and the data must match the target Vault configuration. With the bcpToolkit it’s possible to create a custom tool that pulls the data from your current system and brings it into shape for your new target Vault. Within the bcpToolkit you will also find tools like the bcpViewer, which allows you to preview the BCP import package before the actual import.

The bcpChecker is now the bcpViewer, and it focuses on letting you preview a BCP package before a 10+ hour import into Vault. You get a Vault-like UI where you can navigate through your folders; view your files, revisions and iterations, references, items and BOMs; and verify that everything is in the right place and in the right shape. Opening a large BCP package takes only a little time: when you open a BCP package, it is translated into a local SQLite database. This is necessary in order to deal with very large datasets, and it also means that the next time you open the same package, it is even faster.

The new PowerShell cmdlets allow you to open (convert from XML to SQLite), export (SQLite to XML) and close a BCP package from the command line. This makes it possible to write a script that applies changes to your BCP package, and you can run that script over and over again without manual interaction. Here is a sample of how such a script could look: https://support.coolorange.com/support/solutions/articles/22000228087-how-to-rename-properties-of-a-bcp-package. In this sample, a BCP package is loaded (transformed from XML to SQLite), then a user-defined property is renamed (for all files) to match the new target Vault, and then the package is exported again to BCP. This script can now be executed over and over with the given BCP package. The new cmdlets open up new possibilities for quickly manipulating BCP packages.
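
To give a rough feel for what such a rename amounts to, here is a plain-PowerShell sketch that renames a user-defined property in a simplified, BCP-like XML fragment. The element and attribute names below are invented for illustration (the real BCP schema and the bcpToolkit cmdlets are different); the point is just the load, transform, export pattern that such a script follows.

```powershell
# A simplified, BCP-like XML fragment (schema invented for illustration)
[xml]$package = @"
<Package>
  <File Name="bracket.ipt">
    <Property Name="OldProjectCode" Value="P-100" />
  </File>
  <File Name="frame.iam">
    <Property Name="OldProjectCode" Value="P-200" />
  </File>
</Package>
"@

# Rename the user-defined property on every file to match the new target Vault
foreach ($property in $package.SelectNodes("//Property[@Name='OldProjectCode']")) {
    $property.SetAttribute("Name", "ProjectCode")
}

# Export the modified package again, so the script can be re-run any time
$outFile = Join-Path ([System.IO.Path]::GetTempPath()) "package.xml"
$package.Save($outFile)
```

Because the transformation is scripted, it is repeatable: you can throw away the result, fix the script, and run it again against the original package.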

The bcpDevkit is now installed with the bcpToolkit. One setup – many tools inside. The setup installs the bcpDevkit into the GAC, so that when you start Visual Studio, you can reference the bcpDevkit right away and get started. Here is a walkthrough for creating your first custom BCP package: http://www.coolorange.com/wiki/doku.php?id=bcptoolkit:getting_started:using_the_bcpdevkit.

Bottom line: the new bcpToolkit is simple to install, it contains powerful tools for BCP creation and review, and the new cmdlets make it easy to manipulate BCP packages. Check it out at http://www.coolorange.com.

Posted in Vault BCP

Connecting Vault with Fusion Lifecycle


The short answer is YES, and later in this post we describe how we connect Vault with Fusion Lifecycle. For the longer answer, there are some topics we need to cover. We believe that CAD data should be managed locally in Vault, while enterprise-wide engineering processes should be managed in Fusion Lifecycle. We could presumably spend hours talking about where Vault ends and where Fusion Lifecycle starts, but we can probably agree that CAD files should be managed locally.

Both systems provide a clean programming interface. Vault has a SOAP (web services) API, while Fusion Lifecycle has a REST API. Both are HTTP-based. Fusion Lifecycle can be customized via JavaScript, while Vault via .NET. Both JavaScript and .NET can deal with SOAP and REST. So, could the two systems be connected directly? Basically yes, but there are a few “buts”.

The first hurdle is to bring the two systems together. Vault is on the local network, behind the firewall, while Fusion Lifecycle sits in the cloud. The connection from the local network to the cloud is simple: port 80 (443 for HTTPS) is usually open for the internet connection. If you can open the Fusion Lifecycle page with your web browser, all is good. For the way back, from Fusion Lifecycle to Vault, access to the Vault server (or IIS) must be opened. Usually a port mapping is needed on the firewall. This is technically simple but involves risks: the entire Vault API becomes available on the internet.

The Vault API is very powerful and comprehensive. Fusion Lifecycle would have to consume the complex Vault API and consider Vault’s business logic, even for simple operations. In addition, the Vault API gets improved and enhanced with each Vault version, and is therefore subject to change. There is some backward compatibility, but still. The Fusion Lifecycle API is easier, but you still have to consider the business logic. Fusion Lifecycle currently has 3 APIs: V1 (official), V2 (deprecated), and V3 (technical preview).

In addition, many CAD offices have restricted or no internet access. Keep in mind that Fusion Lifecycle may talk directly to the Vault server (server-to-server), while for connecting Vault workflows to Fusion Lifecycle, it’s the Vault client that must talk to Fusion Lifecycle (client-to-server). So, each Vault client would need internet access to Fusion Lifecycle.

For these reasons, we take a different approach. We put the powerGate server in the middle, between the two systems. powerGate can be extended with plugins, and we already have a Vault plugin that exposes the Vault API as a simple REST API. This API is much simpler and Vault version independent.

New is a powerGate server plugin for Fusion Lifecycle, which makes it simple to talk to Fusion Lifecycle. So now we have two simplified, version-independent APIs that allow bidirectional communication. The powerGate server sits on the local network, and is thus reachable by all Vault clients, so the Vault clients do not need internet access. The powerGate server can be installed on any server, and the preferred port (not 80 or similar) is exposed to the outside. Only the powerGate server is now reachable from outside. Both Vault and Fusion Lifecycle can talk to each other via the powerGate server, without knowing anything about each other.

The current powerGate server plugin for Fusion Lifecycle can query the workspaces, display all items (elements) of a given workspace, read details of a specific item and create new items. It is also possible to upload files and attach them to an item. The Fusion Lifecycle API can do more, and we will enhance the plugin in the coming weeks and months, but as of now it can already do a lot. Since it is powerGate, the REST services can be consumed from PowerShell and .NET. This allows creating very cool workflows in Vault Data Standard, powerEvents and powerJobs. For example, it is possible to check during a Vault release cycle whether the corresponding Fusion Lifecycle item is released too. Or submit a job that creates a PDF file and uploads it as an attachment to the given Fusion Lifecycle item. Or a change process started in Fusion Lifecycle creates a matching Change Order in Vault and links the appropriate Vault files and items.

Here’s an example of how the workspaces can be queried in PowerShell:

$workspaces = Get-ERPObjects -EntitySet "FlWorkspaces"

And here is how you can retrieve all elements of a workspace:

$items = Get-ERPObjects -EntitySet "FlItems" -Filter "WorkspaceId eq 93"

And this is how you get information about an element:

$item = Get-ERPObject -EntitySet "FlItems" -Keys @{WorkspaceId=93;Id=7660} -Expand @('Properties','Relations','Attachments')

As you can see in this last example, the properties, the relations, and the attachments are also retrieved. If you are familiar with the Fusion Lifecycle API, then you know that the attachments require a separate API call. But we do not want to deal with such complexity on the Vault client side. We simply want the desired data; the powerGate server plugin takes care of handling this and other complexities.

In order to create a Fusion Lifecycle item, the mandatory and/or editable properties must be set. Here’s an example:

$properties = @()
$properties += New-ERPObject -EntityType "FusionLifecycleItemProperty" -Properties @{Name='NUMBER';Value='01-000-0001'}
$properties += New-ERPObject -EntityType "FusionLifecycleItemProperty" -Properties @{Name='TITLE';Value='coolOrange powerGate test'}
$newItem = New-ERPObject -EntityType "FusionLifecycleItem" -Properties @{WorkspaceId=8;Properties=$properties}
Add-ERPObject -EntitySet "FlItems" -Properties $newItem

Files can be uploaded with this line:

Add-ERPMedia -EntitySet "FlFile" -File C:\temp\test.pdf -Properties @{WorkspaceId=93;Id=7660;FileName="test.pdf";Description="test upload from powerGate"}

As you can see, the path to the local file, the ID of the workspace, and the item to which the file is to be uploaded are specified. The complexity of making a matching HTTP call (using cookies, setting the content type, converting the file to the appropriate bytes, and so on) is handled by the powerGate server plugin.
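
To illustrate what the plugin saves you from, here is a sketch of just the body-building part of such a raw upload in plain PowerShell, without powerGate. The form field names are invented for illustration and the real Fusion Lifecycle call differs, but the boilerplate (boundary, content type, byte conversion) is exactly the kind of work the plugin hides.

```powershell
# Hand-build a multipart/form-data body for a file upload
# (field names invented for illustration)
function New-MultipartBody {
    param([string]$FilePath, [string]$FileName)

    $boundary = [Guid]::NewGuid().ToString()
    $fileBytes = [System.IO.File]::ReadAllBytes($FilePath)
    # ISO-8859-1 maps bytes 1:1 to characters, so the raw file content
    # survives the string concatenation below
    $encoding = [System.Text.Encoding]::GetEncoding("ISO-8859-1")
    $fileContent = $encoding.GetString($fileBytes)

    $body = (
        "--$boundary",
        "Content-Disposition: form-data; name=`"file`"; filename=`"$FileName`"",
        "Content-Type: application/octet-stream",
        "",
        $fileContent,
        "--$boundary--",
        ""
    ) -join "`r`n"

    @{ Body = $body; ContentType = "multipart/form-data; boundary=$boundary" }
}

# The actual call would then look something like this (hypothetical URL):
# $part = New-MultipartBody -FilePath "C:\temp\test.pdf" -FileName "test.pdf"
# Invoke-RestMethod -Uri $uploadUrl -Method Post -Body $part.Body -ContentType $part.ContentType
```

With powerGate, all of the above collapses into the single Add-ERPMedia line shown earlier.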

A sample PowerShell script, the powerGate plugin, and the source code can be downloaded from GitHub: https://github.com/coolOrange-Public/powerGateFusionLifecycle. We hope you enjoy this new powerGate extension. We will post more about Vault and Fusion Lifecycle in the future.

Posted in Fusion Lifecycle, powerGate

Vote for AU Las Vegas classes

Whether you’re going to AU Las Vegas or not, you can and should vote for the classes. Here is the link: http://au.autodesk.com/speaker-resource-center/call-for-proposals/voting. As you know, we are Data Management guys, so we would love it if you could especially have a look at the Data Management and Product Lifecycle Management topics, but also filter by Vault or Fusion Lifecycle. Most of the content will be made available after AU, so voting for your preferred classes is worthwhile even if you will not attend.

Despite the general invitation to vote for classes, we are obviously advertising our own cause as well. So, in case you would like to have coolOrange people on stage, vote up our classes. This year, we submitted several classes on several topics; here is the list:

  • What’s New in the Vault API with examples you can use
  • Custom Reporting in Vault 2019 – Dress up your Vault data to meet the world!
  • Connect Vault with ERP – items, bill of materials, files and more…
  • Connect Vault with Fusion Lifecycle – expand your engineering workflows
  • Enhance Vault workflows, with Vault client events
  • How to load data into your Vault – Data Transfer Utility

The first session is for coders, while the other sessions will have interesting insights for coders as well, but will also provide samples and insights for CAD admins, IT admins and, of course, Vault resellers.

So, if you like these sessions, please vote them up, but also have a look at the other available sessions and express your preference. The voting closes on July 13th, so you still have some weeks to go, but it would be best if you take 10 minutes now, while you read this blog post.

Thanks for your engagement, and we hope to see you in Las Vegas!

Posted in Uncategorized

Datastandard – Cool custom dialog from menu

Let’s create your own custom dialog without any help from Datastandard, and still with PowerShell!

The issue with dialogs from Datastandard

I am sure you have faced a situation very similar to mine:
My initial goal was to create a very simple dialog just for showing some data when the user clicks on my context menu entry.

But the dialogs from Datastandard come with some logic attached, depending on the method you call. Some examples:

  1. GetCreateDialog($folderId): it always creates a new file, and you have to specify a template
  2. GetEditFolderDialog($folderId): it creates a new folder or updates its properties

The other dialogs have similar behavior, but what if you don’t need any of those features?

The coolDialog as a solution

1. Create your XAML

First, we have to provide a XAML file (What is that?) that meets our needs, which we can later load in PowerShell.
The root must be the Window element, and here is an example of “The coolDialog”.
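
The XAML of “The coolDialog” is attached to the original post; as a stand-in, a minimal Window along the following lines matches what the later steps expect. The control names StackPanelCoolView and BtnCancel and the FullName binding are taken from the script snippets in this post; the rest of the layout is an illustrative sketch.

```xml
<Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="The coolDialog" Width="400" Height="200">
  <StackPanel x:Name="StackPanelCoolView" Margin="10">
    <TextBox Text="{Binding FullName}" IsReadOnly="True" />
    <Button x:Name="BtnCancel" Content="Close" Margin="0,10,0,0" />
  </StackPanel>
</Window>
```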

2. Load your XAML in PowerShell

If you now have a valid XAML file with <Window> as the root, you can load it with one of the overloads of the static XamlReader.Load method, as in this snippet:

[xml]$xamlContent = Get-Content "$($env:ProgramData)\Autodesk\Vault 2018\Extensions\DataStandard\Vault.Custom\addinVault\Menus\TheCoolDialog.xaml"
$theCoolWindow = [Windows.Markup.XamlReader]::Load((New-Object System.Xml.XmlNodeReader -ArgumentList @($xamlContent)))

If you execute this snippet, it will NOT show the window! To accomplish that, you need to call ShowDialog().
After you copy this code, you need to replace the path with the path to your own XAML file!

3. Fill the dialog with data

Of course, you want to fill your dialog with some dynamic data. In my example of the coolDialog, there is a TextBox whose Text is bound to the property FullName. This means the TextBox expects access to an object with a FullName property.

After the first 2 steps the dialog has no data, so the Text will be empty.

Now, we hand an object, in my case a Vault folder, to the DataContext of the parent control:

    $folderId = $vaultContext.CurrentSelectionSet | select -First 1 -ExpandProperty "Id"
    if(-not $folderId) {
        throw "No valid folder selected!"
    }
    $fldr = $vault.DocumentService.GetFolderById($folderId)
    $folder = New-Object Autodesk.DataManagement.Client.Framework.Vault.Currency.Entities.Folder -ArgumentList @($vaultConnection, $fldr)
    $theCoolWindow.FindName("StackPanelCoolView").DataContext = $folder

The folder has a property FullName and therefore that value will be shown in the TextBox.

4. Handle events

I am pretty sure you want something to happen when the customer clicks a button, so let’s look at handling events now.

In PowerShell, you can subscribe to any available event of a control by calling the method add_<EventName>($scriptBlock).
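
This add_<EventName> pattern is not specific to WPF controls: PowerShell generates such a method for every .NET event. As a small, self-contained illustration (no Vault or WPF needed), here it is on an ObservableCollection, whose CollectionChanged event fires synchronously:

```powershell
# PowerShell exposes .NET events through generated add_<EventName> methods
$list = [System.Collections.ObjectModel.ObservableCollection[int]]::new()

$script:changes = 0
$list.add_CollectionChanged({
    param($sender, $eventArgs)
    # $eventArgs.Action would tell us what happened (Add, Remove, ...)
    $script:changes++
})

$list.Add(1)   # triggers the CollectionChanged handler synchronously
$list.Add(2)
```

The same mechanism is used for the Click event of a Button, as shown in the next step.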

Let’s extend our coolDialog so it will close when clicking on the Close Button:

    $theCoolWindow.FindName("BtnCancel").add_Click({
        $theCoolWindow.Close()
    })

Tip: If you want to see all available events for a certain control, such as a Button, just google “wpf button” and go to the related MSDN page.

5. Bring your window to life

The last step is to let the window appear as a graphical user interface, visible to the customer.
This can easily be accomplished by calling ShowDialog():

$theCoolWindow.ShowDialog() | Out-Null

Why do I pipe it through Out-Null, you ask? Because ShowDialog returns a result and I don’t handle it here. If you want to react depending on how the user exited the dialog, read these Remarks.

You can download the full version of the PowerShell script for the coolDialog here.
But remember, this example was programmed for a menu entry in Datastandard for Vault 2018.2.
You will find all downloads at the end of this post!


Troubleshooting

Props[], PSCmd[], PSVal.. do not work in my XAML

All of the Datastandard features that you usually use for binding in your dialog do NOT work in a custom dialog. Here you can see all the bindings that are affected!

Events not working (Vault crash?)

I could not identify why, but sometimes the scriptBlock of the event has no access to the variables outside of that scope.
The solution is to mark as global those variables which are initialized outside of the event and are also used within the event:

$global:theCoolWindow = [Windows.Markup.XamlReader]::Load((New-Object System.Xml.XmlNodeReader -ArgumentList @($xamlContent)))
$theCoolWindow.FindName("BtnCancel").add_Click({
    $global:theCoolWindow.Close()
})

Depending on your environment, this could also be a cause of your application crashing.

ShowDialog() vs Show()

Do not call Show(). It will appear to work at first by showing your window, BUT every $scriptBlock triggered by an event will throw an exception and will sometimes crash your application.
Be sure to use the ShowDialog() method!

In any case, congratulations on your custom dialog, where you have total freedom in adding controls and the logic behind them!

Happy developing,

Patrick

Downloads

This example was programmed for a menu entry in Datastandard for Vault 2018.2

Posted in Data Standard

Load – Enhance – Connect

Over the past years, we developed products and technologies that made your Autodesk Data Management projects even more successful. We touched different topics and built products, technologies and services around them. The result of our work can be summarized in the three themes Load – Enhance – Connect, and these three themes have become the core message of our new website http://www.coolorange.com. Have a look and let us know what you think.

Let’s talk about Load: Vault comes with a cool little tool called DTU (Data Transfer Utility, aka VaultBCP), which allows importing data into Vault in bulk. DTU can be used to import small and large data sets, with files, history, items, BOMs, links, etc. It can be used to import files from the file system, to migrate from other data management systems, to merge Vaults, and the like. We’ve created a toolkit that makes it simple to generate, manipulate, merge and validate BCP packages, so that any sort of import situation can be handled in a reliable and predictable way. The bcpToolkit can already do a lot and will grow even more later this year. In addition to the tools, we also put our experience at your service, so if you are facing any Vault import, migration or merge challenge, reach out to us.

Under Enhance we have combined powerJobs and powerEvents. While powerJobs enhances your Vault job processor with ready-to-use jobs and the ability to create your own jobs, powerEvents enhances your Vault client by giving you the ability to improve your workflows. Both products enhance your processes and make Vault behave the way that makes sense to you. As an example, you can say “release only if …” and define in powerEvents the logic you are looking for. Both powerJobs and powerEvents can be configured via the Microsoft PowerShell scripting language, so a little coding experience is more than enough for bringing Vault to the next level.

Connecting Vault with other systems is the next logical step. Once you have your data and processes under control within your engineering department, it’s time to connect with the rest of the company. The connection to ERP is the obvious step, and with powerGate many ERP systems, from SAP over Microsoft to smaller ERP systems, have been connected. Meanwhile, connecting to the cloud is also a topic, which can be addressed with powerGate as well.

The new website talks more about your situations and less about product features, so we hope that you’ll get a better understanding of what the products can do for you. We are quite excited about the new website, and hope you will enjoy it too.

Posted in Uncategorized

#19 version available

Over the past couple of months we have been a bit quiet, and I’d like to apologise for that. We have been busy working on several topics, which we will present in the coming weeks and months.

Today we proudly announce the release of the 2019 products. Actually, as you can see from the banner picture, we just call it 19. As with past versions, the 19 products support the Autodesk 2019, 2018 and 2017 versions, that is, the latest and the two preceding versions. This means you have, for example, powerJobs 19 for Vault 2019, 2018 and 2017. Regardless of which version of Vault you are on, you get the most recent coolOrange product version with all new features and enhancements.

The complete product line has been released, including powerJobs, powerVault, powerGate, dataLoader, vaultRuler, mightyBrowser and all the other apps we have. The only exception is the bcpToolkit: we are working on supporting the BCP 2019 format and will release our products around DTU (Data Transfer Utility) very soon. There will be some exciting news there as well, so stay tuned.

In order to simplify and speed up the download process, we moved our products to Amazon S3 and created a dedicated domain. So, if you go to http://download.coolorange.com, you will find all the download links. There you will also find the archive of previous versions, so if you are running an older version of the Autodesk products and still need the corresponding coolOrange product, you can find it there. As mentioned, we officially support the current release and the two preceding versions, but we understand that if you have not upgraded yet and still need to set up your machine, you still need access to legacy versions. Be aware that those versions are not supported; however, they have worked well so far, so you may take the risk.

So, what’s new? Besides the support for the latest Autodesk applications, powerJobs now officially supports the newly introduced InventorServer. This does not require an Inventor license if used in combination with Vault. So now you can create PDF, DXF, DWG and other formats via powerJobs, without the need for an Inventor license.
Also, we made some important internal architectural changes, which may not be that visible to you but are important for the ongoing technology changes. For instance, we moved to PowerShell 4 as a minimum requirement, as there are several important benefits that the products, and ultimately you, can take advantage of.
One relevant change for you is that our PowerShell-based products no longer require a special shortcut for starting the ISE (script editor). You can start whichever script editor you like, PowerShell ISE, Visual Studio Code, or the like, and just load the corresponding PowerShell module via Import-Module.
There are more changes, which you can read about on our wiki at http://www.coolorange.com/wiki, in the corresponding product change log section. We will talk about more enhancements in the coming weeks and months.

We wish you fun and success with the coolOrange 19 products, regardless of which Autodesk product version you run.

Posted in Uncategorized

Couple of updates

Last week, we released new versions of powerVault, powerJobs and powerEvents. While the changes in powerVault are smaller bug fixes (you can find the details here), for powerJobs there is an interesting improvement related to the Inventor project file.

So far, powerJobs always set the project file for each job, even if the project file was already set. This is not an issue until you have a situation where the Inventor session used by powerJobs has an open file. As you know, setting a project file while a file is open in Inventor is not possible. Therefore, powerJobs failed to execute the job, because the attempt to set the project file failed. This is now solved: if the project file of the Inventor session is already the right one, there is no need to change it. Also, if the project file must be changed and a file is open, the job will fail with a human-readable error message.

In order to prevent this problem, we suggest tweaking the logic in your script. Make sure that the open, convert (or whatever you do with the file) and close operations are very close together. If, for example, you need to copy the PDF file to a network share, do it after closing the file. This way, if for whatever reason the PDF copy to the network fails, the original file has already been closed. If you do the copy operation before closing the original file, then in the case where the copy operation fails, your job will fail and the close will not be executed. You end up with an open file in your Inventor session. So, make sure that your job always closes the file before you do other risky actions.
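
One way to express this advice ("always close before anything risky") is a try/finally around the document work. The functions in this sketch are stubs invented for illustration (they are not powerJobs or Inventor cmdlets); they only record the order of the operations:

```powershell
# Stubs standing in for the real open/convert/close operations (names invented)
$script:log = @()
function Open-Doc     { $script:log += "open" }
function Export-Pdf   { $script:log += "export" }
function Close-Doc    { $script:log += "close" }
function Copy-ToShare { $script:log += "copy" }   # the risky step, e.g. a network copy

try {
    Open-Doc
    Export-Pdf
}
finally {
    # Runs even if Export-Pdf throws, so no file stays open in the Inventor session
    Close-Doc
}

# Only now, with the document safely closed, perform the risky network copy
Copy-ToShare
```

With this structure, a failing copy can still fail the job, but it can never leave a file open in the Inventor session.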

We also released a new version of powerEvents. This version now fully supports change orders and custom objects, but even more importantly, it recognizes changes to the scripts on the fly. Whenever you add, remove or change a file in the powerEvents folders, all the registered events are removed and re-added, even while your Vault client is open. This way, you can make changes as you like and test them immediately, without the need to restart Vault. Also, you can now unregister events programmatically, so you may sign up for an event just temporarily and then unregister from it. There are now many more scenarios that can be accomplished.

powerJobs and powerEvents embed the new version of powerVault. By updating powerJobs or powerEvents, you will automatically get the latest version of powerVault. In case you need just powerVault, then of course you can update just that.

Posted in powerEvents, powerJobs, powerVault