Load – Enhance – Connect

Over the past years, we have developed products and technologies that make your Autodesk Data Management projects even more successful. We have touched different topics and built products, technologies and services around them. The result of our work can be summarized in the three themes Load – Enhance – Connect, and these three themes have become the core message of our new website http://www.coolorange.com. Have a look and let us know what you think.

Let’s talk about Load: Vault comes with a cool little tool called DTU (Data Transfer Utility, aka VaultBCP), which allows you to import data into Vault in bulk. DTU can be used to import small and large data sets, with files, history, items, BOMs, links, etc. It can be used to import files from the file system, to migrate from other data management systems, to merge Vaults, and the like. We’ve created a toolkit that makes it simple to generate, manipulate, merge and validate BCP packages, so that any sort of import situation can be handled in a reliable and predictable way. The bcpToolkit can already do a lot and will grow even more later this year. In addition to the tools, we also put our experience at your service, so if you are facing any Vault import, migration or merge challenge, reach out to us.

Under Enhance we have combined powerJobs and powerEvents. While powerJobs enhances your Vault job processor with ready-to-use jobs and the ability to create your own, powerEvents enhances your Vault client by giving you the ability to improve your workflows. Both products enhance your processes and make Vault behave the way that makes sense to you. As an example, you can say “release only if …” and define the logic you are looking for in powerEvents. Both powerJobs and powerEvents can be configured via the Microsoft PowerShell scripting language, so a little coding experience is enough to bring Vault to the next level.
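To give you a feel for it, here is a hypothetical sketch of such a rule in plain PowerShell. The validation logic is ordinary script and is shown against mock file objects; the commented-out registration at the end only illustrates how a rule like this would typically be hooked up in powerEvents, so treat the cmdlet and event names there as assumptions, not as the exact powerEvents API.

```powershell
# Illustrative "release only if ..." rule, tested against mock file objects.
function Test-ReleaseAllowed($file) {
    # rule: a file may only be released if its Title property is filled in
    -not [string]::IsNullOrWhiteSpace($file.Title)
}

$goodFile = [pscustomobject]@{ Name = '100083.ipt'; Title = 'Bracket' }
$badFile  = [pscustomobject]@{ Name = '100084.ipt'; Title = '' }

Test-ReleaseAllowed $goodFile   # True
Test-ReleaseAllowed $badFile    # False

# In powerEvents, such a rule would be hooked to a state-change restriction
# event, roughly like this (names are assumptions for illustration):
# Register-VaultEvent -EventName UpdateFileStates_Restrictions -Action {
#     param($files)
#     foreach ($f in $files) {
#         if (-not (Test-ReleaseAllowed $f)) {
#             Add-VaultRestriction -EntityName $f.Name -Message "Title is empty"
#         }
#     }
# }
```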

Connecting Vault with other systems is the next logical step. Once you have your data and processes under control within your engineering department, it’s time to connect with the rest of the company. The connection to ERP is the obvious step, and with powerGate many ERP systems have been connected, from SAP over Microsoft to smaller ERP systems. Meanwhile, connecting to the cloud is also a topic, which can be addressed with powerGate as well.

The new website talks more about your situations and less about product features, so we hope that you’ll get a better understanding of what the products can do for you. We are quite excited about the new website and hope you will enjoy it too.


Posted in Uncategorized | Leave a comment

#19 version available

Over the past couple of months we have been a bit quiet, and I’d like to apologise for that. We have been busy working on several topics, which we will present in the coming weeks and months.

Today we proudly announce the release of the 2019 products. Actually, as you can see from the banner picture, we call it just 19. As with past versions, the 19 products support the Autodesk 2019, 2018 and 2017 versions, so the latest and the two preceding releases. This means, for example, that you have powerJobs 19 for Vault 2019, 2018 and 2017. Regardless of which version of Vault you are on, you get the most recent coolOrange product version with all new features and enhancements.

The complete product line has been released, including powerJobs, powerVault, powerGate, dataLoader, vaultRuler, mightyBrowser and all other apps we have. The only exception is the bcpToolkit. We are working on supporting the BCP 2019 format and will release our products around DTU (Data Transfer Utility) very soon. There will be some exciting news there as well, so stay tuned.

In order to simplify and speed up the download process, we moved our products to Amazon S3 and created a dedicated domain. So, if you go to http://download.coolorange.com, you will find all the download links. There you will also find the archive of previous versions, so if you are running an older version of the Autodesk products and still need the according coolOrange product, you can find it there. As mentioned, we officially support the current release and the two preceding versions, but we understand that if you have not upgraded yet and still need to set up your machine, you need access to legacy versions. Be aware that those versions are not supported; however, they have worked well so far, so you may take the risk.

So, what’s new? Besides the support for the latest Autodesk applications, powerJobs now officially supports the newly introduced InventorServer, which does not require an Inventor license when used in combination with Vault. So, you can now create PDF, DXF, DWG and other formats via powerJobs without the need for an Inventor license.
Also, we made some important internal architectural changes, which may not be that visible to you, but are important for the ongoing technology changes. For instance, we moved to PowerShell 4 as a minimum requirement, as it brings several important benefits that the products, and finally you, can take advantage of.
One relevant change for you is that our PowerShell-based products no longer require a special shortcut for starting the ISE (script editor). You can start whichever script editor you like (PowerShell ISE, Visual Studio Code, or the like) and just load the according PowerShell module via Import-Module.
There are more changes, which you can read about on our wiki page at http://www.coolorange.com/wiki, in the according product change log section. We will talk about more enhancements in the coming weeks and months.

We wish you fun and success with the coolOrange 19 products, regardless of which Autodesk product version you run.

Posted in Uncategorized | Leave a comment

Couple of updates

Last week, we released new versions of powerVault, powerJobs and powerEvents. While the changes in powerVault are minor bug fixes (you can find the details here), powerJobs brings an interesting improvement related to the Inventor project file.

So far, powerJobs always set the project file for each job, even when the right project file was already set. This is not an issue until you hit a situation where the Inventor session used by powerJobs has an open file. As you know, setting a project file while a file is open in Inventor is not possible. Therefore, powerJobs failed to execute the job, because the attempt to set the project file failed. This is now solved: if the project file of the Inventor session is already the right one, it is simply left alone. And if the project file must be changed while a file is open, the job fails with a human-readable error message.

In order to prevent this problem in the first place, we suggest tweaking the logic in your script. Make sure that the open, convert (or whatever you do with the file) and close operations are very close together. If you need, for example, to copy the PDF file to a network share, do it after closing the file. This way, if the PDF copy to the network fails for whatever reason, the original file has already been closed. If you do the copy operation before closing the original file and the copy fails, your job fails and the close is never executed; you end up with an open file in your Inventor session. So, make sure that your job always closes the file before you do other risky actions.
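The recommended ordering can be sketched like this. The functions below are stand-ins for the real job steps (the actual job would use the according powerJobs/Inventor calls); they only record the call order, so the point, close inside a finally block and copy afterwards, is visible without an Inventor session.

```powershell
# Stand-ins for the real job steps; they only record the call order.
$callOrder = @()
function Open-SourceFile  { $script:callOrder += 'open' }
function Export-ToPdf     { $script:callOrder += 'export' }
function Close-SourceFile { $script:callOrder += 'close' }
function Copy-PdfToShare  { $script:callOrder += 'copy' }   # the risky network step

Open-SourceFile
try {
    Export-ToPdf           # convert while the file is open
}
finally {
    Close-SourceFile       # runs even if the export throws, so no file stays open
}
Copy-PdfToShare            # risky operations only after the file is closed
```

With this shape, a failing copy can no longer leave a file open in the Inventor session.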

We also released a new version of powerEvents. This version now fully supports change orders and custom objects, but even more importantly, it recognizes changes to the scripts on the fly. Whenever you add, remove or change a file in the powerEvents folders, all registered events are removed and re-registered, even while your Vault client is open. This way, you can make changes as you like and test them immediately, without restarting Vault. Also, you can now unregister events programmatically, so you can sign up for an event just temporarily and then unregister from it. This makes many new scenarios possible.

powerJobs and powerEvents embed the new version of powerVault, so by updating powerJobs or powerEvents, you automatically get the latest version of powerVault. If you only need powerVault, you can of course update just that.

Posted in powerEvents, powerJobs, powerVault | Leave a comment

Complete your Vault data

This topic came up quite frequently in the last months. You have your data in Vault, but for whatever reason, there is still other data outside of Vault that shall be added to existing Vault files, folders, items, or other objects. You have the data as CSV (or Excel), but don’t know how to import it into Vault. Well, for files you can use our dataLoader. It’s a simple and powerful tool that lets you pick a CSV file, does the mapping of properties and files, and then updates the file properties. It also keeps track of issues during the update process, so that you can repeat the operation for the problematic files at a later point in time.

While the dataLoader gives you comfort, performance and other benefits, there might be situations where you need to update just a small number of files, or other objects like folders or items.

As you know, we love scripting, so updating properties based on a CSV file is something that can be done via a PowerShell script. For example, if you want to update file properties, you can take advantage of powerVault (free) with a script like this:

Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

# an "AllProperties" condition with empty search text matches every file
$srcConds = New-Object Autodesk.Connectivity.WebServices.SrchCond[] 1
$srcConds[0] = New-Object Autodesk.Connectivity.WebServices.SrchCond
$srcConds[0].PropDefId = 0
$srcConds[0].PropTyp = "AllProperties"
$srcConds[0].SrchOper = 1
$srcConds[0].SrchTxt = ''
$srcConds[0].SrchRule = "Must"
$bookmark = ""
$searchStatus = New-Object Autodesk.Connectivity.WebServices.SrchStatus
$files = @()
# the search is paged, so keep calling until all hits have been collected
do {
	$files += $vault.DocumentService.FindFileFoldersBySearchConditions($srcConds, $null, $null, $true, $true, [ref]$bookmark, [ref]$searchStatus)
} while ($files.Count -lt $searchStatus.TotalHits)

$csv = Import-Csv "c:\temp\fileProperties.csv" -Delimiter ';'
$i = 0
foreach($row in $csv)
{
	Write-Progress -Activity "Updating files..." -CurrentOperation $row.FileName -PercentComplete ($i++ / $csv.Count * 100)
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "FileName")
		{
			$data[$col.Name] = $col.Value
		}
	}
	$file = $files | Where-Object { $_.File.Name -eq $row.FileName }
	$fullPath = $file.Folder.FullName + '/' +$file.File.Name
	$file = Update-VaultFile -File $fullPath -Properties $data
}

This is a sample CSV file for this script:

FileName;Title
100083.ipt;updated via script
100084.ipt;updated via script

This script searches for all the files in Vault, reads the CSV file, and for each line in the CSV, it updates the according properties. The script is kept pretty simple, so for instance loading all the files might not be a good idea. If you know that you just need Inventor files, it would be better to tighten the filter criteria in the search; or, if you only look for files in a given folder, you can specify this in the FindFileFoldersBySearchConditions function. If you want to know which search criteria to use, you can simply perform a search in Vault and use vapiTrace to figure out the parameters. We need to search for the files because we need the full path in order to work with the Update-VaultFile cmdlet. However, if you have the full path in your CSV, then searching for the files can be omitted.

Also, this script does not have any error handling, so if a file cannot be found, is checked out, permissions are missing, etc., the update will just fail. Obviously, the script can be enhanced to take care of these situations as well.
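A minimal pattern for that kind of error handling could look like this. The Vault update is stubbed here with a function that fails for one file, so the pattern can be shown without a Vault connection; in the real script you would wrap the actual Update-VaultFile call the same way.

```powershell
# Stub for the real Update-VaultFile call; fails for one file on purpose.
function Update-FileStub($name) {
    if ($name -eq 'broken.ipt') { throw "file is checked out" }
}

$failed = @()
foreach ($name in @('100083.ipt', 'broken.ipt', '100084.ipt')) {
    try {
        Update-FileStub $name
    }
    catch {
        # remember the file and the reason, then continue with the next row
        $failed += [pscustomobject]@{ FileName = $name; Error = $_.Exception.Message }
    }
}
# report the problematic files at the end, e.g. for a retry run
$failed | ForEach-Object { Write-Host "Failed: $($_.FileName) ($($_.Error))" }
```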

The same applies to items. Here is the script:

Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

$csv = Import-Csv "c:\temp\itemProperties.csv" -Delimiter ';'
$i = 0
foreach($row in $csv)
{
	Write-Progress -Activity "Updating items..." -CurrentOperation $row.ItemNumber -PercentComplete ($i++ / $csv.Count * 100)
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "ItemNumber")
		{
			$data[$col.Name] = $col.Value
		}
	}
	$item = Update-VaultItem -Number $row.ItemNumber -Properties $data
}

In this case, it’s even simpler: we don’t need to search for the items. If you know the number, you can update them right away. There are some things to take into account, though. Via the -Properties argument, you can only update user-defined properties. The title, for instance, is a system property, which is updated via the dedicated -Title argument.
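The CSV loop above can be extended to sort the columns accordingly. The split itself is plain PowerShell; only the commented-out Update-VaultItem call at the end needs a Vault connection, and treating Title as the only system property here is just an assumption for the example.

```powershell
# One CSV row, as Import-Csv would deliver it.
$row = [pscustomobject]@{ ItemNumber = '100083'; Title = 'Bracket'; Material = 'Steel' }

$data = @{}      # user-defined properties -> -Properties
$title = $null   # system property         -> -Title
foreach ($col in $row.PSObject.Properties) {
    switch ($col.Name) {
        'ItemNumber' { }                      # key column, not a property
        'Title'      { $title = $col.Value }  # system property, own argument
        default      { $data[$col.Name] = $col.Value }
    }
}
# Update-VaultItem -Number $row.ItemNumber -Title $title -Properties $data
```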

Another example is updating the properties of folders. Here is the script:

Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

$srcConds = New-Object Autodesk.Connectivity.WebServices.SrchCond[] 1
$srcConds[0] = New-Object Autodesk.Connectivity.WebServices.SrchCond
$srcConds[0].PropDefId = 0
$srcConds[0].PropTyp = "AllProperties"
$srcConds[0].SrchOper = 1
$srcConds[0].SrchTxt = ''
$srcConds[0].SrchRule = "Must"
$bookmark = ""
$searchStatus = New-Object Autodesk.Connectivity.WebServices.SrchStatus
$folders = @()
do {
	$folders += $vault.DocumentService.FindFoldersBySearchConditions($srcConds, $null, $null, $true, [ref]$bookmark, [ref]$searchStatus)
} while ($folders.Count -lt $searchStatus.TotalHits)

$propDefs = $vault.PropertyService.GetPropertyDefinitionsByEntityClassId("FLDR")

$csv = Import-Csv "c:\temp\folderProperties.csv" -Delimiter ';'
foreach($row in $csv)
{
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "FolderName")
		{
			$propDef = $propDefs | Where-Object { $_.DispName -eq $col.Name }
			$data[$propDef.Id] = $col.Value
		}
	}
	$propValues = New-Object Autodesk.Connectivity.WebServices.PropInstParamArray
	$propValues.Items = New-Object Autodesk.Connectivity.WebServices.PropInstParam[] $data.Count
	$i = 0
	foreach($d in $data.GetEnumerator()) {
		$propValues.Items[$i] = New-Object Autodesk.Connectivity.WebServices.PropInstParam -Property @{PropDefId = $d.Key;Val = $d.Value}
		$i++
	}
	$folder = $folders | Where-Object { $_.Name -eq $row.FolderName }
	$vault.DocumentServiceExtensions.UpdateFolderProperties(@($folder.Id),@($propValues))
}

Here, as for the files, we search for all folders (you can refine the search), load all the rows from the CSV, and then perform the property update. Unfortunately, powerVault does not offer an Update-VaultFolder yet, so we have to do it via the regular Vault API, which makes the script longer and more cumbersome.
In this case, we have to get all the property definitions first, as we cannot work with the property name; we need the Vault-internal property ID. Then we have to build the PropInstParamArray argument for the UpdateFolderProperties API function. So, we cycle through all the CSV columns, except for FolderName, and build our list of property IDs and values.

The sample for the folder property update can also be applied to custom objects and change orders. Obviously, the function names and arguments change, but the logic remains the same. Via vapiTrace, you can work in Vault and see which functions and arguments are needed for performing the search and the property update.

Bottom line: you have plenty of options for doing a mass update of your data. The coolOrange dataLoader is definitely the first choice when it comes to updating file properties in a secure and reliable way. However, if the amount of data is small, or you want to update other objects, then a few lines of scripting will do too.

We hope you enjoy!!

Posted in Migration, PowerShell, powerVault, Vault API | Leave a comment

Fixing the BOM blob

With Vault 2017, Autodesk introduced a new job type: autodesk.vault.extractbom.inventor. This job fixes the file BOM blob. The file BOM blob is an object stored in the Vault database on each CAD file record. It is created by the Vault CAD add-in, which saves the CAD BOM information on each check-in into Vault. When you perform an “Assign Item”, the file BOM blob is used for creating the items and the according bill of materials. If the file BOM blob is missing or corrupt, the assign-item will create just the master item, with either no BOM or a wrong one.

This is where the new job comes in. You can queue a job for the Inventor files (iam, ipt) with issues, and the job will recreate the file BOM blob object for such files. In order to update the BOM blob object in the Vault database, a new API has been introduced: SetBomByFileId. This API allows updating the file BOM blob without creating a new version.

To be fair, the BOM blob is usually OK, so you don’t need this job in day-to-day situations. However, if you imported data/files from a legacy system via BCP and did not create the BOM blobs, you will find that assign-item does not work, and you would have to open the assemblies and components one by one and save them back into Vault in order to fix the problem.

The question now is how to queue this new job. You can use the LifecycleEventEditor and queue the job on a lifecycle change, or with Vault 2018 you can set this up in the lifecycle transition settings. I’d like to show you how to queue the job via a VDS context menu or with a standalone PowerShell script. This gives you the ability to queue the job for a selection of files, or for the complete list of Inventor files.

Let’s create a custom menu item via Vault Data Standard. We start by adding the menu entry in the MenuDefinitions.xml. Here is the menu item:

<cOqueueExtractBomJob Label="Fix BOM" Description="Fix BOM" Hint="Fix BOM" PSFile="queueExtractBomJob.ps1" Image="coolOrange.ico" ToolbarPaintStyle="TextAndGlyph" NavigationTypes="File" MultiSelectEnabled="True" />


The according PowerShell script, which goes into C:\ProgramData\Autodesk\Vault 2017\Extensions\DataStandard\Vault\addinVault\Menus\queueExtractBomJob.ps1, looks like this:

$files=$vaultContext.CurrentSelectionSet
foreach($file in $files)
{
  Add-VaultJob -Name "autodesk.vault.extractbom.inventor" -Description "Fix BOM for '$($file.Label)'" -Priority LOW -Parameters @{"EntityClassId"="File";"FileMasterId"=$file.Id}
}

As you can see, it takes the selection and adds a job for each file. Wait, what is Add-VaultJob? It’s a new cmdlet that has been introduced in powerVault just a few days ago. As the name says, it allows you to queue a job in a super simple way: just pass the name of the job, the description, the priority and the parameters, and you are done! So, restart Vault, select some Inventor files, and trigger the function. You will see the jobs being added to the queue.

OK, let’s now create the standalone script. We will use powerVault again, this time for queuing a large number of jobs. We perform a search via the Vault API and queue a job for each result. The script looks like this:

Import-Module powerVault
Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""
$srchCond = New-Object Autodesk.Connectivity.Webservices.SrchCond
$srchCond.PropDefId = 33
$srchCond.PropTyp = "SingleProperty"
$srchCond.SrchOper = 1
$srchCond.SrchRule = "Must"
$srchCond.SrchTxt = "ipt"

$bookmark = ""
$srchStatus = New-Object Autodesk.Connectivity.Webservices.SrchStatus
$counter = 0
while($bookmark -eq "" -or $counter -lt $srchStatus.TotalHits)
{
  $files = $vault.DocumentService.FindFilesBySearchConditions(@($srchCond),$null,$null, $true, $true, [ref]$bookmark, [ref]$srchStatus)
  foreach($file in $files)
  {
    $job = Add-VaultJob -Name "autodesk.vault.extractbom.inventor" -Description "Fix BOM for '$($file.Name)'" -Priority LOW -Parameters @{"EntityClassId"="File";"FileMasterId"=$file.MasterId}
    $counter++
  }
}

It imports powerVault and connects to Vault; adapt the connection information as needed. It then defines a search condition where the file extension contains “ipt”. In my case, the PropDefId (the internal property definition ID) for File Extension is 33; in your Vault, this might be different. In order to find the PropDefId, you can use the Vault API or, like I did, perform a search in Vault with your preferred criteria and use vapiTrace to spy on the arguments.

Back to the script: we then make a loop and search over and over again for files. The loop is necessary because the search returns at most 100 records at a time, so we have to cycle through it several times in order to get all the hits. Within the loop, we queue the jobs like before.

In the code sample above, we perform the search for IPT files. In order to fix all the BOM issues, you will have to queue the job for IAM files as well. The BOM blob is quite complex and relies on the child components, so the children must also have a correct BOM blob.
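One way to cover both extensions is to wrap the search-and-queue part in a small helper and call it once per extension. Queue-ExtractBomJobs below is a hypothetical helper standing in for the paged search plus the Add-VaultJob loop from the script above; here it only records which extension it was called for, so the pattern can be shown without a Vault connection.

```powershell
$queuedFor = @()
function Queue-ExtractBomJobs($extension) {
    # real version: set $srchCond.SrchTxt = $extension, run the paged
    # FindFilesBySearchConditions loop and call Add-VaultJob for each file
    $script:queuedFor += $extension
}

# components first, then assemblies; the children need a valid BOM blob too
foreach ($ext in @('ipt', 'iam')) {
    Queue-ExtractBomJobs $ext
}
```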

So, now you can import your files into Vault, maybe via BCP, without caring about the file BOM blob, and use the Autodesk job to fix it. Via powerVault, this becomes super simple.


Posted in Data Standard, PowerShell, powerVault, Vault BCP | 2 Comments

VDS Setup

Wouldn’t it be cool to create a setup for your own Data Standard customizations? Yes, a setup with a version number, secure to install, easy to deploy, and simple to upgrade? It’s easier than you think. For this purpose, we use the open source WiX toolset. We need a batch file like this:

"c:\Program Files (x86)\WiX Toolset v3.11\bin\heat.exe" dir ..\VDS -cg VDSGroup -out VDSGroup.wxs -gg -srd -dr DataStandardFolder
"c:\Program Files (x86)\WiX Toolset v3.11\bin\candle.exe" VDSGroup.wxs
"c:\Program Files (x86)\WiX Toolset v3.11\bin\candle.exe" Customer1.wxs -dPVersion=1.0.1
"c:\Program Files (x86)\WiX Toolset v3.11\bin\light.exe" Customer1.wixobj VDSGroup.wixobj -out Customer1.msi -b ..\VDS
del VDSGroup.wxs
del *.wix*
pause

and a master XML instruction file like this:

<Wix xmlns='http://schemas.microsoft.com/wix/2006/wi'>
  <?define UpgradeCode="041e09fb-6cde-431f-8955-d9a019cb8a35"?>
  <?define ComponentGUID="041e09fb-6cde-431f-8955-d9a019cb8a35"?>
  <Product Name='Customer1 VDS Setup' Manufacturer='coolOrange' Id='*' UpgradeCode='$(var.UpgradeCode)' Version='$(var.PVersion)' Language='1033' Codepage='1252'>
    <Package Id='*' Description="Customer1 VDS Installer" Manufacturer='coolOrange' Keywords='Installer' InstallerVersion='100' Languages='1033' Compressed='yes' SummaryCodepage='1252'/>
    <Upgrade Id='$(var.UpgradeCode)'>
      <UpgradeVersion OnlyDetect='no' Property='PREVIOUSFOUND' Minimum='1.0.0' IncludeMinimum='yes' Maximum='$(var.PVersion)' IncludeMaximum='no'/>
    </Upgrade>
    <Media Id='1' Cabinet='Sample.cab' EmbedCab='yes' DiskPrompt='CD-ROM #1'/>
    <Property Id='DiskPrompt' Value="Customer VDS Installation [1]"/>
    <Directory Id='TARGETDIR' Name='SourceDir'>
      <Directory Id='CommonAppDataFolder' Name='ProgramData'>
        <Directory Id='AutodeskFolder' Name='Autodesk'>
          <Directory Id='VaultFolder' Name='Vault 2017'>
            <Directory Id='ExtensionFolder' Name='Extensions'>
              <Directory Id='DataStandardFolder' Name='DataStandard'>
                <Component Id='DataStandardFolderComponent' Guid='$(var.ComponentGUID)'>
                  <CreateFolder/>
                </Component>
              </Directory>
            </Directory>
          </Directory>
        </Directory>
      </Directory>
    </Directory>
    <Feature Id='Complete' Level='1'>
      <ComponentGroupRef Id='VDSGroup'/>
      <ComponentRef Id='DataStandardFolderComponent'/>
    </Feature>
    <InstallExecuteSequence>
      <RemoveExistingProducts Before="InstallInitialize"/>
    </InstallExecuteSequence>
  </Product>
</Wix>

OK, let’s take a step back. You diligently store your Data Standard customizations in a dedicated folder, right? That folder has the same structure as the Data Standard folder.

Therefore, you can just take what you have in your clean customization folder and compile it into a simple setup. For this purpose, we use three tools from the WiX toolset: heat.exe for gathering the folder/file information, candle.exe for compiling that information into an object (binary) file, and light.exe for linking the object files and the actual files into an MSI setup.

In my example, I have a Setup and a VDS folder under Customer1. The VDS folder contains the sub-folders with the according VDS files; the Setup folder contains the setup files.

As you can see, we have the make.bat (see the first code block above) and the Customer1.wxs (see the XML code above). Now, if you install the WiX toolset and create the same structure as in this example, you can run make.bat, and you will get a Customer1.msi that you can install.

OK, let’s add some more detail, starting with the batch file, which we call make.bat. As you can see in the code, the batch file first collects all the folders and files within the VDS folder via the tool heat.exe:

heat.exe dir ..\VDS -cg VDSGroup -out VDSGroup.wxs -gg -srd -dr DataStandardFolder

The first argument, dir, tells heat.exe where to start collecting information: in our case, one folder up (..\) and then into the VDS folder. The -cg argument gives a kind of title to the collected information; we call it VDSGroup. The result of the gathering is saved into the file VDSGroup.wxs. The last argument, -dr DataStandardFolder, is a placeholder we will use later.

The next line is

candle.exe VDSGroup.wxs

which compiles the information generated by heat.exe in the line before. The next one is again candle.exe, but with our master file, which we will talk about in a minute:

candle.exe Customer1.wxs -dPVersion=1.0.1

What you see here is that we pass the version for our setup as an argument. So, just increase the version, and you can update your previously installed setup: the previous version will be removed and replaced with the new one.

Finally, we have

light.exe Customer1.wixobj VDSGroup.wixobj -out Customer1.msi -b ..\VDS

which takes the compiled Customer1.wixobj and VDSGroup.wixobj files and links them together into an MSI setup. light.exe also packs the original files into the setup; therefore, we need to tell it where the files are (-b ..\VDS), as we did for heat.exe in the first step.

Let’s now have a look at the Customer1.wxs (the XML code above). In this file, we have information about the setup name and the author, so just go ahead and replace Customer1 with your customer or project name, and coolOrange with your company name. In the very first two lines, you’ll see

<?define UpgradeCode="041e09fb-6cde-431f-8955-d9a019cb8a35"?>
<?define ComponentGUID="041e09fb-6cde-431f-8955-d9a019cb8a35"?>

The sequence of numbers here is called a GUID and uniquely identifies this setup for future updates. You need your own GUIDs, so please go to http://www.guidgenerator.com and generate new ones for your setup. DO NOT USE THESE GUIDs!!! Have you seen all the exclamation marks? DON’T USE THESE GUIDs! MAKE YOUR OWN, and replace the ones in the file with yours. Get new GUIDs for each new project setup you make.
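By the way, you don’t even need a website for this: any PowerShell console can generate GUIDs via .NET, one per placeholder in the .wxs file.

```powershell
# Generate two fresh GUIDs, one for the UpgradeCode and one for the ComponentGUID.
$upgradeCode   = [guid]::NewGuid().ToString()
$componentGuid = [guid]::NewGuid().ToString()
Write-Host "UpgradeCode:   $upgradeCode"
Write-Host "ComponentGUID: $componentGuid"
```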

I’m skipping most of the elements, as they are fine the way they are, so let’s jump to the Directory elements.

<Directory Id='TARGETDIR' Name='SourceDir'>
  <Directory Id='CommonAppDataFolder' Name='ProgramData'>
    <Directory Id='AutodeskFolder' Name='Autodesk'>
      <Directory Id='VaultFolder' Name='Vault 2017'>
        <Directory Id='ExtensionFolder' Name='Extensions'>
          <Directory Id='DataStandardFolder' Name='DataStandard'>
            ......

This section defines the folder structure into which the setup will install the files. Each Directory entry must have its own unique ID. The second Directory element points to the ProgramData folder, and all the following Directory entries are the physical sub-folders. Did you notice the almost-last Directory entry? It’s called DataStandardFolder. If you remember, in the batch file the last argument of heat.exe is also DataStandardFolder. This way, the data collected by heat.exe is attached to our folder structure at this point: the sub-folders collected by heat.exe become the sub-folders of our folder structure in the Customer1.wxs.

So, this Customer1.wxs already covers what is needed to create a VDS setup. Just check that your folder structure points to the appropriate Vault version (2016, 2017, 2018, etc.), set your GUIDs, change the description, and you are good to go!

Every time you want to create a new version of the setup, just increase the version in the batch file and run make.bat. You will get a new setup with the according version number. Give it a few tries: you will see how installing a new version of the setup removes the old files and installs the new ones.

WiX is highly capable, so if you want to create fancier setups, you can walk through the tutorial posted here. This blog post covers just the basics, but you can go further. For example, we create client-side setups that contain VDS files, powerEvents files and other client-side stuff, and we also create setups for powerJobs, applying the same logic. If you have different sources that you want to combine into one setup, you can run heat.exe several times, compile the .wxs files via candle.exe, and then combine everything via light.exe by passing all the object files and adding all the source folders via the -b switch (-b source1 -b source2 -b source3). Good luck!


Posted in Data Standard | 2 Comments

Tracing the Vault API

Picasso once said, “Good artists copy; great artists steal.” You’d like to write some Vault API code, in .NET or PowerShell, and are fighting with the right sequence of calls and the appropriate arguments? Why not click through the sequence in the Vault client and copy what the Vault client does?

The Vault client communicates with the Vault server via HTTP (SOAP web services). Tools like Fiddler can monitor the HTTP traffic, so you can spy on what the Vault client says to the Vault server and learn from it. Doug Redmond wrote this blog post about it years ago. However, the SOAP communication is quite verbose, and because it is not plain XML (it’s multipart), Fiddler cannot show you the content in a nice, readable way.

The good news: Fiddler can be extended. We did just that and created vapiTrace (Vault API trace).

Just copy the vapiTrace extension to the C:\Program Files (x86)\Fiddler2\Inspectors folder. Start Fiddler, and you should see the vapiTrace tabs. Now, when you log into Vault (don’t use localhost), you’ll start seeing the client-server traffic, and by clicking on the individual communication lines, the vapiTrace tabs will show the request and the response in a simple, readable way.

Let’s say you want to know how to create a new Vault folder via the API. Just go into Vault and create the folder manually, then pick the according command in Fiddler and see which Vault API has been called, with which arguments. The best part is that you can now delete your folder in Vault and try to create it via the API, using the same function with the same arguments. Just keep in mind that only arguments with values are transferred: it might be that your Vault API function requires 5 arguments, but in Fiddler you only see 3. This means the two missing arguments were null.

Now, Fiddler traces every HTTP connection, so you will have a lot of noise. Activate the filter by going to the Filters tab, and set it to show only URLs that contain “AutodeskDM”.

So, now you can experiment with the Vault API: first perform your action in Vault by hand, then look in the Fiddler trace at how Vault does it, and simply do the same.

Posted in Vault API | 4 Comments