Couple of updates

Last week, we released new versions of powerVault, powerJobs, and powerEvents. The powerVault changes are smaller bug fixes (you can find the details here), while powerJobs brings an interesting improvement related to the Inventor project file.

So far, powerJobs has always set the project file for each job, even when it was already set correctly. This is not an issue until the Inventor session used by powerJobs has a file open. As you know, setting a project file while a file is open in Inventor is not possible. Therefore, powerJobs failed to execute the job, because the attempt to set the project file failed. This is now solved: if the project file of the Inventor session is already the right one, there is no need to change it. And if the project file must be changed while a file is open, the job fails with a human-readable error message.

To prevent this problem in the first place, we suggest tweaking the logic in your script: make sure that the open, convert (or whatever you do with the file), and close operations stay close together. If, for example, you need to copy the PDF file to a network share, do it after the file has been closed. This way, if the PDF copy to the network fails for whatever reason, the original file has already been closed. If you copy before closing and the copy operation fails, your job fails and the close is never executed, leaving an open file in your Inventor session. In short, make sure your job always closes the file before performing other risky actions.
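
Here is a minimal sketch of that structure, using the Open-Document / Export-Document / Close-Document cmdlets as found in the powerJobs sample jobs ($file comes from the job context; the local path and the network share are placeholders):

$localPDFfileLocation = "C:\Temp\$($file._Name).pdf"
# keep open, convert and close tightly together
$openResult = Open-Document -LocalFile $file.LocalPath
$exportResult = Export-Document -Format 'PDF' -To $localPDFfileLocation
$closeResult = Close-Document
# risky operations, like the copy to a network share, come only AFTER the close;
# if the copy fails, the Inventor session is already clean
Copy-Item -Path $localPDFfileLocation -Destination '\\server\share\PDFs\'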

We also released a new version of powerEvents. This version fully supports change orders and custom objects, but even more important, it recognizes script changes on the fly. Whenever you add, remove, or change a file in the powerEvents folders, all the registered events are removed and re-added, even while your Vault client is open. This way, you can make changes as you want and test them immediately, without restarting Vault. Also, you can now unregister events programmatically, so you may sign up for an event just temporarily and then unregister from it. Many new scenarios can be accomplished this way.

powerJobs and powerEvents embed the new version of powerVault, so by updating powerJobs or powerEvents, you automatically get the latest powerVault as well. If you just need powerVault, then of course you can update only that.

Posted in powerEvents, powerJobs, powerVault

Complete your Vault data

This topic came up quite frequently in the last months. You have your data in Vault, but for whatever reason, there is still other data outside of Vault that should be added to existing Vault files, folders, items, or other objects. You have the data as CSV (or Excel), but don't know how to import it into Vault. Well, for files you can use our dataLoader. It's a simple and powerful tool that lets you pick a CSV file, does the mapping of properties and files, and then updates the file properties. It also keeps track of issues during the update process, so that you can repeat the operation for the problematic files at a later point in time.

While the dataLoader gives you comfort, performance, and other benefits, there might be situations where you need to update just a small number of files, or other objects like folders or items.

As you know, we love scripting, so updating properties based on a CSV file is something that can be done via a PowerShell script. As an example, if you want to update file properties, you can take advantage of powerVault (free) with a script like this:

Import-Module powerVault
Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

# search condition that matches all files (empty text over all properties)
$srcConds = New-Object Autodesk.Connectivity.WebServices.SrchCond[] 1
$srcConds[0] = New-Object Autodesk.Connectivity.WebServices.SrchCond
$srcConds[0].PropDefId = 0
$srcConds[0].PropTyp = "AllProperties"
$srcConds[0].SrchOper = 1
$srcConds[0].SrchTxt = ''
$srcConds[0].SrchRule = "Must"
$bookmark = ""
$searchStatus = New-Object Autodesk.Connectivity.WebServices.SrchStatus
$files = @()
# the search returns paged results, so keep searching until all hits are collected
do {
	$files += $vault.DocumentService.FindFileFoldersBySearchConditions($srcConds, $null, $null, $true, $true, [ref]$bookmark, [ref]$searchStatus)
} while ($files.Count -lt $searchStatus.TotalHits)

$csv = Import-Csv "c:\temp\fileProperties.csv" -Delimiter ';'
$i = 0
foreach($row in $csv)
{
	Write-Progress -Activity "Updating files..." -CurrentOperation $row.FileName -PercentComplete ($i++ / $csv.Count * 100)
	# collect all CSV columns except FileName as property name/value pairs
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "FileName")
		{
			$data[$col.Name] = $col.Value
		}
	}
	$file = $files | Where-Object { $_.File.Name -eq $row.FileName }
	$fullPath = $file.Folder.FullName + '/' + $file.File.Name
	$file = Update-VaultFile -File $fullPath -Properties $data
}

This is a sample CSV file for this script:

FileName;Title
100083.ipt;updated via script
100084.ipt;updated via script

This script searches for all the files in Vault, reads the CSV file, and for each line in the CSV updates the according properties. The script is kept pretty simple, so for instance loading all the files might not be a good idea. If you know that you just need Inventor files, it would be good to refine the search criteria; or if you are just looking for files in a given folder, you can specify this in the FindFileFoldersBySearchConditions function. If you'd like to know which search criteria to use, you can simply perform a search in Vault and use vapiTrace to figure out the parameters. We need to search for the files because we need the full path in order to work with the Update-VaultFile cmdlet. However, if you have the full path in your CSV, then searching for the files can be omitted.
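
For instance, with a hypothetical CSV layout of FullPath;Title, the whole search block could be dropped:

$csv = Import-Csv "c:\temp\fileProperties.csv" -Delimiter ';'
foreach($row in $csv)
{
	# the full Vault path comes straight from the CSV, no search needed
	$file = Update-VaultFile -File $row.FullPath -Properties @{"Title" = $row.Title}
}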

Also, this script does not have any error handling, so if a file cannot be found, is checked out, or permissions are missing, the update will just fail. Obviously, the script can be enhanced to take care of these situations as well.

The same applies to items. Here is the script:

Import-Module powerVault
Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

$csv = Import-Csv "c:\temp\itemProperties.csv" -Delimiter ';'
$i = 0
foreach($row in $csv)
{
	Write-Progress -Activity "Updating items..." -CurrentOperation $row.ItemNumber -PercentComplete ($i++ / $csv.Count * 100)
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "ItemNumber")
		{
			$data[$col.Name] = $col.Value
		}
	}
	$item = Update-VaultItem -Number $row.ItemNumber -Properties $data
}

In this case, it's even simpler: we don't need to search for the items. If you know the number, you can update them right away. There is one thing to take into account: you can only update user-defined properties via the -Properties argument. The Title, for instance, is a system property, which is updated via the dedicated -Title argument.
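
As a quick example (the item number and the Designer property are hypothetical):

$item = Update-VaultItem -Number "100083" -Title "updated via script" -Properties @{"Designer" = "Dan"}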

Another example is updating the properties of folders. Here is the script:

Import-Module powerVault
Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

$srcConds = New-Object Autodesk.Connectivity.WebServices.SrchCond[] 1
$srcConds[0] = New-Object Autodesk.Connectivity.WebServices.SrchCond
$srcConds[0].PropDefId = 0
$srcConds[0].PropTyp = "AllProperties"
$srcConds[0].SrchOper = 1
$srcConds[0].SrchTxt = ''
$srcConds[0].SrchRule = "Must"
$bookmark = ""
$searchStatus = New-Object Autodesk.Connectivity.WebServices.SrchStatus
$folders = @()
# again, cycle until all paged results are collected
do {
	$folders += $vault.DocumentService.FindFoldersBySearchConditions($srcConds, $null, $null, $true, [ref]$bookmark, [ref]$searchStatus)
} while ($folders.Count -lt $searchStatus.TotalHits)

# folder property definitions: we need the internal IDs, not just the display names
$propDefs = $vault.PropertyService.GetPropertyDefinitionsByEntityClassId("FLDR")

$csv = Import-Csv "c:\temp\folderProperties.csv" -Delimiter ';'
foreach($row in $csv)
{
	# map the display names from the CSV header to Vault internal property IDs
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "FolderName")
		{
			$propDef = $propDefs | Where-Object { $_.DispName -eq $col.Name }
			$data[$propDef.Id] = $col.Value
		}
	}
	# build the PropInstParamArray argument required by UpdateFolderProperties
	$propValues = New-Object Autodesk.Connectivity.WebServices.PropInstParamArray
	$propValues.Items = New-Object Autodesk.Connectivity.WebServices.PropInstParam[] $data.Count
	$i = 0
	foreach($d in $data.GetEnumerator()) {
		$propValues.Items[$i] = New-Object Autodesk.Connectivity.WebServices.PropInstParam -Property @{PropDefId = $d.Key;Val = $d.Value}
		$i++
	}
	$folder = $folders | Where-Object { $_.Name -eq $row.FolderName }
	$vault.DocumentServiceExtensions.UpdateFolderProperties(@($folder.Id),@($propValues))
}

Here, as for the files, we search for all folders (you can refine the search), load all the rows from the CSV, and then perform the property update. Unfortunately, powerVault does not offer an Update-VaultFolder yet, so we have to do it via the regular Vault API, which makes the script longer and more cumbersome.
In this case, we have to get all the property definitions first, as we cannot work with the property name; we need the Vault internal property ID. Then we have to build the PropInstParamArray argument for the UpdateFolderProperties API function. So, we cycle through all the CSV columns except FolderName and build our list of property IDs and values.

The sample for the folder property update can also be applied to custom objects and change orders. Obviously, the function names and arguments will change, but the logic remains the same. Via vapiTrace, you can work in Vault and see which functions and arguments are needed for performing the search and the property update.

Bottom line: you have plenty of options for doing a mass update of your data. The coolOrange dataLoader is definitely the first choice when it comes to updating file properties in a secure and reliable way. However, if the amount of data is small, or you want to update other objects, then a few lines of scripting will do too.

We hope you enjoy!!

Posted in Migration, PowerShell, powerVault, Vault API

Fixing the BOM blob

With Vault 2017, Autodesk introduced a new job type: autodesk.vault.extractbom.inventor. This job fixes the file BOM blob. The file BOM blob is an object stored in the Vault database on each CAD file record. It is created by the Vault CAD add-in, which saves the CAD BOM information on each check-in into Vault. When you perform an "Assign Item", the file BOM blob is used for creating the items and the according bill of materials. If the file BOM blob is missing or corrupt, the assign-item will create just the master item, with either no BOM or a wrong BOM.

This is where the new job comes in. You can queue a job for the Inventor files (iam, ipt) with issues, and the job will recreate the file BOM blob object for such files. In order to update the BOM blob object in the Vault database, a new API has been introduced: SetBomByFileId. This API allows updating the file BOM blob without creating a new version.

To be fair, the BOM blob is usually OK, so you don't need this job in day-to-day situations. However, if you imported data/files from a legacy system via BCP and did not create the BOM blobs, you will find that assign-item does not work, and you would have to open the assemblies and components one by one and save them back into Vault in order to fix the problem.

The question now is how to queue this new job. You can use the LifecycleEventEditor and queue the job on a lifecycle change, or with Vault 2018 you can set this up in the lifecycle transition settings. I'd like to show you how to queue the job via a VDS context menu or with a stand-alone PowerShell script. This gives you the ability to queue the job for a selection of files, or for the complete list of Inventor files.

Let's create a custom menu item via Vault Data Standard. We start by adding the menu entry in the MenuDefinitions.xml. Here is the menu item:

<cOqueueExtractBomJob Label="Fix BOM" Description="Fix BOM" Hint="Fix BOM" PSFile="queueExtractBomJob.ps1" Image="coolOrange.ico" ToolbarPaintStyle="TextAndGlyph" NavigationTypes="File" MultiSelectEnabled="True" />

The according PowerShell script, which goes into C:\ProgramData\Autodesk\Vault 2017\Extensions\DataStandard\Vault\addinVault\Menus\queueExtractBomJob.ps1, looks like this:

$files=$vaultContext.CurrentSelectionSet
foreach($file in $files)
{
  Add-VaultJob -Name "autodesk.vault.extractbom.inventor" -Description "Fix BOM for '$($file.Label)'" -Priority LOW -Parameters @{"EntityClassId"="File";"FileMasterId"=$file.Id}
}

As you can see, it takes the selection, and for each file it adds a job. Wait, what is Add-VaultJob??? It's a new cmdlet that has been introduced in powerVault just a few days ago. As the name says, it allows you to queue a job in a super simple way: just pass the name of the job, the description, the priority, and the parameters, and you are done! So, restart Vault, select some Inventor files, and trigger the function. You will see that jobs have been added to the queue.

OK, let's now create the stand-alone script. We will use powerVault again, this time for queuing a large number of jobs. We will perform a search via the Vault API and queue a job for each result. The script looks like this:

Import-Module powerVault
Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""
# search condition: property 33 ("File Extension" in this Vault) contains "ipt"
$srchCond = New-Object Autodesk.Connectivity.WebServices.SrchCond
$srchCond.PropDefId = 33
$srchCond.PropTyp = "SingleProperty"
$srchCond.SrchOper = 1
$srchCond.SrchRule = "Must"
$srchCond.SrchTxt = "ipt"

$bookmark = ""
$srchStatus = New-Object Autodesk.Connectivity.WebServices.SrchStatus
$counter = 0
while($bookmark -eq "" -or $counter -lt $srchStatus.TotalHits)
{
  $files = $vault.DocumentService.FindFilesBySearchConditions(@($srchCond),$null,$null, $true, $true, [ref]$bookmark, [ref]$srchStatus)
  foreach($file in $files)
  {
    $job = Add-VaultJob -Name "autodesk.vault.extractbom.inventor" -Description "Fix BOM for '$($file.Name)'" -Priority LOW -Parameters @{"EntityClassId"="File";"FileMasterId"=$file.MasterId}
    $counter++
  }
}

It imports powerVault and connects to Vault; adapt the connection information as needed. It then defines a search condition where the file extension contains ipt. In my case, the PropDefId (the internal property definition ID) for File Extension is 33. In your Vault, this might be different. In order to find the PropDefId, you can use the Vault API or, like I did, perform a search in Vault with your preferred criteria and use vapiTrace to spy on the arguments.
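
If you prefer the API route, a hedged way to look it up is via GetPropertyDefinitionsByEntityClassId, used the same way as in the folder script above ("FILE" is the entity class ID for files; the display name "File Extension" assumes an English Vault):

$propDefs = $vault.PropertyService.GetPropertyDefinitionsByEntityClassId("FILE")
$extDef = $propDefs | Where-Object { $_.DispName -eq "File Extension" }
# $extDef.Id is the value to use for $srchCond.PropDefId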

Back in the script, we then loop and search over and over again for the files. The loop is necessary, as the search always returns just 100 records at a time, so we have to cycle through several times in order to get all hits. Within the loop, we queue the job like before.

In the code sample above, we perform the search for IPTs. In order to fix all the BOM issues, you will have to queue the job for IAMs and IPTs, as shown below. The BOM blob is quite complex and relies on the child components, so the children must also have a correct BOM blob.
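
For example, after the IPT run, you can run the same script a second time with only the condition text changed:

$srchCond.SrchTxt = "iam"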

So, now you can import your files into Vault, maybe via BCP, without caring about the file BOM blob, and use the Autodesk job for fixing it afterwards. Via powerVault, this becomes super simple.


Posted in Data Standard, PowerShell, powerVault, Vault BCP

VDS Setup

Wouldn't it be cool to create a setup for your own Data Standard customizations? A setup with a version, safe to install, easy to deploy, and simple to upgrade? It's easier than you think. For this purpose, we use the open-source WiX toolset. We need a batch file like this:

"c:\Program Files (x86)\WiX Toolset v3.11\bin\heat.exe" dir ..\VDS -cg VDSGroup -out VDSGroup.wxs -gg -srd -dr DataStandardFolder
"c:\Program Files (x86)\WiX Toolset v3.11\bin\candle.exe" VDSGroup.wxs
"c:\Program Files (x86)\WiX Toolset v3.11\bin\candle.exe" Customer1.wxs -dPVersion=1.0.1
"c:\Program Files (x86)\WiX Toolset v3.11\bin\light.exe" Customer1.wixobj VDSGroup.wixobj -out Customer1.msi -b ..\VDS
del VDSGroup.wxs
del *.wix*
pause

and a master XML instruction file like this:

<Wix xmlns='http://schemas.microsoft.com/wix/2006/wi'>
  <?define UpgradeCode="041e09fb-6cde-431f-8955-d9a019cb8a35"?>
  <?define ComponentGUID="041e09fb-6cde-431f-8955-d9a019cb8a35"?>
  <Product Name='Customer1 VDS Setup' Manufacturer='coolOrange' Id='*' UpgradeCode='$(var.UpgradeCode)' Version='$(var.PVersion)' Language='1033' Codepage='1252'>
    <Package Id='*' Description="Customer1 VDS Installer" Manufacturer='coolOrange' Keywords='Installer' InstallerVersion='100' Languages='1033' Compressed='yes' SummaryCodepage='1252'/>
    <Upgrade Id='$(var.UpgradeCode)'>
      <UpgradeVersion OnlyDetect='no' Property='PREVIOUSFOUND' Minimum='1.0.0' IncludeMinimum='yes' Maximum='$(var.PVersion)' IncludeMaximum='no'/>
    </Upgrade>
    <Media Id='1' Cabinet='Sample.cab' EmbedCab='yes' DiskPrompt='CD-ROM #1'/>
    <Property Id='DiskPrompt' Value="Customer VDS Installation [1]"/>
    <Directory Id='TARGETDIR' Name='SourceDir'>
      <Directory Id='CommonAppDataFolder' Name='ProgramData'>
        <Directory Id='AutodeskFolder' Name='Autodesk'>
          <Directory Id='VaultFolder' Name='Vault 2017'>
            <Directory Id='ExtensionFolder' Name='Extensions'>
              <Directory Id='DataStandardFolder' Name='DataStandard'>
                <Component Id='DataStandardFolderComponent' Guid='$(var.ComponentGUID)'>
                  <CreateFolder/>
                </Component>
              </Directory>
            </Directory>
          </Directory>
        </Directory>
      </Directory>
    </Directory>
    <Feature Id='Complete' Level='1'>
      <ComponentGroupRef Id='VDSGroup'/>
      <ComponentRef Id='DataStandardFolderComponent'/>
    </Feature>
    <InstallExecuteSequence>
      <RemoveExistingProducts Before="InstallInitialize"/>
    </InstallExecuteSequence>
  </Product>
</Wix>

OK, let's take a step back. You diligently store your Data Standard customizations in a dedicated folder, right? That folder has the same structure as the Data Standard folder, with the according sub-folders and files.

Therefore, you can just take what you have in your clean customization folder and compile it into a simple setup. For this purpose, we use three tools from the WiX toolset: heat.exe for gathering the folder/file information, candle.exe for compiling that information into an object (binary) file, and light.exe for linking the object files and the real files into an MSI setup.

In my example, I have a Setup and a VDS folder under Customer1. The VDS folder contains the sub-folders with the according VDS files; the Setup folder contains the setup files.

As you can see, we have the make.bat (the first code block above) and the Customer1.wxs (the XML code above). Now, if you install the WiX toolset and create the same structure as in this example, you can run the make.bat and you will get a Customer1.msi that you can install.

OK, let's add some more detail here, starting with the batch file, which we call make.bat. As you can see in the code, it first collects all the folders and files within the VDS folder via the tool heat.exe:

heat.exe dir ..\VDS -cg VDSGroup -out VDSGroup.wxs -gg -srd -dr DataStandardFolder

The first arguments, dir ..\VDS, tell heat.exe to harvest a directory and where to start collecting information: one folder up (..\) and then in the VDS folder. The -cg argument gives a kind of title (a component group name) to the collected information; we call it VDSGroup. The result of the gathering is saved into the file VDSGroup.wxs (-out). The last argument (-dr DataStandardFolder) is a placeholder we will use later.

The next line is

candle.exe VDSGroup.wxs

which compiles the information generated by heat.exe in the line before. The next one is again candle.exe, but with our master file, which we will talk about in a minute:

candle.exe Customer1.wxs -dPVersion=1.0.1

What you see here is that we pass the version for our setup as an argument. So, just increase the version, and you can update your previous setup: the previous version will be removed and replaced with the new one.

Finally, we have

light.exe Customer1.wixobj VDSGroup.wixobj -out Customer1.msi -b ..\VDS

which takes the compiled Customer1.wixobj and VDSGroup.wixobj files and links them together into an MSI setup. light.exe also packs the original files into the setup; therefore, we need to tell it where the files are (-b ..\VDS), like we did for heat.exe in the first step.

Let's now have a look at the Customer1.wxs (the XML code above). In this file, we have information about the setup name and the author. Just go ahead and replace Customer1 with your customer or project name, and coolOrange with your company name. In the very first two lines, you'll see

<?define UpgradeCode="041e09fb-6cde-431f-8955-d9a019cb8a35"?>
<?define ComponentGUID="041e09fb-6cde-431f-8955-d9a019cb8a35"?>

The sequence of numbers here is called a GUID and uniquely identifies this setup for future updates. You need your own GUIDs, so please go to http://www.guidgenerator.com and generate new GUIDs for your setup. DO NOT USE THESE GUIDs!!!!!!!! Have you seen all the exclamation marks?? DON'T USE THESE GUIDs! MAKE YOUR OWN, and replace the ones in the file with yours. Get new GUIDs for each new project setup you make.
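
By the way, if you have a PowerShell console at hand, you don't even need the website; .NET can generate a fresh GUID for you:

[guid]::NewGuid().ToString()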

I'm skipping most of the elements, as they are fine the way they are, so let's jump to the Directory elements.

<Directory Id='TARGETDIR' Name='SourceDir'>
  <Directory Id='CommonAppDataFolder' Name='ProgramData'>
    <Directory Id='AutodeskFolder' Name='Autodesk'>
      <Directory Id='VaultFolder' Name='Vault 2017'>
        <Directory Id='ExtensionFolder' Name='Extensions'>
          <Directory Id='DataStandardFolder' Name='DataStandard'>
            ......

This section defines the folder structure where the setup will install the files. Each Directory entry must have its own unique ID. The second Directory element points to the ProgramData folder, and all the following Directory entries are the physical sub-folders. Did you notice the almost-last Directory entry? It's called DataStandardFolder. If you remember, in the batch file the last argument of heat.exe is also DataStandardFolder. This way, the data collected by heat is attached to our folder structure at exactly this point, so the sub-folders collected by heat.exe become the sub-folders of our folder structure in the Customer1.wxs.

So, this Customer1.wxs already covers the needs for creating a VDS setup quite well. Just check that your folder structure points to the appropriate Vault version (2016, 2017, 2018, etc.), set your GUIDs, change the description, and you are good to go!

Every time you'd like to create a new version of the setup, just increase the version in the batch file and run the make.bat. You will get a new setup with the according version number. Give it a few tries: you will see how, by installing a new version of the setup, old files are removed and new files installed.

WiX is highly capable, so if you'd like to create fancier setups, you can walk through the tutorial posted here. This blog post covers just the basics, but you can go further. For example, we create client-side setups that contain VDS files, powerEvents files, and other client-side stuff, and we also create setups for powerJobs by applying the same logic. In case you have different sources that you'd like to combine into one setup, you can run heat.exe several times, compile the wxs files via candle.exe, and then combine them all in light.exe, by passing all the object files and adding all the source folders via the -b switch (-b source1 -b source2 -b source3). Good luck!


Posted in Data Standard

Tracing the Vault API

Picasso once said, "Good Artists Copy; Great Artists Steal". You'd like to write some Vault API code, in .NET or PowerShell, but are fighting with the right sequence of calls and the appropriate arguments? Why not click through the sequence in the Vault client and copy the code the Vault client generates?

The Vault client communicates with the Vault server via HTTP (web services, SOAP). Tools like Fiddler can monitor the HTTP traffic, so you can spy on what the Vault client says to the Vault server and learn from that; Doug Redmond wrote about this years ago in his blog. However, the SOAP communication is quite verbose, and because it is not plain XML (it's multipart), Fiddler cannot show you the content in a nice, readable way.

Good news: Fiddler can be extended. We did this and created the vapiTrace (Vault API trace) extension.

Just copy the vapiTrace extension to the C:\Program Files (x86)\Fiddler2\Inspectors folder. Start Fiddler, and you should see the vapiTrace tabs. Now, when you log in to Vault (don't use localhost), you'll start seeing the client-server traffic, and by clicking on the single communication lines, the vapiTrace tabs will show the request and the response in a simple, readable way.

Let's say you want to know how to create a new Vault folder via the API. Just go into Vault and create the folder manually, then pick the according command in Fiddler and see which Vault API has been called, and with which arguments. The best part is that you can now delete your folder in Vault and try to create it via the API, using the same function with the same arguments. Just keep in mind that only arguments with values are transferred. So, it might be that your Vault API function requires 5 arguments, but in Fiddler you see just 3; this means that the two missing arguments are null.

Now, Fiddler traces any HTTP connection, so you will have a lot of noise. Activate the filter by going to the Filters tab, and set it to show only URLs that contain "AutodeskDM".

So, now you can experiment with the Vault API: simulate your action in Vault by hand first, look in the Fiddler trace at how Vault does it, and then just do the same.

Posted in Vault API

CAD BOM via powerVault

Some versions ago, the CAD BOM tab was added to Vault Data Standard. Great stuff, as it gives you the ability to view the CAD BOM without the need for items, so it's great for Vault Workgroup customers as well as for Vault Professional. The tab reads the data from the so-called file BOM blob. It's a property on the file that contains all the CAD BOM information, and it is automatically filled when you check in a CAD file. You can read the content of this property via the Vault API DocumentService.GetBOMByFileId. The object provided by this API is quite complex: it contains the structured BOM, the parts-only view, information about purchased parts, phantom assemblies, external references, virtual components, and all the other beauty of a CAD BOM.

The implementation of the CAD BOM tab handles just the basics. If the BOM contains data beyond simple components, the result is either wrong or empty. We spent quite some time over the past years interpreting the BOM blob object in Vault, and implemented the logic within powerVault in the Get-VaultFileBOM cmdlet. This cmdlet gives you the correct structured CAD BOM in a super simple way. As powerVault is free, why not use it for showing a correct and complete CAD BOM in Vault?

With the Data Standard CAD BOM you get two files: the FileBOM.ps1 (addinVault folder) and the CAD BOM.xaml (Configuration/File folder). To keep it simple, we will replace the content of these files. To be safe, make a copy of them first; best copy them to a folder outside the Data Standard structure, in order to prevent conflicts with other PS1 files.

Let’s start with the PowerShell script FileBOM.ps1. The new content looks like this:

function GetFileBOM($fileID)
{
  $file = Get-VaultFile -FileId $fileID
  $bom = Get-VaultFileBOM -File $file._FullPath -GetChildrenBy ExactVersion #ExactVersion,LatestVersion,LatestReleasedVersion,LatestReleasedVersionOfRevision
  $dsWindow.FindName("bomList").DataContext = $bom
  return $bom
}

Pretty simple. With Get-VaultFile we get a complete file object. With Get-VaultFileBOM we get the BOM. The interesting thing about Get-VaultFileBOM is the parameter GetChildrenBy, which lets you define whether you'd like to see the BOM as it was checked in the last time, or how the BOM would look when using the latest version, the latest released version, or the latest released version of the current revision. More details about these options can be found here.
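
For instance, to preview the BOM as it would look with the latest released versions of the children, only the parameter value changes (the options are the ones listed in the comment in the script above):

$bom = Get-VaultFileBOM -File $file._FullPath -GetChildrenBy LatestReleasedVersion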

The XAML file CAD BOM.xaml is also pretty simple:

<?xml version="1.0" encoding="utf-8"?>
<UserControl xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" x:Name="MainWindow" xmlns:WPF="clr-namespace:CreateObject.WPF;assembly=CreateObject">

<DataGrid Name="bomList" AutoGenerateColumns="False" IsReadOnly="True" ColumnWidth="Auto" HorizontalGridLinesBrush="WhiteSmoke" VerticalGridLinesBrush="WhiteSmoke">
  <DataGrid.Columns>
    <DataGridTemplateColumn>
      <DataGridTemplateColumn.CellTemplate>
        <DataTemplate>
          <Image Source="{Binding Thumbnail.Image}" Height="35"/>
        </DataTemplate>
      </DataGridTemplateColumn.CellTemplate>
    </DataGridTemplateColumn>
    <DataGridTextColumn Header="Position" Binding="{Binding Bom_PositionNumber}"/>
    <DataGridTextColumn Header="Part Number" Binding="{Binding _PartNumber}"/>
    <DataGridTextColumn Header="Quantity" Binding="{Binding Bom_Quantity}"/>
    <DataGridTextColumn Header="Unit" Binding="{Binding Bom_Unit}"/>
    <DataGridTextColumn Header="Material" Binding="{Binding Bom_Material}"/>
    <DataGridTextColumn Header="Type" Binding="{Binding Bom_Structure}"/>
    <DataGridTextColumn Header="State" Binding="{Binding _State}"/>
    <DataGridTextColumn Header="Revision" Binding="{Binding _Revision}"/>
    <DataGridTextColumn Header="Name" Binding="{Binding _Name}"/>
    <DataGridTextColumn Header="Title" Binding="{Binding _Title}"/>
    <DataGridTextColumn Header="Description" Binding="{Binding _Description}"/>
  </DataGrid.Columns>
</DataGrid>
</UserControl>

You can take the code above and replace the code in the according files with it. Restart Vault, and enjoy!

I know, you want to sort the BOM. Well, this is a bit tricky, but doable. The code in FileBOM.ps1 must be tweaked a bit. Here are the rows responsible for sorting:

$dsWindow.FindName("bomList").Items.SortDescriptions.Clear()
$sort = New-Object System.ComponentModel.SortDescription("BOM_PositionNumber","Ascending")
$dsWindow.FindName("bomList").Items.SortDescriptions.Add($sort)
$dsWindow.FindName("bomList").Items.Refresh()

As you can see, in the second row you define the property by which to sort, and the sort direction.
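
So, sorting by part number instead just means swapping the property name (the property names are the same ones bound in the XAML above):

$sort = New-Object System.ComponentModel.SortDescription("_PartNumber","Ascending")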

Oops, the BOM is sorted, but the sort order is alphabetical, not numeric. Well, the BOM_PositionNumber is a string, so let's transform it into a number:

$bom | ForEach-Object {
  $_.BOM_PositionNumber = [int]$($_.BOM_PositionNumber)
}

Here is the complete code:

function GetFileBOM($fileID)
{
  $file = Get-VaultFile -FileId $fileID
  $bom = Get-VaultFileBOM -File $file._FullPath -GetChildrenBy ExactVersion #ExactVersion,LatestVersion,LatestReleasedVersion,LatestReleasedVersionOfRevision
  $bom | ForEach-Object {
    $_.BOM_PositionNumber = [int]$($_.BOM_PositionNumber)
  }
  $dsWindow.FindName("bomList").DataContext = $bom
  $dsWindow.FindName("bomList").Items.SortDescriptions.Clear()
  $sort = New-Object System.ComponentModel.SortDescription("BOM_PositionNumber","Ascending")
  $dsWindow.FindName("bomList").Items.SortDescriptions.Add($sort)
  $dsWindow.FindName("bomList").Items.Refresh()
  return $bom
}

As you can see, we get the BOM, cycle through it and make the position numeric, pass the BOM to the XAML dialog, and then sort by the property we like. That's it.

If you want to add additional properties to the grid, just go ahead and change the XAML. If you'd like to see which other properties are available, set AutoGenerateColumns to True: all the properties provided by Get-VaultFileBOM will become visible. You can then pick the ones you like, add them to your list, and set AutoGenerateColumns back to False.

I hope you enjoy your new CAD BOM tab!

Posted in Data Standard, powerVault

powerEvents


Happy New Year!!! We start the new year with a gift for you, a new product: powerEvents. The short story: all the Vault client events, such as lifecycle changes, check-in, check-out, etc., can now be easily consumed via a PowerShell script.

Why should you care? Let's suppose you'd like to fill or empty some properties automatically on a lifecycle transition. Or set some properties, maybe taken from the parent folder, when a file is checked in. Or queue a job for just a specific type of file or category on a lifecycle transition. Or prevent a lifecycle transition when a combination of property values does not match. Or save item and BOM information into a file every time the item changes. Or perform some custom actions when a folder gets created. Or…. I think you get it.

The Vault client provides events for a lot of actions, such as check-in, check-out, lifecycle change, etc. So far, in order to take advantage of such events, you had to write .NET code with Visual Studio. With powerEvents, you can now just put some simple logic into a PowerShell script file, and powerEvents will do the rest.

powerEvents is a Vault client extension, so the setup needs to be installed on each Vault client. After the installation, you'll find the PowerShell scripts under C:\ProgramData\coolOrange\powerEvents. We already exposed all available events in the scripts and grouped them by object type. So, for instance, you have a file.ps1 which contains all the events related to files. If you have logic that might be useful across several actions, you can place it in the common.psm1 file.

By default, powerEvents listens to all the events. In the script files you can see all the available events and start doing some tests. For each event, you have 3 functions: GetRestriction, Pre, and Post. The GetRestriction is the first function called and allows you to prevent the execution of an event. Let's say you don't want to allow lifecycle transitions at all: you can go into the file.ps1, scroll down to the GetFileStateChangeRestriction function, and set a restriction there, like this:

function GetFileStateChangeRestriction($FromFiles, $ToFiles, $Restrictions)
{
    $Restrictions.AddRestriction("my object","my custom restriction")
}

In this case, every attempt to perform any lifecycle transition on any file will be blocked, and you'll get the usual Vault dialog telling you which object (file, item, etc.) is affected by which restriction.

If there are no restrictions, the Pre function is executed; in the case of the file lifecycle state change, this is PreFileStateChange. Here, you can perform actions before the lifecycle transition is carried out. At this point you cannot stop the event any more, but you can do something with the object or with other Vault objects.

At the end, the Post function is called, in our case PostFileStateChange. At this stage, the event is completed, and you can do something with the according object afterwards. To be more specific: if you'd like to clean up some properties when you move from Released back to Work In Progress, you have to apply your code in the Post function, as during the Pre the file is still released and cannot be changed. If you'd like to set some properties on releasing a file, you'll have to do it during the Pre, while the file is still Work In Progress, and not during the Post, as at this stage the file is already released and blocked by the permissions.
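
As a rough sketch of the first case (the parameter list mirrors the GetFileStateChangeRestriction sample above, and we assume the passed files expose the powerVault properties like $file._State; check the exact signature in your file.ps1, and note that "Comment" is just a hypothetical user-defined property):

function PostFileStateChange($FromFiles, $ToFiles)
{
    foreach($file in $ToFiles)
    {
        # clean up a property once the file is back in Work In Progress
        if($file._State -eq "Work In Progress")
        {
            Update-VaultFile -File $file._FullPath -Properties @{"Comment" = ""}
        }
    }
}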

Within these functions, you have access to the Vault API via the usual $vault variable, and you can make use of powerVault, which simplifies dealing with Vault. You have access to the complete Vault, so for instance, while you react to a file event, you can access folders or items and do something there, like creating a link, adding, modifying, etc. With a bit more work, you can also interact with the user via dialogs and the like.

By leveraging powerGate, you can also interact with your ERP system. As an example, you could check during a lifecycle change whether the part number is filled and an according item exists in the ERP and is in the correct state. If not, you can interrupt the lifecycle and inform the user that the expected prerequisites are not met.
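
A hedged sketch of such a check, combining the restriction function from above with powerGate's Get-ERPObject (the EntitySet name "Items", its key field, and the property names are assumptions for illustration):

function GetFileStateChangeRestriction($FromFiles, $ToFiles, $Restrictions)
{
    foreach($file in $ToFiles)
    {
        if([string]::IsNullOrEmpty($file._PartNumber))
        {
            $Restrictions.AddRestriction($file._Name, "Part number is empty")
            continue
        }
        # look up the according item in the ERP system
        $erpItem = Get-ERPObject -EntitySet "Items" -Keys @{"Number" = $file._PartNumber}
        if($erpItem -eq $null)
        {
            $Restrictions.AddRestriction($file._Name, "No according item found in ERP")
        }
    }
}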

While you are playing with powerEvents, you can turn on the so-called DebugMode, a setting in the C:\ProgramData\Autodesk\Vault 2017\Extensions\powerEvents\powerEvents.dll.config file. If DebugMode is set to True, each change made to the PowerShell scripts takes immediate effect; you don't need to restart Vault. This allows you to test and tune the code without constantly restarting Vault. Once you are done with testing and ready for production, set DebugMode back to False. This brings a significant performance improvement, as the scripts will not be read every time, and the PowerShell runspace will be reused instead of restarted with every event.

Also, every line of code has its cost. Therefore, you can decide which events you'd like to use and which not. Those that are not relevant to you should be either commented out or removed. At the next start of Vault, powerEvents will register just the events present in the scripts. By default, all the events are active, so you will have to manually comment out or remove the unnecessary ones.

powerEvents is now available as a Beta version and can be downloaded from here: http://www.coolorange.com/en/beta.php

Being a Beta version, it requires no license and comes with a limited lifetime that expires on August 1st. If you like the product and need pricing information, reach out to sales@coolorange.com.

We already have powerEvents in action at some customers and have collected great feedback. However, before we officially launch this product in April with the other 2018 products, we'd like to get some more feedback from you. So, whatever comes to your mind, feel free to send an email to support@coolorange.com with your questions, comments, and suggestions.

We already love powerEvents, and so do the customers and resellers who supported us in the development of this tool. We are sure you will love it too and find many ways to improve your Vault workflows with powerEvents.

Posted in powerEvents