Connecting Vault with Fusion Lifecycle


The short answer is YES, and later in this post we describe how we connect Vault with Fusion Lifecycle. For the longer answer, there are some topics we need to cover first. We believe that CAD data should be managed locally in Vault, while enterprise-wide engineering processes should be managed in Fusion Lifecycle. We could presumably spend hours discussing where Vault ends and where Fusion Lifecycle starts, but we can probably agree that CAD files should be managed locally.

Both systems provide a clean programming interface. Vault has a SOAP (web services) API, while Fusion Lifecycle has a REST API. Both are HTTP-based protocols. Fusion Lifecycle can be customized via JavaScript, while Vault is customized with .NET, and both JavaScript and .NET can deal with SOAP and REST. So, could the two systems be connected directly? Basically yes, but there are some “buts”.

The first hurdle is to bring the two systems together. Vault is on the local network, behind the firewall, while Fusion Lifecycle sits in the cloud. The connection from the local network to the cloud is simple: port 80 (443 for HTTPS) is usually open for internet connections, so if you can open the Fusion Lifecycle page in your web browser, all is good. For the way back, from Fusion Lifecycle to Vault, access to the Vault server (or IIS) must be opened, which usually requires a port mapping on the firewall. This is technically simple but involves risks: the entire Vault API becomes available on the internet.

The Vault API is very powerful and comprehensive. Fusion Lifecycle would have to consume the complex Vault API and consider Vault’s business logic, even for simple operations. In addition, the Vault API gets improved and enhanced with each Vault version, and is therefore subject to changes. There is some compatibility, but still. The Fusion Lifecycle API is easier, but you still have to consider the business logic. Fusion Lifecycle currently has 3 APIs: V1 (official), V2 (deprecated), and V3 (technical preview).

In addition, many CAD offices have restricted or no internet access. Keep in mind that Fusion Lifecycle may talk directly to the Vault server (sort of server-to-server), while for connecting Vault workflows to Fusion Lifecycle, it’s the Vault client that must talk to Fusion Lifecycle (sort of client-to-server). So, each Vault client would need internet access to Fusion Lifecycle.

For these reasons, we take a different approach: we put the powerGate server in the middle, between the two systems. powerGate can be extended with plugins, and we already have a Vault plugin that exposes the Vault API as a simple REST API. This API is way simpler and independent of the Vault version.

New is a powerGate server plugin for Fusion Lifecycle, which makes it simple to talk to Fusion Lifecycle. So, now we have two simplified, version-independent APIs, which allow bidirectional communication. The powerGate server sits on the local network and is thus reachable by all Vault clients, so the Vault clients do not need internet access. The powerGate server can be installed on any server, and the preferred port (not 80 or similar) will be exposed to the outside, so only the powerGate server is reachable from outside. Now, both Vault and Fusion Lifecycle can talk to each other via the powerGate server, without knowing anything about each other.

The current powerGate server plugin for Fusion Lifecycle can query the workspaces, display all items (elements) of a given workspace, read the details of a specific item, and create new items. It is also possible to upload files and attach them to an item. The Fusion Lifecycle API can do more, and we will enhance the plugin in the coming weeks and months, but as of now, it can already do a lot. Since it is powerGate, the REST services can be consumed from PowerShell and .NET. This allows creating very cool workflows in Vault Data Standard, powerEvents and powerJobs. For example, it is possible to check during a Vault release cycle whether the according Fusion Lifecycle item is released too. Or submit a job that creates a PDF file and uploads it as an attachment to the given Fusion Lifecycle item. Or a change process started in Fusion Lifecycle creates a matching change order in Vault and links the appropriate Vault files and items.

Here’s an example of how the workspaces can be queried in PowerShell:

$workspaces = Get-ERPObjects -EntitySet "FlWorkspaces"

And here is how you can retrieve all the elements of a workspace:

$items = Get-ERPObjects -EntitySet "FlItems" -Filter "WorkspaceId eq 93"

And this is how you get the information of a single element:

$item = Get-ERPObject -EntitySet "FlItems" -Keys @{WorkspaceId=93;Id=7660} -Expand @('Properties','Relations','Attachments')

As you can see in this last example, the properties, the relations, and the attachments are retrieved as well. If you are familiar with the Fusion Lifecycle API, you know that the attachments require a separate API call. But we do not want to deal with such complexity on the Vault client side; we simply want the desired data. The powerGate server plugin takes care of this and other complexities.
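Coming back to the release-check workflow mentioned above, here is a hypothetical sketch built on the same cmdlets. It assumes the lifecycle state is exposed as an item property named LIFECYCLE; the actual property name depends on your workspace configuration:

$item = Get-ERPObject -EntitySet "FlItems" -Keys @{WorkspaceId=93;Id=7660} -Expand @('Properties')
$state = $item.Properties | Where-Object { $_.Name -eq 'LIFECYCLE' } | Select-Object -ExpandProperty Value
# fail the Vault release if the according Fusion Lifecycle item is not released
if ($state -ne 'Released') {
	throw "The according Fusion Lifecycle item is not released yet!"
}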

In order to create a Fusion Lifecycle item, the mandatory and/or editable properties must be set. Here’s an example:

$properties = @()
$properties += New-ERPObject -EntityType "FusionLifecycleItemProperty" -Properties @{Name='NUMBER';Value='01-000-0001'}
$properties += New-ERPObject -EntityType "FusionLifecycleItemProperty" -Properties @{Name='TITLE';Value='coolOrange powerGate test'}
$newItem = New-ERPObject -EntityType "FusionLifecycleItem" -Properties @{WorkspaceId=8;Properties=$properties}
Add-ERPObject -EntitySet "FlItems" -Properties $newItem

Files can be uploaded with this line:

Add-ERPMedia -EntitySet "FlFile" -File C:\temp\test.pdf -Properties @{WorkspaceId=93;Id=7660;FileName="test.pdf";Description="test upload from powerGate"}

As you can see, the path to the local file, the ID of the workspace, and the item to which the file is to be uploaded are specified. The complexity of making a matching HTTP call, using cookies, setting the content type, converting the file to the appropriate bytes, etc. is handled by the powerGate server plugin.

A sample PowerShell script, the powerGate plugin and the source code can be downloaded from GitHub: https://github.com/coolOrange-Public/powerGateFusionLifecycle. We hope you enjoy this new powerGate extension. We will post more about Vault-Fusion Lifecycle in the future.


Vote for AU Las Vegas classes

Whether you’re going to AU Las Vegas or not, you can and should vote for the classes. Here is the link: http://au.autodesk.com/speaker-resource-center/call-for-proposals/voting. As you know, we are Data Management guys, so we would love it if you could especially take a look at the Data Management and Product Lifecycle Management topics, but you can also filter by Vault or Fusion Lifecycle. Most of the content will be made available after AU, so voting for your preferred classes is worthwhile even if you will not attend.

Despite the general invite to vote for classes, we are obviously advertising our own cause as well. So, in case you would like to have coolOrange people on stage, vote up our classes. This year, we submitted several classes on several topics; here is the list:

  • What’s New in the Vault API with examples you can use
  • Custom Reporting in Vault 2019 – Dress up your Vault data to meet the world!
  • Connect Vault with ERP – items, bill of materials, files and more…
  • Connect Vault with Fusion Lifecycle – expand your engineering workflows
  • Enhance Vault workflows, with Vault client events
  • How to load data into your Vault – Data Transfer Utility

The first session is for coders, while the other sessions will have interesting insights for coders as well, but will also provide samples and insights for CAD admins, IT admins and, of course, Vault resellers.

So, if you like these sessions, then please vote them up, but also take a look at the other available sessions and express your preference. The voting will close on July 13th, so you still have some weeks to go, but it would be best if you took 10 minutes now, while you read this blog post.

Thanks for your engagement, and we will hopefully see you in Las Vegas!!


Datastandard – Cool custom dialog from menu

Let’s create your own custom dialog without any help from Datastandard, and still with PowerShell!


The issue with dialogs from Datastandard

I am sure you have faced a situation very similar to mine:
My initial goal was to create a very simple dialog for just showing some data when the user clicks on my context menu entry.

But if you are using the dialogs from Datastandard, they come with some built-in logic, depending on the method you call. Some examples:

  1. GetCreateDialog($folderId): It always creates a new file, and you have to specify a Template
  2. GetEditFolderDialog($folderId): It creates a new folder or updates the folder properties

The other dialogs come with similar behavior, but what if you don’t need any of those features?


The coolDialog as a solution

1. Create your XAML

First, we have to provide a XAML file (What is that?) with our needs, which we can later load in PowerShell.
The root must be the Window element; here is an example of “The coolDialog”.
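Here is a minimal sketch of what such a TheCoolDialog.xaml could look like. The control names StackPanelCoolView and BtnCancel and the FullName binding match the snippets used later in this post; the rest of the layout is just an assumption:

<Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="The coolDialog" Height="150" Width="400">
    <!-- the DataContext of this StackPanel gets set from PowerShell in step 3 -->
    <StackPanel x:Name="StackPanelCoolView" Margin="10">
        <TextBox Text="{Binding FullName, Mode=OneWay}" IsReadOnly="True" />
        <Button x:Name="BtnCancel" Content="Close" Width="80" Margin="0,10,0,0" HorizontalAlignment="Right" />
    </StackPanel>
</Window>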

2. Load your XAML in PowerShell

If you now have a valid XAML file with <Window> as root, you can load it with one of the overloads of the static XamlReader.Load method, like in this snippet:

[xml]$xamlContent = Get-Content "$($env:ProgramData)\Autodesk\Vault 2018\Extensions\DataStandard\Vault.Custom\addinVault\Menus\TheCoolDialog.xaml"
$theCoolWindow = [Windows.Markup.XamlReader]::Load((New-Object System.Xml.XmlNodeReader -ArgumentList @($xamlContent)))

If you execute this snippet, it will NOT show the window yet! To accomplish that, you need to call ShowDialog().
After you copy this code, remember to replace the path with the one to your XAML file!

3. Fill the dialog with data

Of course, you want to fill your dialog with some dynamic data. In my example of the coolDialog, there is a TextBox whose Text is bound to the property FullName. This means the TextBox expects to have access to an object with a property FullName.

After the first 2 steps the dialog has no data, therefore the Text will be empty.

Now, we pass an object, in my case a Vault folder, to the DataContext of the parent control:

    $folderId = $vaultContext.CurrentSelectionSet | select -First 1 -ExpandProperty "Id"
    if(-not $folderId) {
        throw "No valid folder selected!"
    }
    $fldr = $vault.DocumentService.GetFolderById($folderId)
    $folder = New-Object Autodesk.DataManagement.Client.Framework.Vault.Currency.Entities.Folder -ArgumentList @($vaultConnection, $fldr)
    $theCoolWindow.FindName("StackPanelCoolView").DataContext = $folder

The folder has a property FullName and therefore that value will be shown in the TextBox.

4. Handle events

I am pretty sure you want something to happen when the customer clicks a button, so let’s look at handling events now.

In PowerShell, you can subscribe to every available event of a control by calling the method add_<EventName>($scriptBlock).

Let’s extend our coolDialog so it will close when clicking on the Close Button:

    $theCoolWindow.FindName("BtnCancel").add_Click({
        $theCoolWindow.Close()
    })

Tip: If you want to see all available events for a certain control, like a Button, just google “wpf button” and go to the related MSDN page.

5. Bring your window to life

The last step is to let the window appear as a graphical user interface visible to the customer.
This can easily be accomplished by calling ShowDialog():

$theCoolWindow.ShowDialog() | Out-Null

Why do I pipe it through Out-Null, you ask? Because ShowDialog returns a result and I don’t handle it. If you want to react depending on how the user exited the dialog, read these Remarks.
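If you do want to evaluate the result, here is a minimal sketch; it assumes that one of your button handlers sets $theCoolWindow.DialogResult = $true before closing (otherwise ShowDialog() returns $false or $null):

$result = $theCoolWindow.ShowDialog()
if ($result -eq $true) {
    # a button handler set DialogResult to $true
    Write-Host "The user confirmed the dialog"
} else {
    Write-Host "The dialog was cancelled or simply closed"
}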

You can download the full version of the PowerShell script for the coolDialog here.
But remember, this example was programmed for a menu entry in Datastandard for Vault 2018.2.
You will find all downloads at the end of this post!


Troubleshooting

Props[], PSCmd[], PSVal.. do not work in my XAML

All the Datastandard features that you usually use for binding in your dialog do NOT work in a custom dialog. Here you can see all the bindings which are affected!


Events not working (Vault crash?)

I could not identify why, but sometimes the scriptBlock of the event has no access to the variables outside of its scope.
The solution is to mark those variables as global that are initialized outside of the event and also used within it:

$global:theCoolWindow = [Windows.Markup.XamlReader]::Load((New-Object System.Xml.XmlNodeReader -ArgumentList @($xamlContent)))
$theCoolWindow.FindName("BtnCancel").add_Click({
        $global:theCoolWindow.Close()
    })

It depends on your environment, but this could also be the reason why your application crashes.

ShowDialog() vs Show()


Do not call Show(). It will seem to work at first by showing your window, BUT every $scriptBlock triggered by an event will throw an exception and sometimes crash your application.
Be sure to use the ShowDialog() method!


Anyway, congratulations on your custom dialog, where you have total freedom in adding controls and the logic behind them!


Happy developing,

Patrick


Downloads

This example was programmed for a menu entry in Datastandard for Vault 2018.2


Load – Enhance – Connect

Over the past years, we developed products and technologies that made your Autodesk Data Management projects even more successful. We touched different topics and built products, technologies and services around them. The result of our work can be summarized in the three themes Load – Enhance – Connect, and these three themes have become the core message of our new website http://www.coolorange.com. Have a look and let us know what you think.

Let’s talk about Load: Vault comes with a cool little tool called DTU (Data Transfer Utility, aka VaultBCP), which allows importing data into Vault in bulk. DTU can be used to import small and large data sets, with files, history, items, BOMs, links, etc. It can be used to import files from the file system, to migrate from other data management systems, to merge Vaults, and the like. We’ve created a toolkit that makes it simple to generate, manipulate, merge and validate BCP packages, so that any sort of import situation can be handled in a reliable and predictable way. The bcpToolkit can already do a lot and will grow even more later this year. In addition to the tools, we also put our experience at your service; so, if you are facing any Vault import, migration or merge challenge, reach out to us.

Under Enhance we have combined powerJobs and powerEvents. While powerJobs enhances your Vault job processor with ready-to-use jobs and the ability to create your own jobs, powerEvents enhances your Vault client by giving you the ability to improve your workflows. Both products will enhance your processes and make Vault behave the way that makes sense to you. As an example, you can say “release only if …” and define in powerEvents the logic you are looking for. Both powerJobs and powerEvents can be configured via the Microsoft PowerShell scripting language, so little coding experience is enough for bringing Vault to the next level.

Connecting Vault with other systems is the next logical step. Once you have your data and processes under control within your engineering department, it’s time to connect with the rest of the company. The connection to ERP is the obvious step, and with powerGate many ERP systems, from SAP over Microsoft down to smaller ERP systems, have been connected. Meanwhile, connecting to the cloud is also a topic, which can be addressed with powerGate as well.

The new website talks more about your situations and less about product features, so we hope that you’ll get a better understanding of what the products can do for you. We are quite excited about the new website, and hope you will enjoy it too.



#19 version available

Over the past couple of months we have been a bit quiet, and I’d like to apologise for that. We have been busy working on several topics, which we will present in the coming weeks and months.

Today we proudly announce the release of the 2019 products. Actually, as you can see from the banner picture, we call it just 19. As with past versions, the 19 products support the Autodesk 2019, 2018 and 2017 versions, so the latest and the two preceding versions. This means you have, for example, powerJobs 19 for Vault 2019, 2018 and 2017. Regardless of which version of Vault you are on, you get the most recent coolOrange product version with all new features and enhancements.

The complete product line has been released, including powerJobs, powerVault, powerGate, dataLoader, vaultRuler, mightyBrowser and all other apps we have. The only exception is the bcpToolkit. We are working on supporting the BCP 2019 format and will release our products around DTU (Data Transfer Utility) very soon. There will be some exciting news there as well, so stay tuned.

In order to simplify and speed up the download process, we moved our products to Amazon S3 and created a dedicated domain. So, if you go to http://download.coolorange.com, you will find all the download links. There you will also find the archive of previous versions, so if you are running an older version of the Autodesk products and still need the according coolOrange product, you can find it there. As mentioned, we officially support the current release and the two preceding versions, but we understand that if you have not upgraded yet and still need to set up your machine, you need access to legacy versions. Be aware that those versions are not supported; however, they have worked well so far, so you may take the risk.

So, what’s new? Besides the support for the latest Autodesk applications, powerJobs now officially supports the newly introduced InventorServer. This does not require an Inventor license if used in combination with Vault. So, now you can create PDF, DXF, DWG and other formats via powerJobs, without the need for an Inventor license.
Also, we made some important internal architectural changes, which may not be that visible to you, but are important for the ongoing technology changes. For instance, we moved to PowerShell 4 as a minimum requirement, as there are several important benefits that the products, and finally you, can take advantage of.
One relevant change for you is that our PowerShell based products no longer require a special shortcut for starting the ISE (script editor). You can start whichever script editor you like, PowerShell ISE, Visual Studio Code, or the like, and just load the according PowerShell module via Import-Module.
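
For example, in any PowerShell console or editor (the module name here is the one powerVault registers; powerJobs and powerEvents work the same way):

# load the powerVault cmdlets, then connect to Vault as usual
Import-Module powerVault
Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""
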
There are more changes, which you can read about on our wiki page at http://www.coolorange.com/wiki, in the according product change log section. We will talk about more enhancements in the coming weeks and months.

We wish you fun and success with the coolOrange 19 products, regardless of which Autodesk product version you run.


Couple of updates

Last week, we released new versions of powerVault, powerJobs and powerEvents. While the changes in powerVault are smaller bug fixes (you can find the details here), powerJobs brings an interesting improvement related to the Inventor project file.

Until now, powerJobs always set the project file for each job, even when the right project file was already set. This is not an issue, until you have a situation where the Inventor session used by powerJobs has an open file. As you know, setting a project file while there is a file open in Inventor is not possible. Therefore, powerJobs failed to execute the job, because the attempt to set the project file failed. This is now solved: if the project file of the Inventor session is already the right one, there is no need to change it. Also, if the project file must be changed and there is a file open, the job will fail with a human-readable error message.

In order to prevent this problem, we suggest tweaking the logic in your script. Make sure that the open, convert (or whatever you do with the file) and close operations are very close together. If you need, for example, to copy the PDF file to a network share, then do it after the file is closed. This way, if for whatever reason the PDF copy to the network fails, the original file has already been closed. If you do the copy operation before closing the original file, and the copy operation fails, your job will fail and the close will not be executed; you end up having an open file in your Inventor session. So, make sure that your job always closes the file before you do other risky actions, as in the sketch below.
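
Here is a minimal sketch of that pattern in a job script; the Open-Document/Export-Document/Close-Document cmdlets follow the style of the powerJobs sample jobs, and the paths and the $file variable are placeholders for your actual job:

# keep open, convert and close tightly together
$localPdfFile = "C:\Temp\$($file.Name).pdf"
$openResult = Open-Document -Files $file.'Full Path'
$exportResult = Export-Document -Format 'PDF' -To $localPdfFile
$closeResult = Close-Document
# risky operations, like the copy to a network share, come AFTER the close
Copy-Item -Path $localPdfFile -Destination '\\server\share\pdf'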

We also released a new version of powerEvents. This version now fully supports change orders and custom objects, but even more importantly, it recognizes changes to the scripts on the fly. Whenever you add, remove or change a file in the powerEvents folders, all the registered events will be removed and re-added, even while your Vault client is open. This way, you can make changes as you like and test them immediately, without the need to restart Vault. Also, you can now unregister events programmatically. So, you may sign up for an event just temporarily and then unregister from the event. There are now many scenarios that can be accomplished.
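
As a sketch of such a temporary subscription (the cmdlet names and signatures are assumptions for illustration; check the powerEvents documentation for the exact ones):

# register an event handler and keep the returned subscription...
$subscription = Register-VaultEvent -EventName 'UpdateFileStates_Post' -Action {
	param($files, $successful)
	Write-Host "$($files.Count) file(s) changed state"
}
# ...and remove it again once it is no longer needed
Unregister-VaultEvent $subscription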

powerJobs and powerEvents embed the new version of powerVault, so by updating powerJobs or powerEvents, you will automatically get the latest version of powerVault. In case you just need powerVault, then of course you can update just that.


Complete your Vault data

This topic came up quite frequently in the last months. You have your data in Vault, but for whatever reason, there is still other data outside of Vault that should be added to existing Vault files, folders, items, or other objects. You have the data as CSV (or Excel), but don’t know how to import it into Vault. Well, for files you can use our dataLoader. It’s a simple and powerful tool that lets you pick a CSV file, does the mapping of properties and files, and then updates the file properties. It also keeps track of issues during the update process, so that you can repeat the operation for the problematic files at a later point in time.

While the dataLoader gives you comfort, performance, and other benefits, there might be situations where you need to update just a small number of files or other objects like folders or items.

As you know, we love scripting, so updating properties based on a CSV file is something that can be done via a PowerShell script. As an example, if you want to update file properties, you can take advantage of powerVault (free) and use a script like this:

Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

# search for all files in Vault: one condition over all properties, with empty search text
$srcConds = New-Object Autodesk.Connectivity.WebServices.SrchCond[] 1
$srcConds[0] = New-Object Autodesk.Connectivity.WebServices.SrchCond
$srcConds[0].PropDefId = 0
$srcConds[0].PropTyp = "AllProperties"
$srcConds[0].SrchOper = 1
$srcConds[0].SrchTxt = ''
$srcConds[0].SrchRule = "Must"
$bookmark = ""
$searchStatus = New-Object Autodesk.Connectivity.WebServices.SrchStatus
$files = @()
# page through the search results until all hits are collected
do {
	$files += $vault.DocumentService.FindFileFoldersBySearchConditions($srcConds, $null, $null, $true, $true, [ref]$bookmark, [ref]$searchStatus)
} while ($files.Count -lt $searchStatus.TotalHits)

$csv = Import-Csv "c:\temp\fileProperties.csv" -Delimiter ';'
$i = 0
foreach($row in $csv)
{
	Write-Progress -Activity "Updating files..." -CurrentOperation $row.FileName -PercentComplete ($i++ / $csv.Count * 100)
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "FileName")
		{
			$data[$col.Name] = $col.Value
		}
	}
	$file = $files | Where-Object { $_.File.Name -eq $row.FileName }
	$fullPath = $file.Folder.FullName + '/' +$file.File.Name
	$file = Update-VaultFile -File $fullPath -Properties $data
}

This is a sample CSV file for this script:

FileName;Title
100083.ipt;updated via script
100084.ipt;updated via script

This script searches for all the files in Vault, reads the CSV file, and for each line in the CSV, it updates the according properties. The script is kept pretty simple, so for instance loading all the files might not be a good idea. If you know that you just need Inventor files, it would be good to tighten the filter criteria in the search, or if you just look for files in a given folder, you can specify this in the FindFileFoldersBySearchConditions function. If you would like to know which search criteria to use, you can simply perform a search in Vault and use vapiTrace to figure out the parameters. We need to search for the files because we need the full path in order to work with the Update-VaultFile cmdlet. However, if you have the full path in your CSV, searching for the files can be omitted, as in the sketch below.
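For instance, assuming a hypothetical FullPath column in the CSV (with values like $/Designs/100083.ipt), the whole search part goes away:

$csv = Import-Csv "c:\temp\fileProperties.csv" -Delimiter ';'
foreach($row in $csv)
{
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "FullPath")
		{
			$data[$col.Name] = $col.Value
		}
	}
	# the full path makes the search for the file superfluous
	$file = Update-VaultFile -File $row.FullPath -Properties $data
}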

Also, this script does not have any sort of error handling, so in case a file cannot be found, or the file is checked out, or permissions are missing, etc., the update will just fail. Obviously, the script can be enhanced to take care of these situations as well.

The same applies to items. Here is the script:

Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

$csv = Import-Csv "c:\temp\itemProperties.csv" -Delimiter ';'
$i = 0
foreach($row in $csv)
{
	Write-Progress -Activity "Updating items..." -CurrentOperation $row.ItemNumber -PercentComplete ($i++ / $csv.Count * 100)
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "ItemNumber")
		{
			$data[$col.Name] = $col.Value
		}
	}
	$item = Update-VaultItem -Number $row.ItemNumber -Properties $data
}

In this case, it’s even simpler: we don’t need to search for the items. If you know the number, you can update them right away. There are some things to take into account, though. You can only update user-defined properties via the argument -Properties. The Title, for instance, is a system property, which can be updated via the special argument -Title.
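As a hypothetical example, assuming your CSV also contains a Title column (which you would then exclude from $data like the ItemNumber):

$item = Update-VaultItem -Number $row.ItemNumber -Title $row.Title -Properties $data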

Another example is updating the properties of folders. Here is the script:

Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

$srcConds = New-Object Autodesk.Connectivity.WebServices.SrchCond[] 1
$srcConds[0] = New-Object Autodesk.Connectivity.WebServices.SrchCond
$srcConds[0].PropDefId = 0
$srcConds[0].PropTyp = "AllProperties"
$srcConds[0].SrchOper = 1
$srcConds[0].SrchTxt = ''
$srcConds[0].SrchRule = "Must"
$bookmark = ""
$searchStatus = New-Object Autodesk.Connectivity.WebServices.SrchStatus
$folders = @()
do {
	$folders += $vault.DocumentService.FindFoldersBySearchConditions($srcConds, $null, $null, $true, [ref]$bookmark, [ref]$searchStatus)
} while ($folders.Count -lt $searchStatus.TotalHits)

# get the folder property definitions, to map display names to internal property IDs
$propDefs = $vault.PropertyService.GetPropertyDefinitionsByEntityClassId("FLDR")

$csv = Import-Csv "c:\temp\folderProperties.csv" -Delimiter ';'
foreach($row in $csv)
{
	$data = @{}
	foreach($col in $row.PSObject.Properties)
	{
		if($col.Name -ne "FolderName")
		{
			$propDef = $propDefs | Where-Object { $_.DispName -eq $col.Name }
			$data[$propDef.Id] = $col.Value
		}
	}
	# build the PropInstParamArray argument for the UpdateFolderProperties call
	$propValues = New-Object Autodesk.Connectivity.WebServices.PropInstParamArray
	$propValues.Items = New-Object Autodesk.Connectivity.WebServices.PropInstParam[] $data.Count
	$i = 0
	foreach($d in $data.GetEnumerator()) {
		$propValues.Items[$i] = New-Object Autodesk.Connectivity.WebServices.PropInstParam -Property @{PropDefId = $d.Key;Val = $d.Value}
		$i++
	}
	$folder = $folders | Where-Object { $_.Name -eq $row.FolderName }
	$vault.DocumentServiceExtensions.UpdateFolderProperties(@($folder.Id),@($propValues))
}

Here, like for the files, we search for all folders (you can refine the search), load all the rows from the CSV, and then perform the property update. Unfortunately, powerVault does not offer an Update-VaultFolder cmdlet yet, so we have to do it via the regular Vault API, which makes the script longer and more cumbersome.
In this case, we have to get all the property definitions first, as we cannot work with the property name; we need the Vault internal property ID. Then, we have to build the PropInstParamArray argument for the UpdateFolderProperties API function. So, we cycle through all the CSV columns, except for the FolderName, and build our list of property IDs and values.

The sample for the folder property update can also be applied to custom objects and change orders. Obviously, the function names and arguments will change, but the logic remains the same. Via vapiTrace, you can work in Vault and see which functions and arguments are needed for performing the search and the property update.

Bottom line: you have plenty of options for doing a mass update of your data. The coolOrange dataLoader is definitely the first choice when it comes to updating file properties in a secure and reliable way. However, if the amount of data is small or you want to update other objects, a few lines of scripting will do too.

We hope you enjoy!!
