Create a folder tree from a template

blogCreateFolderFromTemplate

In the past few weeks, many of you asked this question: "How do I create a folder in Vault by copying the folder tree from a template folder?" In this post, I'm showing you how to do this with Vault Data Standard.

There are basically two steps. The first is to enhance the standard folder dialog by adding a combobox where you can select a template folder. The second is to copy the complete folder tree from the template folder to the newly created target folder.

Let's start with the dialog. You have to add an additional row to your dialog, with a label and a combobox. This is how it looks on my side:

<Label Content="Template" Grid.Column="0" Grid.Row="4" />
<ComboBox ItemsSource="{Binding PsList[GetListOfVaultProjects]}" SelectedValue="{Binding Prop[Template].Value}" Grid.Column="1" Grid.Row="4" />

As you can see, I bind the ItemsSource of the combobox to a PowerShell function called GetListOfVaultProjects, and the SelectedValue to a Vault property called Template. This means that we have to create a Vault property named Template and map it to the folder categories. We also have to create a PowerShell function named GetListOfVaultProjects. For this purpose, I just created a new PowerShell file and placed it in the addinVault folder of Data Standard. This is what my function looks like:

function GetListOfVaultProjects()
{
     $dsDiag.Trace(">> GetListOfVaultProjects")
     $designsFolder = $vault.DocumentService.GetFolderByPath("$/Designs")
     $subFolders = $vault.DocumentService.GetFoldersByParentId($designsFolder.Id,$false)
     $projectFolders = $subFolders | Where-Object { $_.Cat.CatName -eq "Project" }
     $listOfProjects = $projectFolders | ForEach-Object { $_.FullName }
     $dsDiag.Trace("<< GetListOfVaultProjects")
     return @($listOfProjects)
}

As you can see, the function looks at all folders underneath $/Designs and picks only those that are of category "Project". If you would like to keep your templates somewhere else, just adjust the folder path and, if necessary, the category. You may wonder why the return value is wrapped in @(…) in the last line. The reason is that if there is only one template, PowerShell reduces the array (list of values) down to a single value. But we need a list of values, in the worst case either an empty list or a list with just one element. As an array is created in PowerShell with @(), wrapping the variable in @() ensures that an array is still returned even when there is only one value, or none at all.
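
Here is a minimal illustration of that behavior, using a hypothetical single template path, which you can try in any PowerShell console:

$singleValue = '$/Designs/Template-Project' | ForEach-Object { $_ }
$singleValue.GetType().Name      #String - the pipeline result collapsed to a single value
@($singleValue).GetType().Name   #Object[] - wrapped in @() it is an array again
@($singleValue).Count            #1 - safe to hand over to the combobox as a list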

Anyway, the next step is to enhance the PowerShell script that creates the folder (addinVault/Menus/CreateFolder.ps1). In this script, we will recursively recreate the same folder structure as defined in the selected template folder. As we don't know how deep or wide the folder structure might be, we create a little recursive function that walks down the tree and recreates the folder structure, step by step, underneath the target folder. This is the function:

function recursivelyCreateFolders($targetFolder, $sourceFolder)
{
     $sourceSubFolders = $vault.DocumentService.GetFoldersByParentId($sourceFolder.Id,$false)
     foreach ($folder in $sourceSubFolders) {
          $newTargetSubFolder = $vault.DocumentServiceExtensions.AddFolderWithCategory($folder.Name, $targetFolder.Id, $folder.IsLibrary, $folder.Cat.CatId)
          recursivelyCreateFolders -targetFolder $newTargetSubFolder -sourceFolder $folder
     }
}

We basically pass the source and the target folder and let the function go through the child folders of the source and recreate them at the target. For each folder found in the source, the function calls itself with the corresponding child source and target folders, so it walks down the tree on its own. It's important that this function is written at the top of the script file, as script files are read from top to bottom; when we later call the function, it must already be known.

Now we can call the function right after the new folder has been created, like this:

if($result)
{
      #new folder can be found in $dialog.CurrentFolder
      $folder = $vault.DocumentService.GetFolderById($folderId)
      $path=$folder.FullName+"/"+$dialog.CurrentFolder.Name

#create template folder tree
      $newFolder = $vault.DocumentService.GetFolderByPath($path)
      $template = $dialog.ViewModel.Prop["Template"].Value
      if($template -ne "")
      {
           $templateFolder = $vault.DocumentService.GetFolderByPath($template)
           recursivelyCreateFolders -sourceFolder $templateFolder -targetFolder $newFolder
      }
...
..
...

And this is how it looks once it’s finished

createFolderFromTemplate

As you can see, when you now create a new folder, you can pick from a list of templates, and no matter what the structure of the selected template looks like, the new folder will have the same children.

Of course this sample can be extended, for instance by applying some properties of the new folder to the child folders, by also copying some files from the source to the target, or by renaming the child folders with a prefix taken from the originating folder (see the sketch below). Or… you get the idea. In this sample, the template folders are inside Vault, but they could also come from an external source, such as your ERP system. So, the possibilities are endless, and the logic remains almost the same. Yes, there is a bit of code to write; that is obvious, as somewhere we must teach Vault to do what we want. But I'm still delighted by how short the code is.
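
To make one of those extensions concrete, here is a small sketch of the prefix idea. It is just an assumption of how you might do it, with a made-up naming rule, but it uses the same API calls as the function above:

function recursivelyCreateFoldersWithPrefix($targetFolder, $sourceFolder, $prefix)
{
     $sourceSubFolders = $vault.DocumentService.GetFoldersByParentId($sourceFolder.Id,$false)
     foreach ($folder in $sourceSubFolders) {
          #prefix the copied child folder name, e.g. "PRJ-0815 Drawings"
          $newName = "$prefix $($folder.Name)"
          $newTargetSubFolder = $vault.DocumentServiceExtensions.AddFolderWithCategory($newName, $targetFolder.Id, $folder.IsLibrary, $folder.Cat.CatId)
          recursivelyCreateFoldersWithPrefix -targetFolder $newTargetSubFolder -sourceFolder $folder -prefix $prefix
     }
}

Called with -prefix $newFolder.Name instead of the original function, it would stamp the name of the new project folder onto every copied child folder.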

I hope you enjoyed this sample and comments are welcome as always!


Ready for 2015 R2

2015R2b

Hi everyone, as of today the coolOrange products are ready for Vault 2015 R2 and can be downloaded from the website. All the cool features are now available for Vault 2015 R2 at no additional cost. So if you already have a 2015 license, you can activate the corresponding product for the 2015 R2 Vault release as well. If you don't have a license yet, well, then it's time to get one ;-)

As Vault 2015 R2 comes with some visible enhancements, we had to adapt our products accordingly. So when you download your preferred product from our website, you will find links for both 2015 and 2015 R2. Make sure you pick the right one. This applies to powerJobs, vaultRuler and dataLoader. The bcpChecker and the bcpDevkit are an exception, as they work with all BCP versions from 2012 onward, and now also with 2015 R2.

Have fun working with the R2 products!


New version of bcpChecker available

bcpChecker

We recently released a new version of the bcpChecker. The bcpChecker allows you to open a Vault BCP package and navigate through its content in a Vault-like UI. This saves you a ton of time and headaches when you are working on a Vault migration project.

The new version lets you navigate through items, BOMs and links, in addition to folders and files. This way, you can check the quality and structure of your BCP package before you start a 10+ hour import into Vault. You can check for correct categories, lifecycles, states, properties, folder structure, file naming, and so on. You basically see the outcome of the import before you start it.

bcpChecker

The current release also lets you check whether a file exists for every entry in your BCP package, and so avoid annoying errors during the import. You can also get a report of all used behaviors and check against your Vault whether you have really defined all the properties, lifecycles, categories, etc. that are required for a correct import of the BCP package.

Beyond the visual checks, you could also perform some queries and verify that all references can be resolved, all items have the appropriate files, etc. Or you may correct some smaller issues by manipulating the BCP package and re-exporting the corrected version.

So, if you plan to perform a migration to Vault using Vault BCP, which we can only recommend, take a look at the bcpChecker. It will save you stress and will even help your customer see the actual end result before the import really starts.


Rule your Vault

vaultRuler-Banner

We just released a new version of the vaultRuler, a powerful tool that applies category rules to all the files in your Vault. The new version additionally allows you to force the lifecycle definition and property assignment for those files that are already in the right category, and you can also reinforce the rules for all files in your Vault.

Maybe you have just moved from vanilla Vault to Vault Workgroup or Professional, have a huge amount of files, all in the category Base, and want to bring them into the appropriate categories. Or you already have Vault Workgroup or Professional, you changed some behaviors and want to bring existing files into the right shape. In both cases the vaultRuler will be a huge help.

vaultRuler-animation

This new version allows you to apply the lifecycle definition and property assignment also to those files that are already in the right category. Or, you may apply the rules to all files, regardless of the category.

So, let's suppose you've worked with Vault for a while and you realize you need a few more properties. No problem. Just add the properties in the Vault configuration, assign them to the category you think is appropriate, and all new files will have those properties. But what happens to the existing files? Well, the properties will only be added once you eventually change their values.

The alternative is to use the vaultRuler. It goes over all of your files and brings them back in line with the defined rules.

The applications for the vaultRuler are manifold, but the usage is always the same: configure Vault the way that best suits you, and then let the vaultRuler bring all your files into the right settings. The different options of the vaultRuler allow you to apply the configured rules selectively. Download the vaultRuler now and see for yourself.


Tracing the Vault API


We have talked about tracing the Vault API in several blog posts, and Doug Redmond also wrote a great post on how to use Fiddler to trace live which API calls Vault is currently making. We spent some time getting more familiar with Fiddler and came across FiddlerCore, a .NET assembly you can use in your Visual Studio projects to filter the HTTP/HTTPS traffic the way you want. The outcome of our investigation is vapiTrace.

vapiTrace (Vault API trace) is a little application that, like Fiddler, listens to the HTTP traffic, but filters out everything except the Vault calls, removing all the other HTTP noise. It captures each called API command together with its arguments and the response, and shows the result in a simple interface.

Just start vapiTrace, click around in your Vault, and you will see which API functions are called with which arguments and what the response from the Vault server is.

This will help you understand which commands are needed for a specific operation, in which sequence the commands should be called, how the arguments should be filled, and more. It's a great, simple tool that helps you get more familiar with the Vault API.

Here is an example. Let's say you would like to create a folder and link some files to it. Start Vault and vapiTrace, navigate to the parent folder where you want to create your new folder, create the folder in Vault, and then drag&drop your file into the new folder.

vapiTrace

You will notice that vapiTrace records a lot of different commands. This is normal, as Vault is doing several things. If you look at the command list, you will notice two of them:

  • AddFolderWithCategory
  • AddLink

By clicking on a command in vapiTrace, you can see the arguments it was called with and the response from the server. Pretty cool, huh?
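
If you then want to replay such a traced operation from your own script, a rough sketch could look like the lines below. AddFolderWithCategory is the same call already used in the folder-template sample above; the parent path and folder name are made up, and the AddLink arguments are deliberately left open, since the trace shows you exactly how they have to be filled for your Vault release:

#assumes a $vault connection, as in Data Standard or powerJobs
$parent = $vault.DocumentService.GetFolderByPath("$/Designs/Sandbox") #hypothetical parent folder
$newFolder = $vault.DocumentServiceExtensions.AddFolderWithCategory("Traced Folder", $parent.Id, $false, $parent.Cat.CatId)
#$vault.DocumentService.AddLink(...) #fill in the arguments exactly as shown in the trace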

IMPORTANT!! In order to allow vapiTrace (and the same goes for Fiddler) to track the HTTP traffic, you have to log in to Vault with a valid host name. Localhost does not work; it has to be the actual computer name or IP address.

You will also notice that vapiTrace only tracks the real API calls made from the client to the server. So basically, you'll see the calls made through the WebServiceManager, but not the calls made with the VDF.

I hope this tool will help you get more familiar with the Vault API, or maybe help you trace some problems with your customization, or the like.

vapiTrace is free and you can download the executable here. If you'd like to see the code, we will share it shortly on GitHub, so revisit this post from time to time. If you have suggestions for improvements, just leave a comment here or contact us directly (support@coolorange.com).


Trigger jobs at specific time intervals

We looked for this feature for years: queuing jobs at regular time intervals. With powerJobs 2015, we introduce this capability. It's now possible to place a job into the queue every day at 10 o'clock, twice a week, every 10 minutes, or at whatever time interval suits you best.

For each job you create, you can also configure a .settings file which defines the time interval at which the job shall be queued, the priority, and the message that will show up in the job queue.

So let's suppose you have a folder into which your ERP system drops text files. These text files contain document numbers, and for each document number the corresponding file shall be copied into a specific folder that is accessible to the ERP users.

The job looks quite simple (ScanFolderAndCopyFile.ps1):

$files = Get-ChildItem c:\Temp\erp -File
foreach ($file in $files) {
     $content = Get-Content $file.FullName
     foreach ($line in $content) {
          Get-VaultFile -Properties @{'File Name' = $line } -DownloadPath c:\Temp\ERP\download
     }
     Remove-Item $file.FullName
}

It gets the files from the specified folder, in my case c:\Temp\erp, reads their content, and expects each line to simply contain a desired file name. For each file in that folder, and for each line in each file, the corresponding file is searched in Vault and downloaded to the specified location.
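
For example, a text file dropped by the ERP system, say c:\Temp\erp\request-4711.txt (name and content are made up for illustration), would simply contain one Vault file name per line:

100234.idw
100234.ipt
bracket-assembly.iam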

The settings file looks like this (ScanFolderAndCopyFile.settings):

{
 "Trigger":
  {
    "TimeBased": "0 0/1 * 1/1 * ? *",
    // This is the name of the Vault you want to trigger the job for
    "Vault":"Vault",
    // And these two parameters are optional and self-explanatory:
    "Priority":10,
    "Description":"look for files to be copied"
  }
}

The key thing in this file is the TimeBased property. The syntax is cron (Unix), and for those who are not familiar with it, here is a very handy website: http://www.cronmaker.com. Just enter the settings you like, and it generates the expression that you can then use in this file.
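
For orientation, here are a few more expressions in the same Quartz-style cron syntax (double-check them on cronmaker before relying on them):

"0 0/1 * 1/1 * ? *"      (every minute, the expression used above)
"0 0/10 * 1/1 * ? *"     (every 10 minutes)
"0 0 10 ? * MON-FRI *"   (every weekday at 10:00)
"0 0 6 ? * MON,THU *"    (Mondays and Thursdays at 06:00)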

That's it! You now have a job that gets queued every minute, or however you configured it, scans a folder for data and processes that data. Of course this sample is very simple, but I hope you see the potential behind it. Now it's up to you to think of all the jobs that you could create with this capability!

Have fun!


Job-creation made simple

Let's say you want to send an email notification on a lifecycle change in Vault. Here is how the email might look:

powerJobs email

In order to achieve this, you want to extend the Vault JobProcessor with an email job. Have you ever tried to create a custom job for your Vault JobProcessor using the Vault API? Here is a short list of the tasks you have to do:

  • install the Vault SDK (Software Development Kit)
  • start Visual Studio and create a DLL project
  • reference the Vault API assemblies from the SDK
  • extend your class with the according JobProcessor interface
  • implement the interface functions
  • start writing your code (this requires coding skills and familiarity with the Vault API)
  • copy the DLL bundle into the Extension folder
  • configure the config file in order to inform the JobProcessor about the new job

The tasks above don't sound familiar to you? Oh, you are not a developer? But you still want to do more with your JobProcessor, right? Well, you are not alone! Now, let's have a look at how this works with powerJobs:

  • create a file with the extension .ps1 in the powerJobs\Jobs folder
  • start coding using a much simpler scripting language

Can you see the difference? All right, if you are still with me, let's do a practical example. Your interest might be around PDF creation from your CAD files; there is a standard job for that shipping with powerJobs, and there are tons of ways to tailor it. But for the purpose of this blog post, let's see how to create email notifications with powerJobs.

This way we get notified by the JobProcessor every time a file gets released, rejected, reviewed, and so on. And here is how it works:

Start by creating a new file with the extension .ps1 in the powerJobs\Jobs folder. The easiest way to get there is to double-click the "powerJobs configuration" shortcut on your desktop. Let's call the file sendEmail.ps1. Just pay attention when you create the file that the extension is really .ps1 and not .ps1.txt or similar; you can verify this by checking that the icon of your new file is identical to that of the other PS1 files in the same folder. You can edit the file with any text editor, but it's better to use either the Windows PowerShell ISE (Integrated Scripting Environment) or a free editor such as PowerGUI or PowerShell Plus. This is the content of the script we are looking for:

$file = PrepareEnvironmentForFile #get the file object of the processed file

#region collect information on lifecycle transition
$lfcTransId = $powerJobs.Job.Params["LifeCycleTransitionId"] #get the ID of the lifecycle transition the job has been triggered for
$lfcStateTransition = $vault.LifeCycleService.GetLifeCycleStateTransitionsByIds(@($lfcTransId)) #get the definition of that lifecycle transition
$lfcStates = $vault.LifeCycleService.GetLifeCycleStatesByIds(@($lfcStateTransition[0].FromID,$lfcStateTransition[0].ToID))
$fromState = $lfcStates[0]
$toState = $lfcStates[1]
Add-Log -Text "file $($file.Name) transiting from '$($fromState.DispName)' to '$($toState.DispName)'"
#endregion

#region collect information about the involved users
$allUsers = $vault.AdminService.GetAllUsers()
$user = $allUsers | Where-Object { $_.Name -eq $file.Originator }
$causer = $allUsers | Where-Object { $_.Name -eq $file."Created By" }
Add-Log -Text "file originated by $($user.Name) and last modified by $($causer.Name)"
#endregion

#region build hyperlink to Vault file
$serverUrl = New-Object System.Uri($vault.AdminService.Url)
$filePath = $file."Full Path"
$filePathUrl = $filePath.Replace("$","%24").Replace("/","%2f").Replace(" ","+")
$fileUrl = $serverUrl.Scheme+"://"+$serverUrl.Host+"/AutodeskDM/Services/EntityDataCommandRequest.aspx?Vault=Vault&ObjectId=$($filePathUrl)&ObjectType=File&Command=Select"
Add-Log -Text "hyperlink: $fileUrl"
#endregion

[System.IO.File]::WriteAllBytes("c:\Temp\image.bmp",$file.Thumbnail.Image) #save the file thumbnail as a picture so that it can be attached to the email

#region prepare and send the email
$emailSubject = "The file $($file.Name) transitioned from '$($fromState.DispName)' to '$($toState.DispName)'"
$emailFrom = "aficio@coolorange.com"
$emailTo = $user.Email # "marco.mirandola@coolorange.com"
$emailBody = "
Dear $($user.FirstName), the file <b>$($file.Name)</b> transitioned to state <b>$($toState.DispName)</b>, as <b>$($causer.FirstName)</b> changed the state.

<img src='cid:image.bmp' alt='' />
<a href='$fileUrl'>$($filePath)</a>

If you would like to contact $($causer.FirstName) for further explanation, send him an email at $($causer.Email)

sincerely, your humble powerJobs servant
"
Add-Log -Text "Sending email to $emailTo, from $emailFrom, with subject $emailSubject"
Send-MailMessage -From $emailFrom -To $emailTo -Subject $emailSubject -Body $emailBody -BodyAsHtml -Attachments @("c:\Temp\image.bmp") -SmtpServer "10.0.0.26"
#endregion

Add-Log -Text "Job completed"

Now, the really interesting line is the one near the end, where the Send-MailMessage commandlet is called with simple parameters that even non-developers can easily understand. The variables above (all starting with $) should also be quite clear. In order to send a message with some individual Vault-related content, such as file information or details about the lifecycle transition, we have to "talk" to the Vault server and gather some more details.

Now, the good news is that for certain typical activities, powerJobs provides simple-to-use commandlets. Here is an example: we want to know the file name, who performed the last change, and in which folder the file is located. If we did this with the plain Vault API, the tasks would be:

  • getting the properties definitions
  • getting the properties for the given file
  • getting the folder where the file resides

This would be about 5 to 10 lines of code. With powerJobs, the $file variable, set at the top of the script file, already contains all system and user-defined properties as well as information about the folder location. So, one line does it all for you. Quite handy, huh?
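
For example, these few lines (a small sketch reusing the property names that already appear in the script above; $file is set at the top of the job) would log exactly that information:

Add-Log -Text "name: $($file.Name)"
Add-Log -Text "last change by: $($file.'Created By')"
Add-Log -Text "located in: $($file.'Full Path')"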

As we'd like to provide some useful information in the email, we have to identify the lifecycle transition we are in. As powerJobs does not yet provide simple commandlets for this task, we will use the Vault API. These are the lines:

#region collect information on lifecycle transition
$lfcTransId = $powerJobs.Job.Params["LifeCycleTransitionId"] #get the ID of the lifecycle transition the job has been triggered for
$lfcStateTransition = $vault.LifeCycleService.GetLifeCycleStateTransitionsByIds(@($lfcTransId)) #get the definition of that lifecycle transition
$lfcStates = $vault.LifeCycleService.GetLifeCycleStatesByIds(@($lfcStateTransition[0].FromID,$lfcStateTransition[0].ToID))
$fromState = $lfcStates[0]
$toState = $lfcStates[1]
Add-Log -Text "file $($file.Name) transiting from '$($fromState.DispName)' to '$($toState.DispName)'"
#endregion

Additionally, we'd like to add a hyperlink to the email body that leads the user directly to the related file inside Vault. Such a hyperlink must be created dynamically, and these are the relevant lines:

#region build hyperlink to Vault file
$serverUrl = New-Object System.Uri($vault.AdminService.Url)
$filePath = $file."Full Path"
$filePathUrl = $filePath.Replace("$","%24").Replace("/","%2f").Replace(" ","+")
$fileUrl = $serverUrl.Scheme+"://"+$serverUrl.Host+"/AutodeskDM/Services/EntityDataCommandRequest.aspx?Vault=Vault&ObjectId=$($filePathUrl)&ObjectType=File&Command=Select"
Add-Log -Text "hyperlink: $fileUrl"
#endregion

That’s it!

Now, in order to try it out, you have to configure automatic job queuing for the lifecycle transitions you are interested in. This can be done via the LifecycleEventEditor provided by Autodesk. The LifecycleEventEditor is one of the many little secrets of Vault. The tool can be found in the Vault SDK (C:\Program Files (x86)\Autodesk\Autodesk Vault 2015 SDK\util\LifecycleEventEditor). The setup for the SDK can be found in the ADMS server folder (C:\Program Files\Autodesk\ADMS Professional 2015\SDK) or in the Vault client folder (C:\Program Files\Autodesk\Vault Professional 2015\SDK). Run the setup and then start the LifecycleEventEditor. Select your preferred lifecycle definition, lifecycle state and transition, then use the "Actions" menu to add the job to that transition and commit the changes. The job name is simply the name of your PS1 file without the extension.

LifecycleEventEditor

When you now transition your file from one state to another, this might be the email that shows up in your inbox:

powerJobs email

Doesn’t it look cool??? So, now that you have the technique, you can extend this job to any other scenario.

Bottom line: with powerJobs it's super simple to create a new job, and thanks to the new commandlets it's also very simple to deal with the Vault API, even without developer skills! Have fun!!

——

P.S.: if you'd like to test the script and don't have an SMTP server at hand, you may want to use your Google account. In order to do that, you have to pass a few more arguments to Send-MailMessage, such as your credentials and the SMTP port for Gmail.

Send-MailMessage ....... -SmtpServer "smtp.gmail.com" -UseSsl -Credential $cred -Port 587

You see that the last arguments are -UseSsl for a secure connection, -Credential with a variable containing your user ID and password, and -Port 587. Now, $cred needs to contain your username and password, and the commandlet Get-Credential does this for you.

$cred = Get-Credential

It will ask you interactively for your username and password. This is great; however, you don't want to enter these details every time the job gets executed, and since the job runs silently you wouldn't even have a chance to enter them. For this purpose, I'd suggest executing the command once in a regular PowerShell session and storing the information in an XML file that can then be reused by the job. Here is the code:

$cred = Get-Credential
$cred | Export-Clixml c:\temp\cred.xml 

Now your credentials are stored in a local file. This file can be read on every job run like this:

$cred = Import-Clixml c:\temp\cred.xml
Send-MailMessage ....... -SmtpServer "smtp.gmail.com" -UseSsl -Credential $cred -Port 587

Once this is set up, you may still get trouble from Gmail, as it does not allow every application to use its SMTP server. In order to change that setting, log in to your Google account, follow this link https://www.google.com/settings/security and set "Access for less secure apps" in the "Account permissions" section to "Enabled". Of course you should set it back once you've finished your testing/demo with this job.

So, now you should be able to use Gmail as your SMTP server.
