Querying the Vault database

One of the most common complaints I hear about Vault is that there is no way to access its database or to query it. Well, besides the fact that touching the Vault database directly is really a bad idea, the question is what querying the database actually means. If it’s about reading information from it or writing information into it, then yes, it’s definitely possible – via the Vault API.

I know, you say “I’m not a developer” or “I don’t always want to build an application for some queries”. Well, you don’t have to. In the old days you may have written some SQL statements, so you might be somewhat familiar with SELECT, FROM, WHERE, ORDER, etc. If this is the case, then I have a quite cool new technique for you – Microsoft PowerShell.

It’s a scripting language that comes with Windows 7 for free and is available for XP and Vista too. This language is quite powerful; even though it is a bit reminiscent of the old DOS days, it comes with absolutely fascinating new features. You may wonder “oh boy, another language to learn” – yes, but it’s easy and well-invested time!

Enough, let’s get started. The tools are already on your machine. Install the Vault SDK; the setup is in the Vault 2012 client folder. You will then find the installed SDK in C:\Program Files (x86)\Autodesk\Autodesk Vault 2012 SDK. PowerShell is already on your Windows 7. If you run an older Windows version, then get PowerShell 2.0 from the Microsoft page. Under Start > All Programs > Accessories > Windows PowerShell you’ll find the Windows PowerShell ISE (x86). As Vault is a 32-bit application, you will use the 32-bit PowerShell editor. We suggest using the PowerGUI editor instead; it’s free too, and it’s by far more powerful, with IntelliSense, code snippets, and more.

In our first sample we will just log into Vault and create a new folder under the root $/ folder. The first 3 lines of code load the Vault web services, create a Vault WebServiceManager object, which we will use for talking to the Vault server, and finally log in. Adapt the path to the DLL and the login information to your environment. These first lines come quite close to what you did in the past with SQL in terms of connecting to the database, and now you can do it with Vault.

Add-Type -Path "C:\Program Files (x86)\Autodesk\Autodesk Vault 2012 SDK\bin\Autodesk.Connectivity.WebServicesTools.dll"
$cred = New-Object Autodesk.Connectivity.WebServicesTools.UserPasswordCredentials ("localhost","Vault","Administrator","")
$webSvc = New-Object Autodesk.Connectivity.WebServicesTools.WebServiceManager ($cred)
$rootFolder = $webSvc.DocumentService.GetFolderRoot()
$newFolder = $webSvc.DocumentService.AddFolder("coolOrange", $rootFolder.Id,$false)

In the last 2 rows, you’ll notice we use the $webSvc object, which is now our connection to Vault. On a SQL database we would now work with tables and views; as Vault is based on web services, instead of accessing the document table, we access the document web service. A SQL table has specific columns with specific rules; the document service offers specific functions for documents (files). The first function we use is GetFolderRoot(), and the object we get back has a couple of properties, just as a record has a couple of columns. The second command is AddFolder, which is like an INSERT; this function asks us for the name of the new folder, the parent folder where the new folder should be added, and whether the folder should be of type library or not. Try it, go into Vault and you will see that you have added a new folder. Cool, ha? Just 5 lines, and the first 3 are just for the login.
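To see which “columns” (properties) such an object actually exposes, you can let PowerShell describe it for you – a quick sketch, assuming the $rootFolder and $newFolder objects from the lines above:

```powershell
# List the properties of the folder object, similar to looking at a table's columns
$rootFolder | Get-Member -MemberType Property

# Show a few of those "columns" for the freshly created folder
$newFolder | Format-List Id, Name, FullName, CreateDate
```

Get-Member works on any object that comes back from the Vault web services, so it is a handy way to discover what you can filter and sort on later.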

Actually, the login we made consumes a license, which allows us to read from and write to the Vault server. If you would like to log in read-only, without using a license, then add a $true at the end of the credentials arguments (second line).


$cred = New-Object Autodesk.Connectivity.WebServicesTools.UserPasswordCredentials ("localhost","Vault","Administrator","", $true)

The next sample will get files from a specific folder, so it’s like a SELECT from the document “table”. Once we have the results, we will filter them (WHERE), sort them (ORDER BY) and finally export them to a file. Just add these lines to your script.

$myFolder = $webSvc.DocumentService.GetFolderByPath("$/Designs")
$files = $webSvc.DocumentService.GetLatestFilesByFolderId($myFolder.Id, $false)

Adapt the path in GetFolderByPath to the folder you would like to query. Let’s collect the files within that folder with GetLatestFilesByFolderId. Every object (which could be seen as a “record” in SQL) exposes an ID for unique identification, like the primary key in a database table. Thus, to get files from a specific folder, the function asks us for the folder ID and whether we want hidden files (i.e. DWF) or not.
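Just like a SELECT with a column list, you can also project only the properties you care about before doing anything else with the results – a small sketch based on the $files collection from above:

```powershell
# Show only a few "columns" per file, similar to SELECT Name, CreateDate FROM ...
$files | Select-Object Name, CreateDate, Id | Format-Table -AutoSize
```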

Now that we have the files, let’s write them into a file. Usually we would have to iterate through the list of objects and write them out one by one. PowerShell brings a brilliant concept for doing this more elegantly – the pipeline. While the UNIX pipeline processes strings, the PowerShell pipeline processes objects. So we can pass our collection of files through the pipeline and let something happen, like writing them to a CSV file:

$files | Export-Csv c:/test.csv -Delimiter ";"

You tried it? You’ll now see the content of the files as comma-separated value (CSV) output. The magic is the pipeline symbol |, which takes the object from the left and processes it with the command on the right. Export-Csv is one of the thousands of cmdlets that come with PowerShell. There is a cmdlet for almost anything you’re looking for!
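PowerShell can even query itself, which is the easiest way to find those other cmdlets – a sketch using the built-in discovery commands:

```powershell
# List all cmdlets whose verb is Export (Export-Csv, Export-Clixml, ...)
Get-Command -Verb Export

# Show usage examples for Export-Csv straight from the built-in help
Get-Help Export-Csv -Examples
```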

Now, let’s take the list of files and filter for just those which contain “test” in the name. In the CSV you’ll have noticed that every file object has many attributes (columns). One attribute is Name, which corresponds to the file name. So, try this:


$files | Where-Object { $_.Name -like "*test*" } | Export-Csv c:/test.csv -Delimiter ";"

You see? We have chained two pipes, one behind the other. The first pipeline filters the list of files for names like *test*, and the second pipeline processes the filtered data into the CSV file. The Where-Object cmdlet is a wonderful helper for filtering. Inside the { } we can access the properties of the current object via the automatic variable $_, so $_.Name means the Name property of each single file in the files list. The operators in PowerShell are prefixed with a -, such as -and, -or and -like, which expects the text to compare with. Wildcards can be used to refine the comparison. In this case *test* means everything that contains the expression “test” somewhere. Try also “test*” for begins-with, “*test” for ends-with, etc.
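The wildcard matching itself can be tried out directly on the console, without any Vault objects at all – a few quick experiments with the -like operator:

```powershell
"mytestfile" -like "*test*"   # True  - contains "test" somewhere
"mytestfile" -like "test*"    # False - does not begin with "test"
"testfile"   -like "test*"    # True  - begins with "test"
"mytest"     -like "*test"    # True  - ends with "test"
```

Note that -like is case-insensitive by default; use -clike if you need a case-sensitive comparison.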

Now that we have reduced the list of files to those whose names match *test*, let’s sort them.


$files | Where-Object { $_.Name -like "*test*" } | Sort-Object CreateDate -Descending | Export-Csv c:/test.csv -Delimiter ";"

As you can see, we have chained yet another pipeline. Of course these steps can be done one by one; however, it’s cool to see how filtering (WHERE) and sorting (ORDER BY) can be done at once, and the output exported. With the Sort-Object cmdlet you just specify which property should be sorted in which direction.
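And if, like with a SQL COUNT, you only want to know how many files match, you can end the pipeline with Measure-Object instead of Export-Csv – a sketch, again assuming the $files collection from above:

```powershell
# Count the matching files, similar to SELECT COUNT(*) ... WHERE Name LIKE '%test%'
($files | Where-Object { $_.Name -like "*test*" } | Measure-Object).Count
```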

So, now you know how to access Vault and how to query folders and files. We hope you enjoyed this trip, and we’d like to welcome you on the next one to explore more about PowerShell with Vault. Squeeze your potential!

This entry was posted in PowerShell, Vault API. Bookmark the permalink.

11 Responses to Querying the Vault database

  1. Mathias Ziegler says:

    In Vault 2013 SDK Folder there is no WebservicesTools.dll. Does this Tutorial only works with 2012 ? GZ for the new Blog.

  2. Mathias Ziegler says:

    Oh i found the Solution ;) Sorry

    Change: Autodesk.Connectivity.WebServicesTools.dll has been removed.
    All the code has been moved into Autodesk.Connectivity.WebServices.dll. So no code change is needed, you just have one less assembly to link to and deploy. The WebServicesTools namespace has not changed.

  3. MaxU77 says:

    Our Vault Database contains files with Cyrillic characters in their names.
    The sample above returns questionmarks (“?”) instead of those charachters.
    I wonder Is it some known limitation and if there any way to have Unicode support.
    Thanks in advance.

    • Marco Mirandola says:

      Hi Max, I think the problem you have is with the Export-CSV cmdlet. You can just add -Encoding unicode in order to let the cmdlet encode your strings in unicode or other code pages. The Vault and PowerShell looks to work fine with unicode, just when it comes to write the content down to a file, code pages might be relevant.
      hope this helps

  4. MaxU77 says:

    Another question:
    Does anybody know why there is no some system file properties in the exported csv?
    For example, as a Vault user I’m used to relay on property while doing search. Using the property I can easily separate “Inventor” and “Autocad” files from all the rest filetypes.
    Should I somehow deliberately switch on those missing properties?
    Thanks in advance.

  5. MaxU77 says:

    Sorry, I meant …. to rely on system property…

  6. MaxU77 says:

    seems like this forum “eates” text enclosed to less and more signes.
    My favourite search property name is PROVIDER

  7. I was wondering how to do a search multi level deep. So the top folder and below. The example only scans the top level.

    • Marco Mirandola says:

      in order to scan also sub levels, you have two options. one is to get a list of folder ids and then get the files for such folders, like this:
$rootFolder = $webSvc.DocumentService.GetFolderByPath("$/Designs")
      $folders = $webSvc.DocumentService.GetFoldersByParentId($rootfolder.Id,$true)
      $folderIds = @()
      $folders | ForEach-Object { $folderIds += $_.Id}
      $files = $webSvc.DocumentService.GetLatestFilesByFolderIds($folderIds, $false)

the other is to use the search function, which allows you to set a folder and define whether the search shall be recursive:
      $srchCond = New-Object autodesk.connectivity.WebServices.SrchCond
      $srchCond.PropDefId = 32
      $srchCond.SrchOper = 1
$srchCond.SrchTxt = "ipt"
      $srchCond.PropTyp = [Autodesk.connectivity.WebServices.PropertySearchType]::SingleProperty
      $srchCond.SrchRule = [Autodesk.Connectivity.WebServices.SearchRuleType]::Must
      #$srchSort = New-Object autodesk.connectivity.WebServices.SrchSort
$bookmark = ""
      $status = New-Object autodesk.Connectivity.WebServices.SrchStatus
      $files = $webSvc.DocumentService.FindFileFoldersBySearchConditions(@($srchCond),$null, @($rootFolder.Id),$true, $true, [ref]$bookmark, [ref]$status)

      in order to understand the arguments, especially for the search, you can look into the Vault SDK, or as i did, use the vapiTrace by performing a search in Vault and looking via vapiTrace the arguments passed to the function.

btw, if you use 2015 R2, we will soon release powerVault, which is a set of ready-to-use cmdlets that makes the consumption of the Vault API much easier. So, for instance, getting a list of files will be like Get-VaultFiles -Properties=@("FileExt","ipt"). so, stay tuned!

  8. Pingback: powerVault – Vault API made simple | coolorange
