Cloudy times…


This week we are attending the Forge Accelerator event in Munich, together with a lot of great Autodesk people and other Forge fans like us. Btw., Forge is the name under which Autodesk combines all its new cloud APIs. The available APIs and their documentation, as well as more details about Forge, can be found on the Autodesk Forge pages.

This week we took a closer look at the Data Management API, the Design Automation API and the Viewer API. Today I'll share our findings about the Data Management API.

The Data Management API is responsible for dealing with file objects: upload, download, setting references, and the like. Surely it will soon be extended with more capabilities.

From a user perspective there is a bit of confusion: there is A360 Drive, which is a pure file collaboration service similar to Dropbox, OneDrive, Google Drive, etc., then there is A360, and then there are BIM 360 Team and Fusion Team. For more clarity about the naming, have a look at the respective Autodesk pages.

A360 Drive has a desktop sync tool that syncs your files with it; however, there is no API. A360 is accessible through the Forge Data Management API, and that is what we will talk about here. It now has a desktop sync tool called Autodesk Drive, which you can download through Fusion Lifecycle, as seen in this video. Actually, A360, Fusion Team and BIM 360 Team are just the names of the consumer applications; the technology underneath is all based on the Forge APIs. So, at the moment it's all in evolution, but Autodesk is moving very fast, so if you read this blog post in a few months, it might no longer be accurate and the products, names and technologies may be well streamlined by then.

Anyway, let's talk about the Data Management API. It allows you to create hubs, projects, folders, items (files), versions, and references. So, you can create your custom upload tool for bringing all your data into the cloud, the way you want. As an example, we thought about how we could bring the files from Vault into the cloud. The sync tool (now called Autodesk Drive) will just upload the latest version of your files, but what if you want to carry over the whole history? A simple approach could be to export the data from Vault via BCP, which generates a dump of all the data on the local disk as XML files, then process the BCP package and upload the files, including the versions, into A360. This way, you would not only get the latest version of your files, but also the whole history. This is pretty cool!
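To keep the history intact, the versions simply have to be uploaded oldest first. Here is a minimal sketch of that loop; Publish-A360Version is a hypothetical placeholder for the actual Data Management API calls (create storage, upload the bits, create the version), so only the ordering logic is shown:

```powershell
# Hypothetical sketch: push a file's Vault history to A360 oldest-first.
# Publish-A360Version is a placeholder for the real Data Management API
# calls; only the ordering logic is real here.
function Publish-FileHistory($versions)
{
    # upload in ascending version order, so A360 receives v1 before v2, etc.
    foreach ($version in $versions | Sort-Object -Property VersionNumber) {
        Publish-A360Version $version
    }
}
```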

We played with file versions and checked whether the files must be uploaded in the right sequence, or whether it's possible to upload older versions at a later point in time. We could not make it work in the short time, but it seems to be possible, and we will investigate further. We also played with references. We could recreate the references of an Inventor assembly, and they were visible in the web user interface; however, the references were not resolved, as the viewer was not able to display the assembly. It turned out that at the moment only Fusion 360 references are supported. However, thanks to one of the Autodesk fellows, we got a sneak peek at an upcoming API extension where Inventor references are supported as well. So, I guess that very soon the scenario we are looking for will be possible. Exciting!!!

Our takeaway is that the Data Management API already offers all the basic functions for creating projects, uploading files and versions, setting references, etc. The current documentation is pretty good, there are already lots of examples, and from what we've seen, there is more to come very soon.

Posted in Forge | 1 Comment


Have you ever had the need to display, check and process structured data in a custom way? Let's say you want to release a complete assembly and perform a series of custom checks and actions on that structure. Please welcome powerTree!

I know, we need to find a new technology so that we can come up with new product names. But meanwhile, our love (and obsession) for PowerShell should be well known. And it will only grow, now that PowerShell has gone open source and is available on many platforms, and PowerShell version 5 has introduced debugging capabilities.

Anyway, back to our problem. In recent projects, we faced the need to navigate through a structure and perform custom checks and operations. In our particular case, we had to check a quite large assembly, including all drawings, and test whether all components comply with the company's business rules. We actually had to release the complete assembly, including drawings, but in cases where for some reason a drawing could not be released, its parent assemblies could not be released either. The default Vault dialog cannot do this, so we had to invent something new.

Instead of building a one-off solution, we decided to create a little tool that gives us the ability to do this kind of customization in a flexible and repeatable way. So, we developed a dialog that shows data as a structure, with configurable actions. All the logic is once again in a PowerShell script, so that we can define and tweak the logic as needed without touching the dialog's source code. This way, the dialog becomes useful in many different situations.

Here is the sample implementation from our project. First, we had to collect the structure of the assembly, including all the drawings. Second, we had to check the compliance of the components bottom-up. Third, we had to perform cleanup actions on the problematic components.

For collecting the components with drawings, we had to integrate the drawings into the structure, in order to ensure that an upper-level assembly can only be OK if all its children and related drawings are OK too. As drawings are parents of the model, we switched the positions of drawing and component, so that the drawing becomes a child of the parent assembly and the model becomes a child of the drawing. OK, have a look at this picture, it makes things easier:


In the example above, the assembly 10001.iam contains a child 10018.iam. However, the child 10018.iam has one or more drawings. Therefore, we switch the positions of model and drawing, so that 10018.idw becomes the child of 10001.iam. This way, if we now perform our check and the drawing 10018.idw has a problem, the top assembly 10001.iam cannot be released.
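In code, this swap is a simple restructuring of the tree. Here is a hypothetical sketch; the node shape with Name, Children and Drawings properties is illustrative, not the actual powerTree data model:

```powershell
# Hypothetical sketch: insert each drawing between a component and its
# parent, so the drawing becomes the child of the parent assembly and
# the model hangs below the drawing.
function Move-DrawingsBetween($node)
{
    $newChildren = @()
    foreach ($child in $node.Children) {
        Move-DrawingsBetween $child   # restructure the subtree first
        if ($child.Drawings.Count -gt 0) {
            # the drawing takes the child's place; the model moves below it
            $drawing = $child.Drawings[0]
            $drawing.Children = @($child)
            $newChildren += $drawing
        } else {
            $newChildren += $child
        }
    }
    $node.Children = $newChildren
}
```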

Another requirement was to be able to work in Vault, while the dialog is open. So, in case where the check finds errors, the user can fix the problems in Vault while keeping an eye on the list of problematic files.

The dialog can be extended with custom actions, for instance "Go To Folder", which makes it simple to jump to the problematic file in Vault and fix the problem. However, any sort of action can be configured. The same goes for Check and Process: both buttons run a PowerShell function, so what happens during the check and the processing is up to you.

As mentioned earlier, you can load whatever you want into the dialog, the way you want. It just has to be a structure (a parent-child relation), and the object types can be mixed. Here, for instance, we load the folder structure and the corresponding files in one dialog:


Here is another example, where we load the file structure, as in the first case, but also load the corresponding items:


Technically, it's also possible to combine external data sources. For instance, in the case above, the items might come from the ERP system in order to check whether every relevant component has a valid item.
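Such a check can be expressed as a small recursive function. This is a hypothetical sketch of what a configured Check step could look like, not the actual powerTree configuration API:

```powershell
# Hypothetical sketch: a component is OK only if the ERP knows its part
# number AND all of its children are OK (bottom-up, as described above).
function Test-HasValidItem($node, $erpItemNumbers)
{
    $ok = $erpItemNumbers -contains $node.PartNumber
    foreach ($child in $node.Children) {
        if (-not (Test-HasValidItem $child $erpItemNumbers)) { $ok = $false }
    }
    return $ok
}
```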

So, the possibilities are endless. If you have the need to perform custom checks or custom operations over a custom data structure, then powerTree might be the cost-effective way to go. Talking about costs: this is not a product you can download or purchase. Given the complexity of the topic, we provide access to this tool in the context of a project. So, if you have a need for powerTree, just get in touch with us and ask about powerTree. We will then discuss your requirements with you, make sure that powerTree is the right tool for you, and define how the configuration will be done.

I hope I was able to give you an overview of what powerTree can do for you, and I look forward to your feedback.


Posted in Data Standard | 1 Comment

Let’s talk…


Over the past years, we have had the pleasure of providing you with information, tips, samples, code snippets, and first and foremost ideas on what can be done with Autodesk Data Management and the other topics we work with. This was and still is our primary goal: to spark your imagination about what the products and technologies you already have at your fingertips could do for you, with minimal or reasonable customization effort.

So far your response has been great, in terms of page views, subscriptions to this blog, and comments. Thank you very much for following us and giving us the energy to continue on this path!

We recognized that we could do better, especially in reacting to comments. Most comments are contributions, corrections or simply appreciation. Thanks again for all this support! Some comments are requests for help: some code does not work as expected, there have been problems in the interpretation or implementation, or maybe the code is outdated. Reacting to those comments via the comment section of the blog is a bit cumbersome. Therefore, we activated a forum where topics like this can be better discussed.

Thus, from now on, if you have a comment, contribution, correction, or just want to share your appreciation, please continue to use the comment section of this blog. That is the right place for such comments.
In cases where you have issues with a posted idea or code sample, want to get into a discussion with us and others, or want to leave a comment, suggestion or idea that is off topic and does not fit any existing blog post, the forum is the better place. There we can have a relaxed and more extensive conversation. The conversation will still remain public, and so accessible to everyone, and will therefore extend the content of the blog.

While the forum content is visible to guests, you will have to sign up in order to contribute. I do encourage every one of you to take advantage of this additional communication platform.

When using the forum, please keep in mind that the blog posts are deliberately kept short and simple, with the purpose of showing the way and creating interest and excitement. Therefore, the code published on this blog, although it does work, may not always be immediately applicable in a production environment. Also, we pursue new topics and ideas, and seldom update old topics to new versions. The forum could be a good place for discussing such things.

Also, keep in mind that both the forum and the blog are free of charge, although the effort of writing the posts and following up on the forum is tangible. Therefore, if you need personal and timely assistance, please take advantage of our paid support. The support team can help you by reviewing your code, doing a remote session to take a look at your machine, writing a custom code snippet, and more. If you need this type of professional support, just reach out via email and you will then receive more detailed information. Either way, feel free to use the forum to start the conversation.

Also, don't forget the Autodesk Vault Customization forum for Data Standard related topics and other Vault customization issues. We are quite active there as well. In case you're not sure which forum to use for Data Standard questions, the Autodesk forum is always a good choice: there is already a large VDS community there, it's visible to Autodesk, and questions and discussions around VDS are quite interesting for a larger audience.

With our new forum, we really want to provide a better way to interact on coolOrange blog related topics. We hope you will enjoy the new communication tool. We definitely will enjoy the conversation with you.

Posted in Uncategorized | Leave a comment

Debugging VDS


Are you running Windows 10? Then I have good news for you! You already have PowerShell 5 installed, and with it the capability of debugging Vault Data Standard step by step. You are not running Windows 10? Then continue reading, as PowerShell 5 can also be installed on earlier Windows versions.

With PowerShell 5, Microsoft introduced new command-lets that let you connect to a running application, such as Vault, Inventor or AutoCAD, and debug the PowerShell runspace hosted by that application. The corresponding PowerShell script will be opened in your PowerShell editor, so that you can execute the code line by line, see the current state of variables, and watch how the code behaves. This is brilliant if you have a nasty problem in your code and want to figure it out.

Let's see how this works. Suppose you want to debug the Vault Data Standard New File dialog. Start the PowerShell ISE and connect to the Vault application. For this purpose, there is a new command-let, Enter-PSHostProcess -Name <ProcessName>, where <ProcessName> is in this case Connectivity.VaultPro. You can enter Enter-PSHostProcess -Name Connectivity.VaultPro either on the command line, or write it in an empty script, select the line and execute it. You will notice that your command prompt changes from PS C:\> to something like [Process:9436]: PS… . This means that you are now connected to the PowerShell engine of the hosting application. The hosting application, in our case Vault, might work with several runspaces, which is the case for VDS for Vault. With Get-Runspace you can query the list of available runspaces. You may get something like this:


As you can see, here we have several runspaces. One of them is called RemoteHost: this is our PowerShell editor, so we can ignore it. More interesting are the other runspaces. The question is, which runspace is the one we want to debug? In order to find the right one, we can use a trick: we bring up a message box from within the VDS code that tells us the runspace ID and also holds the code at that position, so that we have time to start debugging that runspace. For this purpose, I wrote a little function called ShowRunspaceID, which looks like this:

function ShowRunspaceID
{
            $id = [runspace]::DefaultRunspace.Id
            $app = [System.Diagnostics.Process]::GetCurrentProcess()
            [System.Windows.Forms.MessageBox]::Show("application: $($app.ProcessName)"+[Environment]::NewLine+"runspace ID: $id")
}

Just create a new file, for instance ShowRunspaceID.ps1, copy and paste the code into the file, and save it next to the Default.ps1 file. Restart Vault, so that the new file gets loaded by VDS. Now you can add ShowRunspaceID at any place in your code. As an example, you can add it at the beginning of the InitializeWindow function, like this:
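A minimal sketch: only the ShowRunspaceID call is the addition, the rest of your existing InitializeWindow body stays as it is.

```powershell
# Call the helper first, so the message box pauses the code right at
# dialog initialization and shows you the runspace ID to attach to.
function InitializeWindow
{
    ShowRunspaceID
    # ... your existing initialization code follows here ...
}
```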


Now create a new file or folder in Vault via VDS, and you should see the message box with the application name and the runspace ID. Go back to your PowerShell editor and enter the command Debug-Runspace -id <ID>, where <ID> is the ID shown in the message box. A short message should "welcome" you to debug mode. Now just close the message box, so that the code can continue. You will notice that the editor opens the PowerShell script file and highlights the line that is currently being executed. From here you can step through the code with F10 (step over) or F11 (step into). To exit debug mode, either stop debugging in the PowerShell editor or just type detach in the console.

In summary, these are the two command-lets that you need in order to start debugging your VDS code:

Enter-PSHostProcess -Name Connectivity.VaultPro
Debug-Runspace -id 50

Just set the process name and the runspace ID according to the text in the message box.

We hope this new capability will help you identify issues in your code more quickly, or help you understand the behavior of your VDS environment. Enjoy!


Posted in Data Standard, PowerShell | 1 Comment

VDS 2017

Data Standard 2017 was released several weeks ago, with some interesting improvements that led us to update the VDS quick reference guide. VDS 2017 brings a lot of little internal improvements which make customizing simpler. So far, there were quite some differences between VDS for CAD and VDS for Vault, and these differences have been streamlined.

Aside from the internal improvements, there is one new feature around the Save Copy As function in VDS for Inventor. In the early days, all save functions were mapped to the VDS dialog. So, regardless of whether you used Save, Save As, or Save Copy As, in all three cases the VDS dialog showed up. Early customers immediately demanded the ability to perform a Save Copy As without VDS, in order to save files in different formats, and this is the behaviour you know today: Save and Save As are caught by the VDS dialog, while Save Copy As is still the original Inventor dialog, where you can save the file in a different format, wherever you want.
With VDS 2017, there is now an additional Save Copy As function within the VDS ribbon. It allows you to save, or actually export, your file in other formats directly into Vault. So, the VDS dialog still shows up and asks you to complete the form, but it also lets you save the file in other formats and stores the result directly in Vault. The original Save Copy As remains unchanged, and a new Save Copy As has been added. There is a very nice post on the cadline community page with a video that gives a good overview. If you would like to know more about this feature, leave a comment, and we can dig into it in a separate post.

Let's get back to the internal changes in 2017. You will notice that the _Category and _NumSchm properties are now exposed in VDS for CAD as custom properties called VDS_category and VDS_NumSchm. This makes it simple to align the category chosen in CAD with the one assigned by Vault: you can map a user-defined property in Vault to the VDS_category property of Inventor and AutoCAD and use the new property for defining a simple rule.

A very important change that will remove a lot of customization is the new logic between category and numbering scheme: if your numbering scheme has the same name as the category, it will be automatically preselected. Just try creating a category "drawing" with a numbering scheme "drawing" and a category "models" with a numbering scheme "models". You will see that as soon as you select drawing as the category, the numbering scheme is set accordingly, and as soon as you switch to models, the numbering scheme changes too. You can still choose another numbering scheme, but when both have the same name, the pre-selection is automatic. That simple!

A new internal property called _OriginalFileName has been added. It's useful when a copy operation is performed and you want to know the original file name. Via $Prop["_OriginalFileName"].Value, you can now get the file name of the original file.

The internal properties _CreateMode and _EditMode are now available in VDS for Vault as well, so it is now simple to know which state the dialog is in.

Also, a lot of logic that was previously inside the XAML file has been moved to the Default.ps1. This way, the logic is in PowerShell and the graphics are in XAML, which makes it easier to influence the behaviour. You'll now find new PowerShell functions such as IsVisibleNumSchems, ShouldEnableFileName and ShouldEnableNumSchms, which define the standard behaviour of the numbering scheme. Previously, this logic lived inside the XAML with triggers and the like. Now the XAML points to these functions, and here you can easily define the behaviour.

One big change in VDS 2017 is the move from mymenu.mnu to MenuDefinitions.xml. This change had been overdue for a while. By moving to an XML file, a lot of bugs with special characters and the like have been fixed, and the logic should now be simpler. You can find more details on the Autodesk knowledge page.

Working with VDS and custom objects should now be much simpler. You get a VDS dialog template (CustomObject.xml, CreateCustomObject.ps1 and EditCustomObjects.ps1) which can be used with any custom object. Just add the corresponding menu item for your custom object in the MenuDefinitions.xml. There you'll find a sample for a custom object called CustomObject :-). So, just duplicate the CustomObject element in the CommandSite section and rename the Location property to the name of your custom object. Of course, it would be better to also duplicate the menu items in order to have custom menu labelling.
For custom objects, a new automatic logic has been introduced: just create a category with the same name as the custom object, and you are done. The next time you create a custom object, that category is automatically set. If you want to enforce that category, just set the combo box to read-only or inactive in the XAML file, and the user will not be able to change the category. If you also have a numbering scheme with the same name, then that is automatically assigned as well. Done!

There is one more thing! With VDS 2017 you now have the application and document objects available for Inventor and AutoCAD. Within your PowerShell script, you can access the native API via $Application and $Document. This allows you to interact more deeply with the hosting application. Of course, you will have to know the respective application API, but if you do, you now have many more options.
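For example, with Inventor you could read a native iProperty straight from the document. Here is a sketch, assuming $Document holds an Inventor Document object; "Design Tracking Properties" and "Part Number" are the standard Inventor property set and property names:

```powershell
# Sketch: read the Part Number iProperty from a native Inventor document
# object, such as the one VDS 2017 exposes as $Document.
function Get-PartNumber($document)
{
    $designProps = $document.PropertySets.Item("Design Tracking Properties")
    return $designProps.Item("Part Number").Value
}
# inside a VDS script for Inventor you would call: Get-PartNumber $Document
```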

In my view, with this version of VDS, a lot of typical scenarios are now covered by the standard or can be accomplished in a very simple way.

Here is the updated Data Standard Quick Reference 2017. Have fun with VDS 2017! We already have…


Posted in Data Standard | 3 Comments



Hurray!!! The 2017 products are ready for download! We made the beta versions available quite a while ago and received great feedback! Thanks to all of you who helped us get this release done. You can go to the website and start with the download.

Especially in powerVault, bugs were reported in the Get-VaultFileBOM command-let, which retrieves the Vault BOM information attached to a file. In complex assemblies with phantoms, there were some issues with quantity calculation and the like. As powerVault is a core component of powerJobs and powerGate, and we know that you also use powerVault a lot with Data Standard and other things, we wanted to get this problem sorted out before releasing the new version. It was the needle in the haystack; it took some time, but we finally removed the needle.

One very important change with 2017 is backward compatibility. You will notice that all Vault related products are now branded with version 17, regardless of whether you install them on Vault 2017, 2016 or 2015. The reason is that we took the effort to create one single code base for all supported Vault versions. In other words, a version 17 product will have the same features, look & feel and behavior no matter whether you are running Vault 2015, 2016 or 2017. It's one and the same product, with the appropriate setup for the respective Vault version. Moving forward, as we add more features, those will also be available for all supported Vault versions. So, customers under subscription will benefit from new features even though they might stay on an older Vault version for a while. There will be just one set of documentation, and you don't have to worry that a cool new feature might only be available for the latest Vault version. It makes support easier, and the compatibility of customizations is guaranteed across versions. This is something we had been working towards for a while, and finally it's here. We hope you will enjoy the simplification and the benefits.

Talking about compatibility, we have to make an admission. You know that we care a lot about making sure that your investment in customization does not get lost when moving to a new release. We take this quite seriously. With the new powerJobs version, we had to make a break. So far, the creation of PDF and other file formats was performed by one single function (Save-FileAs), which did everything: downloading the files from Vault to a local workspace, opening the file with the according application, exporting to the desired format such as PDF, closing the file and cleaning up the workspace. This worked great for years, but you asked for more, and we listened.
With the new powerJobs, we split the Save-FileAs function into its core components: download, open, export and cleanup. This way, you have more control over the individual actions. For example, you can now use the FastOpen feature of Inventor in order to speed up the opening process for a drawing. So, when you download the files from Vault, you might download just the drawing (without components) and open the drawing with the FastOpen argument (or a level of detail, and other options). This improvement alone has already changed the life of some customers: with very large assemblies, the PDF creation for a drawing took more than 30 minutes, and now it's down to less than a minute.
You can also be more specific on the export, by passing arguments to generate PDF files with no line weights, export DWG in a specific version, or DWF with just outer lines, and more. You can also export all the formats at once, and so reduce the number of jobs and the repetitive downloading and opening of files.
Additionally, you get easy access to the core application API, such as Inventor, so that before, during and after the export you can do some more fancy stuff.
The bottom line is that this change was necessary: it brings many new possibilities and guarantees compatibility for the future, as the respective functions are now smaller and thus less affected by changes. When installing the new powerJobs, you will notice that we now deliver a lot more samples, where you can see all the improvements we made.
As I mentioned before, we take the compatibility topic seriously. Therefore, on the wiki you will find the documentation for the upgrade, and also a PowerShell script which makes your existing jobs run with the new version as well. We basically provide the "old" functions in the form of a downloadable script, which internally uses the new functions. Just follow the instructions and your jobs should run with the new powerJobs without additional effort. And in case you still need help, just reach out to our support. However, we strongly recommend that you embrace the new command-lets, as they provide way more flexibility and much better error handling.
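As a sketch, a minimal PDF job with the split functions could look like the following. The command-let names and parameters here are assumptions based on the shipped samples; check the wiki and the delivered jobs for the exact signatures:

```powershell
# Hedged sketch of the new split: open (with FastOpen), export and close
# as separate steps instead of one Save-FileAs call. The cmdlet names and
# parameters are assumed from the powerJobs samples, not authoritative.
function Publish-DrawingPdf($file, $targetPdf)
{
    # open just the drawing quickly, skipping the referenced components
    Open-Document -LocalFile $file.LocalPath -Options @{ FastOpen = $true }
    Export-Document -Format 'PDF' -To $targetPdf
    Close-Document
}
```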

Where is powerGate? It's on the way! Since we started with powerGate about 2 years ago, we have had the pleasure of doing quite a few interesting projects with you and ERP systems such as SAP Netweaver, SAP Business One, Microsoft Dynamics NAV (Navision), ABAS, ProAlpha, Sage, and many more, with more on the way. We learned a lot from those projects and decided to improve powerGate as well. Especially the powerGate client command-lets are being reworked. We are adding much better and more extensive logging in order to identify issues more quickly. New command-lets are coming in order to reduce the amount of code in the scripts and keep them even simpler and shorter, and the error handling becomes much simpler. I will tell you more about this in a few weeks as we come closer to the release date.
As neither the powerGate client nor the powerGate server is bound to a specific Vault version, you can use the 2016 version and run it with Vault 2017. However, I bet you will quite enjoy the new powerGate, and we are working hard to get it ready for you in the next 4 to 6 weeks, so stay tuned!

Over the year, we will have some more quite interesting announcements, but for now, I wish you all the best with the new 2017 products!

Posted in Uncategorized | Leave a comment

VDS custom numbering


With Data Standard 2015, number generation was integrated into the VDS dialog. That is great! Just configure the numbering schemes and you can use them right away in the VDS dialog. If you want to offer only certain numbering schemes for a given file type or category, you can customize the GetNumSchms function in the Default.ps1 in order to filter the numbering schemes according to your needs.
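As a reminder, such a filter can be quite small. Here is a hypothetical sketch that only offers the numbering scheme matching the selected category; $vault and $Prop are provided by the VDS runtime, and the exact shape of the default GetNumSchms in your Default.ps1 may differ:

```powershell
# Sketch: offer only the numbering schemes whose name equals the
# currently selected category (assumes scheme names match categories).
function GetNumSchms
{
    $schemes = @($vault.DocumentService.GetNumberingSchemesByType('Activated'))
    $category = $Prop["_Category"].Value
    return $schemes | Where-Object { $_.Name -eq $category }
}
```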

But in case you need something more custom, you can go even further. In this post I'd like to show you how to manage a more complex numbering scheme. We will have a number made of several components, where the values come from properties of the VDS dialog. Additionally, we want to either generate a number for the combination of properties, or generate a new sheet for a given number.

In the standard dialog you have the list of numbering schemes, the preview for the selected numbering scheme (which also allows you to enter the data needed to generate the number), and the file name text box. The numbering scheme preview control provides the ability to capture the user's input, but in our case we would like to have independent properties on the dialog instead.

Therefore, we will remove the numbering scheme selection, the preview and the file name text box, and handle the number generation completely ourselves. The resulting dialog looks like this:


Now that all these controls are removed, no number is generated at all when a new file is created. We will generate the number when the user hits the OK button. The appropriate function is OnPostCloseDialog. Here we can set the DocNumber property to our value. To do this, we will call the Vault number generator with our custom arguments and thus generate a number the way we want.

In our example, the number looks like this: [Site]-[Discipline]-[Area]-xxxx-yy, where xxxx is the number and yy is the sheet number. So, a number could be IT-SK-01-1234-01. We will either generate a new number (with sheet 01), or increase the sheet number on file copy.

To get this working, we need two numbering schemes: one for generating the number with sheet 01, and one for generating incremental sheets. Here are my two numbering scheme definitions.


We also need some properties where the user can select the values. In my case, with Inventor, I just added further properties to the Inventor.cfg file. Such properties do not necessarily have to exist in Vault, unless you want this behavior in Vault as well. This is how my Inventor.cfg looks:

<PropertyDefinition PropertyName="Site" DataType="Text" />
<PropertyDefinition PropertyName="Discipline" DataType="Text" />
<PropertyDefinition PropertyName="Area" DataType="Text" />
<PropertyDefinition PropertyName="Dwg No" DataType="Text" />
<PropertyDefinition PropertyName="Sheet" DataType="Text" />
<PropertyDefinition PropertyName="NewSheet" DataType="Boolean" />

In the XAML dialog, I removed/commented out the numbering scheme selection and the numbering scheme preview.


Instead of the file name text box, I added three combo boxes for Site, Discipline and Area, and also a check box which tells me whether just the sheet number should be increased.

<Label Content="{Binding UIString[LBL6], FallbackValue=File Name}" Grid.Row="14" Grid.Column="0" />
<!--<TextBox Grid.Row="14" Grid.Column="1" x:Name="FILENAME" Style="{StaticResource FileNameStyle}" Margin="0,2" />-->
<StackPanel Grid.Row="14" Grid.Column="1" Orientation="Horizontal">
   <ComboBox Text="{Binding Prop[Site].Value}">
      <ComboBoxItem Content="IT"/>
      <ComboBoxItem Content="UK"/>
      <ComboBoxItem Content="DE"/>
   </ComboBox>
   <ComboBox Text="{Binding Prop[Discipline].Value}">
      <ComboBoxItem Content="SK"/>
      <ComboBoxItem Content="XR"/>
   </ComboBox>
   <ComboBox Text="{Binding Prop[Area].Value}">
      <ComboBoxItem Content="01" />
      <ComboBoxItem Content="02" />
   </ComboBox>
   <Label Content="####"/>
   <Label Content="##"/>
   <CheckBox Content="new sheet" VerticalAlignment="Center" IsChecked="{Binding Prop[NewSheet].Value}" Margin="20,0,0,0"/>
</StackPanel>

Now we need a function to do the work. I created the following function, which could be placed in a custom PS1 file (recommended) or, for testing purposes, in the Default.ps1 (not recommended):

function GenerateNumber
{
  $numSchems = $vault.DocumentService.GetNumberingSchemesByType('Activated')
  $DrawingSchm = $numSchems | Where-Object { $_.Name -eq "DrawingNo" }
  $SheetSchm = $numSchems | Where-Object { $_.Name -eq "SheetNo" }
  $prefix = $Prop["Site"].Value +"-"+ $Prop["Discipline"].Value +"-"+ $Prop["Area"].Value
  if ($Prop["NewSheet"].Value)
  {
    # new sheet for an existing number: include the drawing number in the
    # prefix and let the sheet scheme generate the next sheet suffix
    $prefix += "-"+ $Prop["Dwg No"].Value
    $newNumber = $vault.DocumentService.GenerateFileNumber($SheetSchm.SchmID, @($prefix))
    $Prop["Sheet"].Value = $newNumber.Substring($prefix.Length+1)
  }
  else
  {
    # brand new number: the drawing scheme generates the next drawing number
    $newNumber = $vault.DocumentService.GenerateFileNumber($DrawingSchm.SchmID, @($prefix))
    $Prop["Dwg No"].Value = $newNumber.Substring($prefix.Length+1,4)
  }
  $Prop["DocNumber"].Value = $newNumber
}

In the OnPostCloseDialog function, I then call the GenerateNumber function, like this:

function OnPostCloseDialog
{
  $mWindowName = $dsWindow.Name
  if ($mWindowName -eq "InventorWindow")
  {
    GenerateNumber
  }
}

When you restart Inventor, you should be able to see the result. Create a new file and save it: the number is generated with sheet 01. Make a copy via the Data Standard ribbon and check the "new sheet" checkbox, and the number remains the same while the sheet number gets incremented.

In summary, we removed/replaced the VDS numbering with custom number generation. We can feed the number generator with properties from the dialog, and have complete freedom.

I hope you enjoy!

Posted in Data Standard | 1 Comment