How to create an Omniscope macro - reusable workflow logic

Modified on Wed, 08 Jun 2022 at 03:21 PM

Macro block functionality allows you to 'compress' a group of data workflow blocks that perform a particular function in your project and package them into a single block with reusable logic. You can then reference that block in other projects, avoiding the need to rebuild the same data transformation multiple times.

Another benefit is that you can share these Macro blocks with colleagues who are connected to the same Team or Enterprise Evo installation.




Shared Macros


Here are the steps to recreate the attached demo (which consists of 2 IOZ files: 1 macro and 1 implementation), so you can import and test the functionality on your side.


1. Create a folder called "Macros" at the root of the folder that will contain the 'macro implementation' files.
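For example, if your implementation projects live in a folder called "Projects" (the folder and file names below are purely illustrative), the layout could look like this:

    Projects/                        <- root folder containing the implementation files
        Macros/                      <- holds the macro-defining projects
            Date ranking macro.ioz
        Implementation demo.ioz      <- project that references the macro block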


2. Create your macro-defining project inside the Macros folder. This project will contain the steps you wish to bundle and replace with a single macro block in your other projects. 


The demo takes 2 fields and performs date ranking on multiple levels, ending up with 9 calculated fields. This example has both macro input and macro output building components, defining where the automation starts and ends. Note that you may have a macro that contains just an input (to automate data extraction from a database, for example), or just an output (to automate publishing to a specific location).
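Outside Omniscope, the kind of logic this demo bundles could be sketched in Python/pandas roughly as follows. The column names and the set of rank fields are hypothetical illustrations, not the demo's actual 9 calculated fields:

    import pandas as pd

    df = pd.DataFrame({
        "category": ["A", "A", "B", "B", "A"],
        "date": pd.to_datetime(["2022-01-03", "2022-02-14",
                                "2022-01-10", "2022-03-01", "2022-03-20"]),
    })

    # Rank the date field on several levels; each rank becomes a calculated field.
    df["rank_overall"] = df["date"].rank(method="dense").astype(int)
    df["rank_within_category"] = (
        df.groupby("category")["date"].rank(method="dense").astype(int)
    )
    df["rank_within_month"] = (
        df.groupby(df["date"].dt.to_period("M"))["date"]
          .rank(method="dense").astype(int)
    )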


3. Create a Data Validation block (this is an optional step) inside the same macro-builder project to define a data schema - a blueprint of the macro's input data. Bookmark this block so you can reuse it in the next project for automated data screening. This step ensures that the macro block receives a correctly formatted dataset (field names, formats, cell values, record count, etc.) and lets you quickly diagnose any issues via the Validate block's 'Problems' tab.
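Conceptually, the screening the Validate block performs is similar to the standalone Python sketch below. The expected field names and types are hypothetical; Omniscope runs this kind of check inside the workflow and reports failures in the 'Problems' tab:

    import pandas as pd

    EXPECTED_FIELDS = {"category": "object", "date": "datetime64[ns]"}

    def validate(df: pd.DataFrame) -> list:
        """Return a list of problems; an empty list means the data matches the schema."""
        problems = []
        for name, dtype in EXPECTED_FIELDS.items():
            if name not in df.columns:
                problems.append("missing field: " + name)
            elif str(df[name].dtype) != dtype:
                problems.append("wrong type for " + name + ": " + str(df[name].dtype))
        if len(df) == 0:
            problems.append("no records")
        return problems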


4. Implementation: create a new project in any folder inside the root folder and connect some input data to the bookmarked Validation block (from the Input block menu > Bookmarks). If no issues are found, proceed with the specific Macro block. You will find it under the Shared Macro blocks section (it's worth giving it a short, meaningful name, in case you end up with multiple macros!).

The resulting dataset should contain the same data fields/formats as the macro project's output - in this demo's case, 11 fields.




Editing your macro - if at any point you wish to make changes to your routine, go to the Macros project location and make the edits there. There is no need to execute the macro project itself; however, you will need to re-execute the implementation projects for the change to take effect.


Local Macros

In some cases the data transformation routine might be relevant to only a small subset of your projects, and you may wish to keep it out of the shared Macros menu. You can achieve this by creating a 'local macro' that isn't visible in the Shared macros menu: navigate to your macro's file location instead (this can be any folder you have access to).




Parameters - you can remotely control execution of your macro block by inserting Project parameters (in the macro-building project) into one or more blocks with a parameter interface. These parameters will appear as 'options' in the macro implementation block, e.g. to influence a database query, filter records or fields, etc.

The user referencing the macro block can define these values in their own project without affecting other instances of the specific Macro implementation.
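As a rough analogy in Python, a project parameter behaves like a function argument with a default value: each implementation project can supply its own value without changing the macro itself. The parameter name below (min_date) is purely illustrative:

    import pandas as pd

    def run_macro(df: pd.DataFrame, min_date: str = "2022-01-01") -> pd.DataFrame:
        """Filter the records before the bundled steps, driven by a parameter."""
        return df[df["date"] >= pd.Timestamp(min_date)]

    # Each 'implementation' can pass its own value for the parameter:
    # result = run_macro(df, min_date="2022-03-01")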


Read more about Omniscope Macros in this article.
