Omniscope Evo allows report authors to build smart data applications and remove multiple steps from the data journey. Omniscope dashboards can be augmented with 'custom views' containing executable elements, turning the dashboard into a true self-service reporting application.
One such scenario enables the report consumer to upload new data and see the changes instantly reflected in the charts, while the data is checked/transformed in the background before being added to the visual report.
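Conceptually, the background check is a schema validation step: the uploaded file is compared against the structure of the original data source before its rows are accepted. The sketch below illustrates that idea in Python with pandas; it is not how Omniscope implements it internally, and the column names and types are hypothetical examples.

```python
import pandas as pd

# Hypothetical schema captured from the original source file;
# in Omniscope this role is played by the Validate data block's schema.
EXPECTED_SCHEMA = {"Date": "datetime64[ns]", "Region": "object", "Sales": "float64"}

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of anomalies; an empty list means the upload passes."""
    errors = []
    missing = set(EXPECTED_SCHEMA) - set(df.columns)
    if missing:
        errors.append(f"missing columns: {sorted(missing)}")
    for col, dtype in EXPECTED_SCHEMA.items():
        if col in df.columns and str(df[col].dtype) != dtype:
            errors.append(f"{col}: expected {dtype}, got {df[col].dtype}")
    return errors
```

A failing upload would produce a non-empty list of anomalies, which corresponds to the 'Error' or 'Warning' outcomes described in the steps below.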
To allow an Omniscope report viewer to upload data files from the web dashboard interface and execute the workflow (without giving them access to the project workflow itself), follow the steps below.
The model is attached: you can either upload the .ioz file (and customise it on your end) or build it from scratch.
- To get started, create a dedicated folder location where users can upload their data files. Inside this folder, create a new Omniscope project.
- Add a text parameter from the workflow's three-dot menu, with a value representing the path to the original data source, e.g. /home/ubuntu/omniscope-server/files/File_Name.csv (alternatively, add a File source block, browse to this location, then copy/paste the path).
- In the File source block, click the parameter link in the Path option to connect it to this text parameter.
- Execute the block to extract data from the source file.
- Add a Validate data block and click the 'Reset to input' Schema builder option (every new dataset will be evaluated against this schema). Set the failure action to 'Error' to prevent data with anomalies from passing through the block.
- Optional: duplicate the Validate block and set the failure outcome to 'Warning', in order to generate a diagnostics report when schema mismatches are found.
- Add a Batch append block and set the Path to your project's folder. This allows Omniscope to 'pick up' and append any new data files to the existing records.
- Connect the batch source to a Field organiser block, which lets you add calculated fields or remove fields that are not needed.
- Add a Deduplicate block as a safeguard, to prevent the same dataset from accidentally being uploaded multiple times.
- Build the dashboard with visualisations and add a custom view called 'Upload and Execute' from the chart library. If this is the first time you are using this view, you may be prompted to download it; after that it will be added to your chart library and you will be able to drag it onto the report interface.
- Insert the names of the blocks you wish to execute when the viewer clicks 'Execute the workflow' (here, just the 'Output' block). Tick the 'Refresh this Report' option so the dashboard refreshes after every execution.
- Time to test the model: click the 'Choose file' button, navigate to the new file on your machine, then click 'Execute the workflow'. Check the record count on the blocks or the report barometer. The options chosen above let you follow the execution and view the data anomaly report, should the new dataset fail the data quality test.
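The Batch append and Deduplicate steps above behave roughly like the sketch below: stack every file found in the upload folder, then drop exact duplicate rows so that an accidentally re-uploaded file does not inflate the record count. This is a conceptual pandas illustration with hypothetical file and column names, not Omniscope's actual blocks, which are configured in the UI rather than in code.

```python
import pandas as pd
from pathlib import Path

def batch_append(folder: str) -> pd.DataFrame:
    """Read every CSV in the upload folder and stack the rows (Batch append),
    then drop duplicate rows (Deduplicate) as a safeguard against the same
    dataset being uploaded more than once."""
    frames = [pd.read_csv(p) for p in sorted(Path(folder).glob("*.csv"))]
    combined = pd.concat(frames, ignore_index=True)
    return combined.drop_duplicates(ignore_index=True)
```

With this safeguard in place, uploading the same file twice leaves the appended record count unchanged, which is exactly what the record-count check in the test step should confirm.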
Note: this is just one suggestion for how to build the upload data model. You can create variations, but you will still need to keep some basics, such as the parameter link and the file path location.
Important: this setup relies on the Scheduler application executing the workflow behind the scenes, and therefore requires a Business or Enterprise licence.