This article explains how to import and export master items to and from a Qlik Sense app using the Microsoft Excel connector in Qlik Application Automation.
Content:
The first part of this article will explain how to export all of your master items configured in your Qlik Sense App to a Microsoft Excel sheet. The second part will explain how to import those master items from the Microsoft Excel sheet back to a Qlik Sense App.
For this, you will need a Qlik Sense app in your tenant that contains the measures, dimensions, and variables you want to export, as well as an empty Microsoft Excel file. The image below shows a basic example of exporting master items.
The following steps will guide you through recreating the above automation:
An export of the above automation can be found at the end of this article as Export master items to a Microsoft Excel sheet.json
For this example, you'll first need a Microsoft Excel file with sheets configured for each master item type (dimensions, measures, and variables). Use the above example to generate this file. The image below shows a basic example of importing master items from Microsoft Excel to a Qlik Sense app.
An export of the above automation can be found at the end of this article as Import master items from a Microsoft Excel Sheet.json
Follow the same steps to build automations that import/export dimensions and variables.
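Conceptually, the export automation is just routing each master item into the sheet that matches its type. The sketch below illustrates that grouping in Python; the item shapes and field names are hypothetical simplifications of what the Qlik Sense connector blocks actually return.

```python
# Illustrative sketch only: group master items by type so each list can be
# written to its own Excel sheet (dimensions, measures, variables).
# The dict shape here is a hypothetical simplification.
def group_master_items(items):
    """Split a mixed list of master items into per-type row lists."""
    sheets = {"dimensions": [], "measures": [], "variables": []}
    for item in items:
        sheets[item["type"]].append([item["name"], item["definition"]])
    return sheets

rows = group_master_items([
    {"type": "measures", "name": "Total Sales", "definition": "Sum(Sales)"},
    {"type": "dimensions", "name": "Region", "definition": "[Region]"},
])
```

Each per-type list would then feed an Add Row-style block targeting the matching Excel sheet.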
Let's go over some edge cases when exporting information to Microsoft Excel:
Check the following articles for more information about working with master items in Qlik Application Automation and about uploading data to Microsoft Excel.
Follow the steps provided in this article How to import & export automations to import the automation from the shared JSON file.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
In my recent Getting SaaSsy with DataRobot post, I documented how to use the DataRobot Analytics Connector from within your Qlik Sense applications. I know this will sound crazy, but what if you want to make predictions on data you aren't loading into your application? Maybe you are collecting input parameters in your application from end users to play what-if games. Maybe you will record the predictions, but you also want to take immediate action based on their values (i.e., prescriptive analytics). Well, those things sound like perfect use cases for Qlik Application Automation.
It gets better, my friends. Whether you are using a dedicated on-premises DataRobot server, a dedicated tenant, or you are on the leading-edge path with DataRobot's shiny new AI Cloud Manager using Paxata, Qlik Application Automation has you covered, and so do I.
In this post, I will help you identify the right DataRobot Connector Block to use for your path, help you understand how to execute predictions, and help you understand what to do with the output from the predictions.
You have already chosen your DataRobot path; now it's just a matter of choosing the correct block from the DataRobot Connector. You probably would have guessed from the elaborate way I described the DataRobot choices which Qlik Application Automation block goes to which environment. But to be sure ... If you have a dedicated DataRobot server, or you have a dedicated tenant, you should use the List Predictions from Dedicated Prediction API block. If you are using the DataRobot AI Cloud environment with Paxata, you should use the List Predictions block.
Oh no! What's that you are saying? You weren't told which path your organization chose, you were just given credentials and you just log in. Don't sweat it. I can help you with that. Just go to your Deployments within DataRobot, choose the Deployment you are going to execute predictions against, and choose Predictions, Prediction API and Real-time. DataRobot will provide all of the clues we need to choose the right block.
If the API URL contains app2.datarobot.com like in the first image below, you are working with their AI Cloud and will need to use the List Predictions block. However, if you see a dedicated path in your API URL, such as qlik.orm.datarobot.com (second image), you will need to use the List Predictions from Dedicated Prediction API block.
There are some other clues above as well. Notice that in the first image the rest of the URL path contains api/v2/deployments, while the second image contains predApi/v1.0/deployments. It's basically DataRobot telling you which of their APIs you need to utilize.
So how will that help you know for sure? One of the things that many people seldom look at with Qlik Application Automation blocks is the Description. Simply drag either (or both) of the blocks onto your canvas and scroll all the way down in the right panel. If you look at the List Predictions from Dedicated Prediction API description, you will see the following, and notice it clearly indicates predApi/v1.0/deployments.
However, if you press the Show API endpoint link for the List Predictions block it will look like this. Both are dead giveaways as to the block/path you should choose.
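The URL clue can be boiled down to a tiny helper. This is a sketch only: the URL patterns come from the deployment details described above, and the returned strings are simply the block names to look for in the connector.

```python
def pick_prediction_block(api_url: str) -> str:
    """Suggest which Qlik Application Automation block matches a DataRobot API URL.

    Heuristic based on the path clues described above:
    - dedicated environments expose predApi/v1.0 paths
    - AI Cloud exposes api/v2 paths
    """
    if "predapi/" in api_url.lower():
        return "List Predictions from Dedicated Prediction API"
    return "List Predictions"
```

For example, a URL under app2.datarobot.com with an api/v2 path maps to the List Predictions block.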
Regardless of which block you are using, you will first need to create a Connection for Qlik Application Automation to your DataRobot environment. If you have a Dedicated server your Connection details will look like this. Notice that you will need to copy the api_key value right from the DataRobot Deployment Details:
Your DataRobot AI Cloud connection will look similar and again you would need to copy your api_key from the deployment details. My DataRobot AI Cloud is just a "trial", hence I chose that region, while my dedicated tenant (above) is the US. The biggest difference is in the domain. Dedicated connections will be app.datarobot.com and AI Cloud connections will be app2.datarobot.com:
Once you test/save the connections, we are ready to start making predictions. The List Predictions block is the easiest to set up, so we will start with that one. Simply click the drop-down in the Deployment Id field, press "Do Lookup", and then choose the specific deployment model you are going to be making predictions against.
Then you simply provide the input data you want the deployment to make predictions for. More about the Prediction Data later, but for now notice that I've simply hard-coded a JSON string with field/value pairs:
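As a sketch of that shape: the Prediction Data is a JSON array with one field/value object per row you want scored. The field names below are hypothetical examples, not the actual model's features.

```python
import json

# Sketch of the Prediction Data format: a JSON array of field/value objects,
# one object per row to score. Field names here are made-up examples.
rows = [
    {"age": 63, "gender": "F", "num_lab_procedures": 41},
    {"age": 52, "gender": "M", "num_lab_procedures": 18},
]
prediction_data = json.dumps(rows)
```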
The List Predictions from Dedicated Prediction API block requires a few more inputs. The first is the Dedicated Prediction Url. Good thing I had you bring up your deployment details, because we will just copy it. Notice you do not need the https:// or the /predApi.. text, just the actual URL information.
Next, we again simply click the drop-down for the Deployment ID, click Do Lookup, and then choose our desired deployment.
Next, we copy the DataRobot Key from our deployment details, and then we can insert our JSON block. Again, more later about that, so don't panic in thinking I'm suggesting that you hand-code the values you want to predict. It's just to make this section easier to navigate. 😁
Qlik Application Automation provides 4 additional parameters that are part of the DataRobot API specification. You can define the Passthrough Columns, Passthrough Columns Set, turn on Prediction Warnings and set the Decimals Number format.
You can refer directly to the DataRobot API documentation for all of the details you wish. For instance, notice that I have the Prediction Warning Enabled set to "true." Getting warnings sounded like a good idea. But alas, I ended up with an error.
Well, it turns out that in order to utilize the Prediction Warning Enabled there is work that must be done on our Deployment within DataRobot.
I guess I could have saved myself the trouble had I read the documentation. Oh well, I simply changed my default back to false so that the prediction can run.
Above I simply demonstrated the JSON format you need for your Prediction Data with hard-coded values. I've used my DataRobots and have predicted with a 99.99999999% confidence level that your goal in reading this isn't to hardcode 50+ input values each time you want a prediction. Instead, that data will come from somewhere else. Which is perfectly ok. Maybe you will be pulling the values from some other system as part of a workflow: when event A triggers this Qlik Application Automation, you will go do B and C, and then assign the output from those things to variables that you will use as the Prediction Data. That's a great plan ... simply choose your variables and use them where you need them in your Prediction Data. Notice I have already assigned the MasterPatientID variable and am in the process of choosing the race variable below.
I'm so sorry. You don't want to use variables, and you weren't doing A, B, and C; you were actually firing a SQL query live based on input to your workflow, and you wanted to use the data from the SQL query. That is brilliant. Pulling the live data when whatever event you have chosen triggers the automation. You should write some posts. No problem, Qlik Application Automation will absolutely allow you to do that.
Or perhaps you are using a writeback solution, like Inphinity Forms, within a Qlik Sense application to capture input parameters and you wish to use those values. Do that.
Or perhaps you are ... You get the point. The Prediction Data simply needs to be a JSON block containing the field/value pairs. How you construct it, or read it from an S3 bucket, or pull it out of thin air doesn't matter. Which is the beauty of working with DataRobot within Qlik Application Automation.
Woohoo, you now have a block that will execute a deployment in your DataRobot environment, regardless of which kind, and we are now ready for those wonderful predictions. Perhaps the first thing you noticed about the blocks List Predictions and List Predictions from Dedicated Prediction API was that they start with List as opposed to Get. That's of course because you may be passing a single row of data as Prediction Data, or you could be passing many items in the JSON block. So these blocks are handled as lists, even if it is just a list of 1 prediction.
The DataRobot Connector for loading data into our applications simply returns the Prediction value, which is 0 in this case (the patient is not predicted to be readmitted). However, notice below that within Qlik Application Automation either prediction block will return the Prediction as well, but it will also return a list of the possible Prediction values and the scores for each. In my case, the 1 (likely to be readmitted) was scored at 0.0428973004, whereas the 0, the Prediction, was scored at 0.9571026996.
Who cares?
Well, maybe you do. As I started this post I mentioned that perhaps we want to take action(s) based on the predictions which might be why you are making the prediction in your Qlik Application Automation workflow instead of just making the prediction in a Qlik Sense Application. If we are writing a flow that is "prescriptive" we might want to check the values. Ooooh .49999999999 vs .500000000001. Maybe that will be Action A, just email someone. While .000000001 vs .999999999 tells us that it's safe to go ahead and take the really expensive Action Z. So we might want to set up a Conditional expression.
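That conditional routing can be sketched in a few lines. The thresholds and action names below are hypothetical, purely to illustrate branching on how decisive the score is.

```python
def choose_action(score_for_positive: float) -> str:
    """Illustrative prescriptive routing based on how decisive the score is.

    Thresholds and action names are made up for demonstration.
    """
    if score_for_positive >= 0.999:
        # Near-certain prediction: safe to take the expensive action.
        return "take expensive Action Z"
    if score_for_positive >= 0.5:
        # Barely over the line: take the cheap action instead.
        return "email someone (Action A)"
    return "no action"
```

In the automation itself, this would be a Condition block comparing the score value picked from the prediction block's output.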
Regardless of what we do with the values, Qlik Application Automation allows you to simply choose the values right from the block, just like it allowed you to choose Variables or data from another source.
If you don't already know me I will bring you up to speed quickly. I have very defined boundaries and am really particular about how things are worded. For example, take the phrase Data Science. Well, Science is explainable. Therefore, if something isn't explainable it isn't science. And if your predictions aren't explainable, then that isn't Data Science, it's just Data. One of the key reasons you are likely using DataRobot is the fact that it can so wonderfully return explanations for its predictions.
The Prediction of 0 above is nice. But knowing what factors led to the prediction may be just as valuable when helping us choose our prescriptive actions. Well, my friends, Qlik Application Automation has you covered for that scenario as well. In fact, you can see from the following image the block is literally raising its hand and begging you to choose it. List Prediction Explanation from Dedicated Prediction API will give you not only the Prediction, and the Prediction values but it will also return the explanations to you.
Wait, something must be wrong. I see a qualitativeStrength of +++, but the second is --. What do those mean? Oh yeah, now I remember ... Qlik Application Automation is just calling the provided DataRobot APIs, so I might as well check the documentation from DataRobot so that I get a full and complete understanding of the input parameters I can choose for the block and understand the output values. Sure enough, it's covered.
https://docs.datarobot.com/en/docs/api/reference/predapi/pred-ref/dep-predex.html
I see you out there on the leading edge doing Time Series Predictions in DataRobot. Not an issue, Qlik Application Automation has you covered with a block as well. Simply choose the List Time Series Predictions from Dedicated Prediction API and you will be good to go.
The initial inputs needed are already covered above. However, there are a few additional parameters you will need to input as well.
Of course, DataRobot has you covered with complete documentation at https://docs.datarobot.com/en/docs/api/reference/predapi/pred-ref/time-pred.html
Getting SaaSsy with DataRobot
Lions and Tigers and Reading and Writing Oh My
This article explains how the Amazon SNS connector in Qlik Application Automation can be used to set up webhooks that trigger when an object creation event occurs in Amazon S3. This connector only has webhooks available.
Content:
Search for the "Amazon SNS" connector in Qlik Application Automation. When you click connect, you will be prompted for the following input parameters:
You must obtain the AWS Access Key from IAM in your AWS console: go to the IAM section in AWS and choose Users in the left-side panel.
Here you can either choose an existing user or create a new one by clicking the "Add users" button in the top right.
When you create a new user, you must provide a user name and click Next. You do not need to give this user access to the AWS console. In the next step, you will assign permissions to this IAM user.
The following policy needs to be created and attached to the IAM user; replace account-id with your account ID:
Other permissions that are suggested to add are:
Furthermore, the IAM user must be made an owner of an S3 bucket when creating a notification configuration.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutBucketNotification",
        "iam:PassRole",
        "sns:Publish",
        "sns:CreateTopic",
        "sns:Subscribe"
      ],
      "Resource": [
        "arn:aws:s3:::*",
        "arn:aws:iam::account-id:role/*",
        "arn:aws:sns:*:account-id:*"
      ]
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": "sns:Unsubscribe",
      "Resource": "*"
    }
  ]
}
You will have to create an access key for the IAM user.
This can be done in the (a) Users menu and in the (b) Security credentials tab. Click (c) Create access key.
Choose Third-party service, confirm that you understand the above recommendation, and click Next:
You will now have your access key and secret key and can finish creating the datasource in Qlik Application Automation:
You can use this in an automation, but only as a webhook. When you create a new automation, you will be presented with a blank canvas. Select the Start block and change the run mode to webhook.
Choose an event type next. These are currently limited to S3 object creation events. Lookup capabilities are available for other parameters, such as bucket and topic selection:
After saving the automation, you can test the webhook by uploading objects to your S3 bucket and confirming in your automation run history that the automation is triggered.
You can now trigger tasks after an object is uploaded to S3. Common tasks are reloading a Qlik Sense app or triggering a data pipeline in any of our other connectors:
This article describes how to resolve the NPrinting connection verification error:
x Qlik NPrinting webrenderer can reach Qlik Sense hub
This article provides an overview of how to manage users using Qlik Application Automation. This approach can be useful when migrating from QlikView, or Qlik Sense Client Managed, to Qlik Sense Cloud when security concerns prevent the usage of Qlik-CLI and PowerShell scripting.
You will find an automation attached to this article that works with the Microsoft Excel connector. More information on importing automations can be found here.
Content
In this example, we use a Microsoft Excel file as the source for managing users. A sheet (for example, Users) must be added, and its name must also be provided as input when running the automation. The sheet must contain these headers: userId, Name, Subject, Email, Roles, Licence, and Flag.
Example of sheet configuration:
If users are to be created, the Flag column must be set to create. If users are to be deleted, there's no need to include roles, but the Flag column must be set to delete.
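The Flag-based branching can be sketched as follows. The headers match the sheet described above; the row values are hypothetical, and in the automation this split would be done with Condition blocks inside a loop.

```python
# Sketch: split the sheet rows by the Flag column to decide which users to
# create and which to delete. Headers match the sheet described above.
HEADERS = ["userId", "Name", "Subject", "Email", "Roles", "Licence", "Flag"]

def split_by_flag(rows):
    """Return (to_create, to_delete) based on each row's Flag value."""
    as_dicts = [dict(zip(HEADERS, r)) for r in rows]
    to_create = [r for r in as_dicts if r["Flag"] == "create"]
    to_delete = [r for r in as_dicts if r["Flag"] == "delete"]
    return to_create, to_delete

to_create, to_delete = split_by_flag([
    ["", "Ann", "subj1", "ann@example.com", "Developer", "professional", "create"],
    ["u2", "Bob", "subj2", "bob@example.com", "", "", "delete"],
])
```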
Add the List Rows With Headers block from the Microsoft Excel connector to read the values that have been configured in the Excel sheet.
When running the automation, you must provide input, including the name of the worksheet to read data from. You also need to specify the first and last cell to read data from, as well as whether users are to be created or deleted. Example:
| Input | Value |
| --- | --- |
| Worksheet Name | Users |
| Excel Start Cell | A1 |
| Excel End Cell | G5 |
| Mode | Create |
How to manage space membership (users)
Installing, upgrading, and managing the Qlik Cloud Monitoring Apps has just gotten a whole lot easier! With two new Qlik Application Automation templates coupled with Qlik Data Alerts, you can now:
The above allows you to deploy the monitoring apps to your tenant with a hands-off approach. Dive into the individual components below.
Content:
This automation template is a fully guided installer/updater for the Qlik Cloud Monitoring Applications, including but not limited to the App Analyzer, Entitlement Analyzer, Reload Analyzer, and Access Evaluator applications. Leverage this automation template to quickly and easily install and update these applications (or a subset of them) with all their dependencies. The applications themselves are community-supported; they are provided through Qlik's open-source software GitHub and are thus subject to Qlik's open-source guidelines and policies.
For more information, refer to the GitHub repository.
Note that if the monitoring applications have been installed manually (i.e., not through this automation), they will not be detected as existing, and the automation will install new copies side-by-side. Any subsequent executions of the automation will detect the newly installed monitoring applications, check their versions, etc. This is because the applications are tagged with "QCMA - {appName}" and "QCMA - {version}" during installation through the automation; manually installed applications will not have these tags and therefore will not be detected.
This template is intended to be used alongside the Qlik Cloud Monitoring Apps for user-based subscriptions template. This automation provides the ability to keep the API key and associated data connection used for the Qlik Cloud Monitoring Apps up to date on a scheduled basis. Simply input the space Id where the monitoring_apps_REST data connection should reside, and the automation will recreate both the API key and data connection regularly. Ensure that the cadence of the automation’s schedule is less than the expiry of the API key.
Enter the Id of the space where the monitoring_apps_REST data connection should reside.
Ensure that this automation is run off-hours from your scheduled monitoring application reloads so it does not disrupt the reload process.
Each Qlik Cloud Monitoring App has the following two variables:
With these variables, we can create a new Qlik Data Alert on a per-app basis. For each monitoring app that you want to be notified on if it falls out of date:
Here is an example of an alert received for the App Analyzer, showing that at this point in time, the latest version of the application is 5.1.3 and that the app is out of date:
Q: Can I re-run the installer to check if any of the monitoring applications are able to be upgraded to a later version?
A: Yes. Run the installer, select which applications should be checked and select the space that they reside in. If any of the selected applications are not installed or are upgradeable, a prompt will appear to continue to install/upgrade for the relevant applications.
Q: What if multiple people install monitoring applications in different spaces?
A: The template scopes the applications install process to a “target” space, i.e., a shared space (if not published) or a managed space. It will scope the API key name to `QCMA – {spaceId}` of that target space. This allows the template to install/update the monitoring applications across spaces and across users. If one user installs an application to “Space A” and then another user installs a different monitoring application to “Space A”, the template will see that a data connection and associated API key (in this case from another user) exists for that space already and it will install the application leveraging those pre-existing assets.
Q: What if a new monitoring application is released? Will the template provide the ability to install that application as well?
A: Yes. The template receives the list of applications dynamically from GitHub. If a new monitoring application is released, it will become available immediately through the template.
Q: I would like to be notified whenever a new version of a monitoring application is released. Can this template do that?
A: As per the article above, the automation templates are not responsible for notifications of whether the applications are out of date. This is achieved using Qlik Alerting on a per-application basis as described in Part 3.
Q: I have updated my application, but I noticed that it did not preserve the history. Why is that?
A: The history is preserved in the prior versions of the application's QVDs, so the data is never deleted and can be loaded into the older version. Each upgrade generates a new set of QVDs, as the data models for the applications sometimes change due to bug fixes, updates, new features, etc. If you want to preserve the history when updating, the application can be upgraded with the "Publish side-by-side" method so that the older version of the application remains as an archival application. Note, however, that the Qlik Alert (from Part 3) will need to be recreated, and any community content created on the older application will not be transferred to the new application.
It is possible to export the list of tenant users to a .json file using the "user ls" command from the Qlik Command Line Interface (qlik-cli).
The scripts provided in this article are provided as-is and are for guidance only.
As a tenant admin, download and configure qlik-cli, then run:
qlik user ls --limit 1000 > tenantusers.json
[
{
"assignedGroups": [],
"assignedRoles": [
{
"id": "608050f7634644db3678b1a2",
"level": "user",
"name": "Developer",
"type": "default"
},
{
"id": "608050f7634644db3678b17f",
"level": "admin",
"name": "TenantAdmin",
"type": "default"
},
{
"id": "605a1c2151382ffc836af862",
"level": "user",
"name": "SharedSpaceCreator",
"type": "default"
},
{
"id": "605a1c2151382ffc836af866",
"level": "user",
"name": "ManagedSpaceCreator",
"type": "default"
},
{
"id": "605a1c2151382ffc836af86b",
"level": "user",
"name": "DataSpaceCreator",
"type": "default"
},
{
"id": "605a1c2151382ffc836af85d",
"level": "admin",
"name": "AnalyticsAdmin",
"type": "default"
},
{
"id": "605a1c2151382ffc836af85f",
"level": "admin",
"name": "DataAdmin",
"type": "default"
},
{
"id": "63580b8d5cf9728f19217be0",
"level": "user",
"name": "PrivateAnalyticsContentCreator",
"type": "default"
},
{
"id": "6356f0425cf9728f1962b942",
"level": "user",
"name": "DataServicesContributor",
"type": "default"
}
],
"created": "2020-05-18T09:38:29.214Z",
"createdAt": "2020-05-18T09:38:29.214Z",
"email": "martina.testoni@dkdaklaldkdaklladaaddddl.com",
"id": "USERID1",
"lastUpdated": "2023-04-04T07:32:00.756Z",
"lastUpdatedAt": "2023-04-04T07:32:00.756Z",
"name": "Martina Testoni",
"picture": "https://s.gravatar.com/avatar/gravatarimage=pg\u0026d=https%3A%2F%2Fcdn.auth0.com%2Favatars%2Fdp.png",
"preferredLocale": "",
"preferredZoneinfo": "Europe/Copenhagen",
"roles": [
"Developer",
"TenantAdmin",
"SharedSpaceCreator",
"ManagedSpaceCreator",
"DataSpaceCreator",
"AnalyticsAdmin",
"DataAdmin",
"PrivateAnalyticsContentCreator",
"DataServicesContributor"
],
"status": "active",
"subject": "auth0|SUBJECTID2",
"tenantId": "TENANTID"
},
{
"assignedGroups": [],
"assignedRoles": [
{
"id": "608050f7634644db3678b17f",
"level": "admin",
"name": "TenantAdmin",
"type": "default"
},
{
"id": "605a1c2151382ffc836af86b",
"level": "user",
"name": "DataSpaceCreator",
"type": "default"
},
{
"id": "608050f7634644db3678b1a2",
"level": "user",
"name": "Developer",
"type": "default"
},
{
"id": "605a1c2151382ffc836af866",
"level": "user",
"name": "ManagedSpaceCreator",
"type": "default"
},
{
"id": "63580b8d5cf9728f19217be0",
"level": "user",
"name": "PrivateAnalyticsContentCreator",
"type": "default"
},
{
"id": "605a1c2151382ffc836af862",
"level": "user",
"name": "SharedSpaceCreator",
"type": "default"
},
{
"id": "6356f0425cf9728f1962b95c",
"level": "user",
"name": "Steward",
"type": "default"
},
{
"id": "605a1c2151382ffc836af85d",
"level": "admin",
"name": "AnalyticsAdmin",
"type": "default"
},
{
"id": "62bb165356d1879582c1b468",
"level": "admin",
"name": "AuditAdmin",
"type": "default"
},
{
"id": "605a1c2151382ffc836af85f",
"level": "admin",
"name": "DataAdmin",
"type": "default"
}
],
"created": "2023-03-31T08:44:37.332Z",
"createdAt": "2023-03-31T08:44:37.332Z",
"email": "Gentile.Faccenda@dkdaklaldkdaklladaaddddl.com",
"id": "USERID2",
"lastUpdated": "2023-04-03T11:24:35.037Z",
"lastUpdatedAt": "2023-04-03T11:24:35.037Z",
"name": "Gentile Faccenda",
"picture": "https://s.gravatar.com/avatar/randomurl=https%3A%2F%2Fcdn.auth0.com%2Favatars%2Fdp.png",
"roles": [
"TenantAdmin",
"DataSpaceCreator",
"Developer",
"ManagedSpaceCreator",
"PrivateAnalyticsContentCreator",
"SharedSpaceCreator",
"Steward",
"AnalyticsAdmin",
"AuditAdmin",
"DataAdmin"
],
"status": "active",
"subject": "auth0|IDPSUBJECT2",
"tenantId": "TENANTID"
}
]
To export the list as a CSV file instead, pipe the output through PowerShell:

qlik user ls --limit 1000 | ConvertFrom-Json | ConvertTo-Csv > tenantusers.csv
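ConvertTo-Csv flattens nested fields (such as assignedRoles) poorly, so you may prefer to flatten the JSON output yourself. A sketch in Python, keeping only a few columns (the column choice is arbitrary):

```python
import csv
import io
import json

def users_to_csv(raw_json: str) -> str:
    """Flatten qlik-cli `user ls` JSON into a CSV with a few chosen columns."""
    users = json.loads(raw_json)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["name", "email", "status", "roles"])
    for u in users:
        # Join the roles list into a single cell.
        writer.writerow([u.get("name"), u.get("email"), u.get("status"),
                         ";".join(u.get("roles", []))])
    return buf.getvalue()
```

Run it on the tenantusers.json file produced by the command above.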
Note: We recently (May 25th) released a new version of the Snowflake connector. If you had automations using Snowflake prior to that date, the connector will show as Snowflake - deprecated. To use the new version, simply replace those blocks with blocks from the current Snowflake connector.
This article gives an overview of the available blocks in the Snowflake connector in Qlik Application Automation. It will also go over some basic examples of retrieving data from a Snowflake database and creating a record in a database.
This connector has the following blocks:
To create a new connection to Snowflake, the following parameters are required:
The Do Query block can be used to perform actions in Snowflake that aren't supported by the other blocks. See the below example on creating a new table.
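As a sketch of the kind of statement you might paste into the Do Query block, here is hypothetical DDL composed as a string (table and column names are made up):

```python
# Sketch: SQL you might paste into the Do Query block to create a table.
# Table and column names are hypothetical.
create_table_sql = """
CREATE TABLE IF NOT EXISTS orders (
    order_id INTEGER,
    customer VARCHAR(100),
    amount NUMBER(10, 2)
)
""".strip()
```

In the automation, only the SQL text itself goes into the block's query input.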
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
This article explains how a list can be used as values for the Add Selection To Report and Add Selection To Sheet blocks in the Qlik Reporting connector in Qlik Application Automation.
You might have noticed that the Values input field in these blocks only allows you to specify values one by one. But in some scenarios, you'll want to specify a list of field values instead of adding them one by one.
If you're new to reporting, please read our Reporting tutorial first.
The source of these values can either be the List Values Of Field block from the Qlik Cloud Services connector or any List ... block from a 3rd party storage tool like Microsoft Excel. In this example, we'll use the List Values Of Field block.
The example automation used in this article looks like this:
And this is what the example output of the List Values Of Field block looks like:
In this case, the qText parameter is required as the value to make selections. Go to the Add Selection To Report block, make sure to specify the same field name as the one used in the List Values Of Field block, and enable the "Raw input" mode:
Remove the square brackets from the input field and click it to select the "Output from List Values Of Field" as the input for the Values input field:
This takes you to the output of the List Values Of Field block; click the qText parameter, then choose "Select all qText(s) from list ListValuesOfField" on the next screen.
That's it! When the automation now runs, a list of strings is mapped as the value for this selection. You can verify this by toggling the view mode in the automation's chronological output view:
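Conceptually, "Select all qText(s) from list" maps the block's list output down to a flat list of strings, as this sketch shows (the sample output shape is illustrative):

```python
# Sketch of what "Select all qText(s) from list" does: map the list output
# of List Values Of Field to a flat list of strings for the Values input.
list_values_output = [
    {"qText": "Germany", "qNum": "NaN", "qState": "O"},
    {"qText": "Spain", "qNum": "NaN", "qState": "O"},
]
selection_values = [row["qText"] for row in list_values_output]
```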
If you want to use multiple selections for this report, add additional Add Selection To Report or Add Selection To Sheet blocks.
Currently, in Qlik Application Automation it is not possible to export more than 100,000 cells using the Get Straight Table Data block.
Content:
To overcome this limit, the workaround is to export records in batches from the Qlik Sense straight table to the cloud storage platform of your choice. The prerequisite is to have a unique numerical field in your dataset. If you don't have one, you can add it using the RowNo() function in the load script as shown below; this numbers the rows in the dataset.
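The batching arithmetic is simple: divide the 100,000-cell limit by the number of columns to get the rows per batch, then select consecutive ranges of the unique row-number field. A sketch (the limit is from this article; the loop itself would be built with automation blocks):

```python
# Sketch: compute row batches that stay under the 100,000-cell export limit,
# given the number of columns in the straight table. The unique RowNo()
# field lets each batch select its own row range.
CELL_LIMIT = 100_000

def batch_ranges(total_rows: int, num_columns: int):
    """Yield inclusive (start_row, end_row) pairs under the cell limit."""
    rows_per_batch = max(1, CELL_LIMIT // num_columns)
    start = 1
    while start <= total_rows:
        end = min(start + rows_per_batch - 1, total_rows)
        yield (start, end)
        start = end + 1
```

For a 10-column table with 250,000 rows, this yields 25 batches of 10,000 rows each.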
In this example, we will export data from the Qlik Sense straight table to Dropbox as a CSV file.
You can also find an exported version of this automation and application attached to this article. More information on importing automations can be found here.
With the Salesforce Jobs API, you can insert, update, upsert, or delete large data sets. Prepare a comma-separated values (CSV) file representation of the data you want to upload, create a job, upload the job data, and let Qlik Application Automation handle these steps with the Salesforce API.
Here are the steps to use the Upload Jobs APIs:
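As a sketch of the first step, preparing the CSV representation of your records, assuming hypothetical Contact fields (the job creation and upload themselves are handled by the automation blocks):

```python
import csv
import io

# Sketch: build the CSV body for a Salesforce job upload.
# The field names are hypothetical examples for a Contact-style object.
records = [
    {"FirstName": "Ada", "LastName": "Lovelace", "Email": "ada@example.com"},
    {"FirstName": "Alan", "LastName": "Turing", "Email": "alan@example.com"},
]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["FirstName", "LastName", "Email"])
writer.writeheader()
writer.writerows(records)
csv_body = buf.getvalue()
```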
This session addresses:
- Understanding the new SaaS capability
- How to get started
- Troubleshooting common issues
00:00 - Intro
01:08 - What is Qlik Reporting Service
01:45 - Difference from NPrinting
04:28 - 1st: Sheet Size should match Paper Size
05:32 - 2nd: Reports based on Public Sheets
06:18 - Creating an Automation from Templates
07:30 - Creating Report Automation from scratch
10:18 - Previewing the Report File Size
11:02 - Troubleshooting automation workflow
11:50 - How to skip a block
12:25 - Reviewing Automation logs in QMC Catalog
13:02 - Identifying and correcting errors
13:43 - Information to create a Support Case
14:34 - Limitations
15:18 - QnA: Can you see the file size of a report?
15:56 - QnA: Can you see how many times a report is downloaded?
16:18 - QnA: How does this compare to NPrinting?
16:59 - QnA: Does this work with QlikView?
17:08 - QnA: Can any app be used to generate a report?
17:32 - QnA: Can Reporting be added to a button?
18:12 - QnA: How can you preview the report?
18:55 - QnA: Email attachment or file storage for download?
19:31 - QnA: Is it available for QSEoW?
19:57 - QnA: Rights to generate reports?
20:10 - QnA: Possible to use different report colors than in App?
20:34 - QnA: Possible to adjust font size or resolution?
Resources:
Triggering an automation from a button on a sheet
Help page documentation - Limitations
Qlik Application Automation Info Checklist
Q&A:
Q: How does this compare to NPrinting?
A: NPrinting is a fully developed, on-premises product for QlikView and Qlik Sense, which offers a wider range of report formats beyond PDF.
Q: How much of NPrinting functionality does Reporting Services cover and how is the roadmap?
A: Qlik Reporting Service is not a replacement of NPrinting; it is a new reporting service in Qlik Cloud. More features will be added. We have a dedicated forum: Reporting Service
If you have any ideas to improve the service, please submit them as a feature request: Ideas
Q: How to send out emails with Excel reports? pdf is fine.
A: The current format option is pdf only.
Q: Is this available on Qlik Forts?
A: No, Qlik Reporting Service cannot access data stored in Qlik Forts.
Q: Can we deliver a report as a spreadsheet? If not, is that a feature for the future?
A: You can accomplish it by using Qlik Application Automation: Using Qlik Application Automation to create and distribute Excel reports in Office 365
Q: Does this apply to QlikView, as well?
A: QlikView is not supported.
Q: Does it matter how many people are distributed to in an automation "run"?
A: There is no limit on the number of report recipients, but automation jobs have limitations, such as the maximum duration of execution. Qlik Application Automation limitations
Q: If we purchase the additional license, will there be another limitation?
A: That depends on the license. For further information, please contact your account manager.
Q: Does QRS use our organization's SMTP server? If so, is the send rate customizable? One of the issues with NPrinting is that the send rate exceeds what Office 365 allows.
A: Any SMTP server should work.
Q: Isn't it possible to combine content from different applications/tabs?
A: A report is generated from a single app.
Q: Is report automation intended for normal users of the application? If so, what security rules are needed to enable this capability?
A: You can only make a report automation with Qlik Sense apps you have permission to access.
Q: Can't the report filter data for each different user that is going to receive it?
A: Yes, you can. Please watch this demo: Did you say Report Bursting? Show me more!
Q: Could you please let me know how can we migrate our Qlik Sense apps from Qlik Sense Enterprise on Windows as well as QlikView apps to Qlik Sense SAAS?
A: Please refer to our help site: Moving from client-managed Qlik Sense to Qlik Sense SaaS
Q: Will the generated reports look exactly like the sheet export to pdf? What happens to elements like tables that don't fit the screen? As far as I remember, they are simply cut in the pdf extract but can flow over to a next page in NPrinting.
A: Please leverage Manual download in PDF. The preview shows you a final outcome in advance.
Q: What about the 100 reports mentioned on the price list?
A: The limitations are listed here: Qlik Reporting Service specifications and limitations
Limitations were updated on the 4th of April, 2024. See Reporting Service Packaging Changes.
Q: Is it possible to add filters based on the different user when distributing the report, like NPrinting, Example John Doe only wants to see Country US, but Jane Doe want to see US & Canada?
A: Yes, you can. Please watch this demo: Did you say Report Bursting? Show me more!
Q: What NPrinting functions cannot be done in Qlik Reporting? My question is: why does the client usually have to evaluate what functionality they will no longer have when choosing to change from NPrinting to Qlik Reporting?
A: Qlik Reporting Service is part of the Qlik Application Automation connector, which facilitates report distribution tasks, while NPrinting is a separate on-premises product with more mature features. That said, more features are on the roadmap.
Q: Where can you send the files to?
A: You can send a report as an attachment or to a cloud storage location using the following connectors: Amazon S3, Dropbox, FTP, Google Cloud Storage, SFTP.
Q: Hi, about launching an automation from a Qlik Sense app: which rights are needed so users can launch this automation or report?
A: You don't need a specific right unless your tenant admin disables Application Automation.
Q: Is there a limit of the size of the PDF that can be generated?
A: There is no file size limit, but there are other factors you may want to consider. Please see the limitations: Qlik Reporting Service specifications and limitations
Q: When will other file formats (for example, PowerPoint) be supported?
A: They are on the roadmap, so please stay tuned.
Q: Can you use only complete sheets or also specific diagrams?
A: Yes, as long as they are on a public sheet. You may want to consider the size of the sheet; otherwise, it may not fill the report well.
Q: Can you repeat how many reports we can send for free?
A: Number of reports per tenant per day: 10,000.
Q: Can you "print" a whole Table? Or only the first 20 Rows, which are seen?
A: You can make a selection in a block and filter it. Add a selection to the report
Q: Hi, you talk about 5000 free runs : it is about automation or report number ?
A: The limitations are listed here: Qlik Reporting Service specifications and limitations
Limitations were updated on the 4th of April, 2024. See Reporting Service Packaging Changes.
Q: Can you use filter?
A: Yes. Creating a Qlik Reporting Service report
Q: Are there plans to be able to send Excel format as the report?
A: Application Automation can distribute data in a Excel format: Using Qlik Application Automation to create and distribute Excel reports in Office 365
Q: Can you add the company graphical profile?
A: If you can add it as an image on a sheet, then yes.
Q: In the future will be possible to generate other report formats such as PowerPoint?
A: If you have an idea, please submit it in our Ideation: Ideas
Q: Can you include 3rd party chart types (e.g., Vizlib charts) in Qlik Reporting Service?
A: Yes, what you can see in the sheet will be in a report.
Q: If you need native reports as PPT you should use NP?
A: Yes, Qlik Reporting Service only generates a report in PDF.
Q: Do you have central place to monitor all automation/reporting tasks? To see errors, last execution time or status?
A: A tenant admin is able to see all automations listed in the Management Console.
Q: Any plans on bringing this to Qlik Sense Enterprise on Windows?
A: No, this is a feature in Qlik Cloud.
Q: Please forgive if this has already been covered as I joined late, can we select landscape or portrait paper format?
A: You can adjust a sheet size, which will be your report size.
Q: When will combining multiple apps in one report be possible?
A: Please submit your idea in our ideation page: Ideas
Q: Is multi-format reporting on the roadmap for QRS? or will multi-format reporting stay with NPrinting only? Do you believe that eventually QRS can complete all NPrinting tasks?
A: Qlik Reporting Service is a different feature in Qlik Cloud. If you are looking for a solution in Excel format, you can use Application Automation to generate and distribute an Excel spreadsheet: Using Qlik Application Automation to create and distribute Excel reports in Office 365
Q: Can you use it for on-demand reporting?
A: You can trigger a report from the sheet by implementing an action button: Triggering an automation from a button on a sheet
Q: The actual limit of 100 reports: is that 100 runs or 100 reports? I mean, can I send one report to 200 users with 200 different filters?
A: The limitations are listed here: Qlik Reporting Service specifications and limitations
Limitations were updated on the 4th of April, 2024. See Reporting Service Packaging Changes.
Q: I heard about a limit of 100 free reports per tenant per month?
A: The limitations are listed here: Qlik Reporting Service specifications and limitations
Limitations were updated on the 4th of April, 2024. See Reporting Service Packaging Changes.
Q: Can you show us how you set the screen size of the report so that it matches the page layout?
A: You set the sheet size in the app, which will be your report size. The reporting service will try to optimize for portrait or landscape.
This article provides an overview to get started with the OpenAI connector in Qlik Application Automation.
The OpenAI connector offers developers a range of powerful natural language processing capabilities. It allows for tasks such as text generation, translating between languages, analyzing sentiment, summarizing content, and building question-answering systems. These features enable you to bring additional value to your existing automations.
Content:
Create a new automation and search for the OpenAI connector in the block library on the left side. Drag a block inside the automation editor canvas, and make sure to select the block to show the block configuration menu on the right side of the editor. Open the Connect tab in the configuration menu and provide your OpenAI API key. Visit your API Keys page to retrieve the API key you'll use in your requests.
Once the connection to your OpenAI account has been created, you can start building an automation that uses the connector.
The available blocks are:
For more details on the API, please refer to the following link.
At the time of writing this article, the Images and Audio endpoints in the OpenAI API are in beta state but can be used through the Raw API Request blocks.
This use case is based on the existing template "Analyze support ticket sentiment with Expert.ai". In this template, Expert.ai is used to predict the sentiment of new support tickets from ServiceNow. If the sentiment is deemed too negative, the automation will send an alert to a Microsoft Teams channel to inform the support team about the incident. For convenience, we'll leave out the write to MySQL and app reload part of the original template.
If you want, you could also use OpenAI to predict the sentiment instead of Expert.ai, but keep in mind that this could provide a less accurate result than Expert.ai.
Below are a couple of tips and limitations to keep in mind when working with the OpenAI connector in automations.
{
"error": {
"message": "You exceeded your current quota, please check your plan and billing details.",
"type": "insufficient_quota",
"param": null,
"code": null
}
}
Create completion block: The model parameter is the only required parameter in this block; with only a model specified, the response is essentially arbitrary. The minimum prerequisite for a more insightful response is a model, a prompt, and the Max Tokens input parameter. Other input parameters can be used to narrow down the response.
Write a summary of the following data.
<|endoftext|>
Country | Population
------- | --------
United States | 329.5 million
China | 1.444 billion
India | 1.38 billion
Completion:
This data shows the population of three of the largest countries in the world. The United States has a population of 329.5 million, China has a population of 1.444 billion, and India has a population of 1.38 billion.
Max Tokens: Specifies the maximum number of tokens in the generated completion response. More information about how the token count is calculated by OpenAI can be found here: Tokenizer.
Example: 50
The following parameters are optional in most use cases but could be used to fine-tune the response:
- Temperature (example: 0.6)
- Top P, or nucleus sampling, which selects the most likely tokens until the cumulative probability exceeds the threshold (example: 0.8)
- N (example: 3)
- Best Of (example: 2)
- Presence Penalty (example: 0.6)
- Frequency Penalty (example: 0.4)
- Logprobs (example: 5)
- User (example: "12345")
The Amazon Bedrock connector is currently being updated to reflect the API endpoint change by Amazon. We expect the update to be completed on the 12th of March.
Amazon Bedrock is a fully managed service that makes base models from Amazon and third-party model providers accessible through an API.
This article explains how the Amazon Bedrock connector in Qlik Application Automation can be used within Qlik Cloud.
Content:
By default, users and roles don't have permission to create or modify Bedrock resources. They cannot perform tasks using the AWS Management Console, AWS Command Line Interface (AWS CLI), or AWS API. To grant users permission to perform actions on the resources that they need, an IAM administrator can create IAM policies. The administrator can then add the IAM policies to roles, and users can assume the roles.
To learn how to create an IAM identity-based policy by using these sample JSON policy documents, see Creating IAM policies in the IAM User Guide.
The Amazon Bedrock connector consists of the following blocks:
The Application Automation Mail block's run history gives the following error:
Malformed UTF-8 characters
This indicates a problem with your email server provider.
Please use a different mailbox on an email server with simple authentication, e.g., Gmail (see the link below).
These are email server issues that must be rectified by the email administrator or by changing the email service provider used for Application Automation email delivery.
I cannot connect my SMTP server in the Qlik Applic... - Qlik Community - 1984572
QB-24626
The Amazon Lambda connector allows Qlik Application Automation in Qlik Cloud Services to easily launch Lambda functions, which simplifies data-driven automation.
Amazon Lambda runs code without the need to maintain servers, is triggered by events, and is charged only for the compute time used.
This article explains how the Amazon Lambda connector in Qlik Application Automation can be used within Qlik Cloud.
Content:
By default, users and roles don't have permission to create or modify Lambda resources. They cannot perform tasks using the AWS Management Console, AWS Command Line Interface (AWS CLI), or AWS API. To grant users permission to perform actions on the resources that they need, an IAM administrator can create IAM policies. The administrator can then add the IAM policies to roles, and users can assume the roles.
To learn how to create an IAM identity-based policy by using these sample JSON policy documents, see Creating IAM policies in the IAM User Guide.
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "InvokeFunctions",
"Effect": "Allow",
"Action": [
"lambda:InvokeFunction",
"lambda:InvokeAsync"
],
"Resource": "*"
},
{
"Sid": "ListFunctions",
"Effect": "Allow",
"Action": "lambda:ListFunctions",
"Resource": "*"
}
]
}
The Amazon Lambda connector consists of the following blocks:
The Amazon Lambda connector allows calling Lambda functions, which can be programmed for diverse tasks, including:
The Amazon Lambda connector allows you to write App data back to the source database. If clients do not have access to their firm's database, the company may offer an Amazon Lambda function. We can use the Amazon Lambda connector to call this function, which will then update the database.
Refer to How to build a write back solution with native Qlik Sense components and Qlik Application Automation for instructions on how to configure the writeback. In that automation, instead of using the JIRA connector, use the Amazon Lambda connector's Invoke Function block, as illustrated:
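As a concrete illustration, the Lambda function called by the Invoke Function block might look like this minimal, hypothetical writeback handler. The event shape, table, and field names are assumptions; a real function would execute the statements through a database client rather than just counting them:

```python
def lambda_handler(event, context):
    """Hypothetical writeback handler: turn rows sent by the automation's
    Invoke Function block into parameterized UPDATE statements."""
    statements = []
    for row in event.get("rows", []):
        # Parameterized SQL; a real function would run this via a DB client
        statements.append((
            "UPDATE comments SET comment = %s WHERE record_id = %s",
            (row["comment"], row["id"]),
        ))
    return {"statusCode": 200, "updates": len(statements)}

# Example invocation payload, as the automation might send it
event = {"rows": [{"id": 1, "comment": "Approved"}]}
print(lambda_handler(event, None))
```

The automation passes the payload as the Invoke Function block's input, and the function's return value flows back into the automation as the block's output.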
See Move hydrated apps between tenants with third-party tools for how to migrate apps between tenants using GitHub and GitHub Actions. Lambda functions can accomplish comparable functionality.
Replace the "Github" blocks in the automation detailed in the aforementioned post with the "Invoke Function" block.
Include the following logic in the Lambda function:
Automations do not support external libraries; however, you can use a Lambda function when a sophisticated transformation requiring external libraries needs to be performed during the automation.
You can adhere to the following in Automation:
This article explains how the Qlik Reporting connector in Qlik Application Automation can be used to generate a bursted report that delivers recipient-specific data.
For more information on the Qlik Reporting connector, see this Reporting tutorial.
This article offers two examples where the recipient list and field for reduction are captured in an XLS file or a straight table in an app. Qlik Application Automation allows you to connect to a variety of data sources, including databases, cloud storage locations, and more. This allows you to store your recipient lists in the appropriate location and apply the concepts found in the examples below to create your reporting automation. By configuring the Start block's run mode, the reporting automations can be scheduled or driven from other business processes.
In this example, the email addresses of the recipients are stored in a straight table. Add a private sheet to your app and add a straight table to it. This table should contain the recipients' email address, name, and a value to reduce the app on. We won't go over the step-by-step creation of this automation since it's available as a template in the template picker under the name "Send a burst report to email recipients from a straight table".
Instead, a few key blocks of this template are discussed below.
In this example, the email addresses of the recipients are stored in an Excel file. This can be a simple file that contains one worksheet with headers on the first row (name, email & a value for reduction) and one record on each subsequent row.
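To make the recipient loop concrete, here is a small Python sketch of how such a list drives per-recipient reductions. CSV is used instead of Excel to keep the sketch dependency-free, and the header names (name, email, region) and the Region field are assumptions:

```python
import csv
import io

RECIPIENTS = """name,email,region
John Doe,john@example.com,US
Jane Doe,jane@example.com,CA
"""

def recipient_selections(csv_text, reduce_field="Region"):
    """For each recipient row, pair the email address with the field
    selection the report should be reduced on."""
    jobs = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        jobs.append({
            "to": row["email"],
            # One selection per recipient, applied before generating the report
            "selection": {"field": reduce_field, "values": [row["region"]]},
        })
    return jobs

for job in recipient_selections(RECIPIENTS):
    print(job["to"], "->", job["selection"]["values"])
```

In the automation, each entry corresponds to one loop iteration: apply the selection to the report, generate it, and email it to the recipient's address.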