coleturegroup
Contributor III

Store a variable to a file for use via $(Must_Include) in other Apps

I'm attempting to standardize reusable scripting across various dashboards; later, those apps would pull this type of information in via a Must_Include statement, for consistency across apps. Here is my failed approach:

Set vVariableName= "

Table1:
load * from Tble1 qvd;     // pseudo code 🙂

Table2:
load * from Tble2 qvd;

";

store $(vVariableName) into lib://FolderConnection/myVariablefile.txt (txt);

The problem seems to be in the store command: it doesn't accept $(vVariableName).

Any ideas?

3 Replies
Vegar
MVP

What are you trying to achieve with your script?

Right now it looks like you are trying to run this statement:

Store

Table1:
load * from Tble1 qvd; // pseudo code 🙂

Table2:
load * from Tble2 qvd;

into lib://FolderConnection/myVariablefile.txt (txt);

It is not a valid script.

 

If you explain what you are trying to achieve then someone in the community can help you sort out what you need to do.

coleturegroup
Contributor III
Author

Hmmmm, ok, let's try this again; sorry for any confusion. Agreed, it is not valid script code - it's an example I'm using to tell a story. What I'm trying to show is reuse of tested script code that can be distributed to other developers via include statements, so they consume tested data in focused areas. An example might be script that combines patient information from multiple tables, or lab information from multiple tables, and so on.

So with that said, I want to hold the multi-table load script in one variable and store the contents of that variable in a txt file to be included by other developers. Notice the Set vVariableName= " ...... "; holding the script below, assuming the script will be valid.

Set vVariableName= "

Table1:
load * from Tble1 qvd;

Table2:
load * from Tble2 qvd;

Table3:
load * from Tble3 qvd;

";

Then anyone needing patient information doesn't have to know which tables they need to pull; they only need to include the txt file and all required tables will be loaded in their script, on demand.
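
For reference, the consuming side would then be a single include line in each developer's script (using the same folder connection and file name as in my first post):

// pulls the shared, tested load script into this app; the reload fails if the file is missing
$(Must_Include=lib://FolderConnection/myVariablefile.txt);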

marcus_sommer (Accepted Solution)

It seems that different things are being mixed here - namely the creation and the calling of include-variables. If I understand you right, you are trying to put a load statement into a variable and then store this variable as a txt file. But this isn't possible, because a variable can't be stored externally - only tables can. This means you need to load the variable into a table, maybe something like this:

Set vVariableName= "

Table1:
load * from Tble1 qvd;

Table2:
load * from Tble2 qvd;

Table3:
load * from Tble3 qvd;

";

t: load '$(vVariableName)' as [/* load of xxx */] autogenerate 1;
store t into t.txt (txt);
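
A side note on the sketch above: the field alias /* load of xxx */ presumably becomes the header row of the stored file, where it reads as a script comment, so an app that later includes the file executes only the variable content. Also, if this runs in Qlik Sense rather than QlikView, the store will likely need a folder-connection path; reusing the connection and file name from the original post, something like:

// connection and file name assumed from the original post
store t into [lib://FolderConnection/myVariablefile.txt] (txt);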

But I'm not sure such an approach is really practical, because if you write a load statement, why not copy & paste it directly into the txt file? Personally I mostly do it this way: develop the script in QlikView, test the results there, and if it works as expected and is really reusable across multiple apps - and therefore worth centralising - then do the copy & paste.

Beside this, I have some doubts that it's expedient to share (parts of) data models in this way. The reasons are the static content of these includes (of course there are ways to make them dynamic by using sub-routines and/or further variables - but that creates more complexity for your intended end-users, which is exactly what you wanted to avoid with the include-variables), and the fact that any other content within the calling script could have an impact and therefore carries the risk that something goes wrong. I assume the aim is something like a light switch - on/off and nothing can go wrong - but that's not true. It will depend ...

Therefore I suggest providing the qvd layer directly to the application developers - they are skilled enough to choose the ones they want - and/or providing final data models which could be used directly, and/or per binary load from the reporting layer, with/without section access and/or on-top adjustments after the binary load.
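
For readers less familiar with the last option: a binary load copies the complete data model of another document into the current app and must be the very first statement in the script. A minimal sketch with a made-up document name:

// Must be the first statement in the load script; the document name is hypothetical.
// In QlikView this points to a .qvw file; in Qlik Sense you would reference an app instead.
Binary PatientDataModel.qvw;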

- Marcus