Not applicable

QVD file storing with less data on a random basis

Hi,

I am facing an issue when loading data from an Oracle DB through a script and storing it into a QVD. The QVD is frequently being stored with a random amount of data.

For example: if yesterday the QVD size was around 4 GB, today the same QVD is stored at only around 300 MB, which results in a huge loss of data.

A few days ago I hit an issue where fewer data rows were fetched from a DB table on different days (bug id 47189). QlikTech has already provided a patch fix for that.

But now the QVD is being stored with less data while the task fails in QMC. The log file shows "script execution failed" after the SQL statement, and no STORE command is executed.

Could anyone please help me out in this case?

Here is the code I am using--

------------------------------------------------

ITEMDIM:
LOAD ID as ItemKey,
    Text(left(SN,8)) as ITEMTAC,
    Text(mid(SN,9,4)) as ITEMMIDDLE,
    Text(mid(SN,13)) as ITEMREST;

SQL SELECT *
FROM "IM_OWNER".ITEM;

store ITEMDIM into ..\QVD\itemdim.qvd;

drop table ITEMDIM;

-------------------------------------------------------------

8 Replies
Miguel_Angel_Baeyens

Hi,

Is that the same exact script you used in previous developments? Are fields case sensitive as they come from the database? Did you try deleting the QVD file and reloading the script again?

Hope that helps.

Miguel

Not applicable
Author

Hi Miguel,

Thanks for the reply.

Yes, this is the exact script that I load every day. The fields don't seem to be case sensitive, as I tried running the same SQL query directly in the database.

Deleting the QVD did help on a number of occasions. But thinking along those lines, I am sure that no other application is using the same QVD, so it cannot be locked by some other process.

Also, I am not able to predict on which occasion the QVD will be stored at what size.

For example: on 2-3 consecutive days it can be stored as 300 MB, and suddenly on the next day the size changes back to the original 4 GB.

Please note that on whichever day the QVD size is 4 GB, the task gets executed successfully, but the task always fails when the QVD size is 300 MB or 200 MB.

Miguel_Angel_Baeyens

Hi,

Did you try limiting the number of records you pull from the database? Does that happen in all cases even when you retrieve only 100 records?

As a side note, I meant that field names in QlikView are case sensitive. The query analyzer you are using may be case insensitive but if the field in the database is stored as "SN", then the LOAD part must use "SN" uppercase.
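For instance, a minimal sketch (the field and table names are just the ones from your script, assumed to be stored in uppercase in Oracle):

// "SN" in the LOAD must match the casing delivered by the driver
TMP:
LOAD Text(left(SN,8)) as ITEMTAC;
SQL SELECT ID, SN FROM "IM_OWNER".ITEM;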

Hope that helps.

Miguel

Not applicable
Author

Hi Miguel,

I have tried a number of times by limiting the data to 100 or 200 records only, and doing so I never faced this problem. I only face the issue when the file undergoes a full refresh. The number of records loaded in the latest full refresh was 241,483,608.

The main problem is that on a corrupted reload the log file shows nothing for the number of records fetched, so I cannot track it on each refresh.

And the field names are the same in the LOAD statement as they are in the database table.

Not applicable
Author

Hi,

Do you have a timestamp in every row in the table "IM_OWNER".ITEM?

If you did you would be able to store QVDs containing the data for each day rather than reloading 238M rows every day...

eg: store ITEMDIM into ..\QVD\itemdim-$(day).qvd

Once you have this you can concatenate your QVDs to restore the full dataset.
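A minimal sketch of how the daily QVDs could be pulled back together (the file names follow the example above and are only illustrative):

// Wildcard load auto-concatenates all daily slices with identical fields
ITEMDIM_ALL:
LOAD * FROM ..\QVD\itemdim-*.qvd (qvd);

STORE ITEMDIM_ALL INTO ..\QVD\itemdim.qvd (qvd);
DROP TABLE ITEMDIM_ALL;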

Then you should make a QVD control interface to monitor the number of rows in each QVD.

Have you considered this solution? Dealing with 4 GB QVDs doesn't seem ideal.

Hint: investigate incremental reload possibilities and get the QlikView Cookbook template.
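As a rough sketch of such a control step in the reload script (NoOfRows() is a standard script function; the threshold and TRACE messages are just assumptions you would tune to your own data):

// Log the row count and refuse to overwrite the QVD if the reload looks truncated
Let vRows = NoOfRows('ITEMDIM');
TRACE Rows loaded into ITEMDIM: $(vRows);

If $(vRows) < 200000000 then   // expected ~241 M rows; pick your own floor
    TRACE Row count suspiciously low - keeping the previous itemdim.qvd;
    Exit Script;
End If

STORE ITEMDIM INTO ..\QVD\itemdim.qvd (qvd);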

Not applicable
Author

Hi All,

Thank you very much for your valuable inputs. This is much appreciated.

But in this case, the problem is a bit different:

The rows corresponding to the field ID get updated on a daily basis in the database table, which doesn't have any date field or flag field that could be used to load the data incrementally.

Also, since any row in the database can be updated on a daily basis, and the data fetched into itemdim.qvd is used as reference data by a number of applications, a full refresh of this particular data table is needed.

And as none of the columns in the DB table carry any marker, we cannot load the data by breaking it up into parts and storing them in different QVDs.

I have run this application a number of times in QV9 and never faced this problem. I have been facing it since I started using the QV11 client to fetch the same data. Is it something related to QV11?

Not applicable
Author

Hi,

I had an issue with the left/right function when I upgraded to V11.

You should check whether the SN data type is the same as in V9... QV11 messed up some data types for me and I had to change some string functions.
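If SN now comes through as a numeric value, one workaround (just a sketch based on the LOAD from the first post, not a confirmed fix) is to force it to text before the string functions:

LOAD ID as ItemKey,
    Text(left(Text(SN),8))  as ITEMTAC,     // wrap SN in Text() first, in case the
    Text(mid(Text(SN),9,4)) as ITEMMIDDLE,  // driver now returns it as a number
    Text(mid(Text(SN),13))  as ITEMREST;
SQL SELECT *
FROM "IM_OWNER".ITEM;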

Not applicable
Author

Hi,

We have a similar problem in our DB.

The solution I found was using the Oracle system change number (SCN). This is an internal field in Oracle which keeps track of changes. Depending on the configuration it will track changes at row level or cluster level (I think it's cluster; it has been a while since I read the documentation). Most likely it will be configured at cluster level, as this is less taxing on the system. Anyway, it should be better than loading the entire table each time.

For some more info :

https://blogs.oracle.com/security/entry/learning_more_about_oracle_database

An example of how I used it:

Incr:
// Get the highest SCN already stored in the full QVD
LOAD MAX("ORACLE_SCN") as "ORACLE_SCN"
FROM $(vLocalDataPath)QVD\Table_Full.QVD (qvd);

Let vSCN = PEEK('ORACLE_SCN', -1, 'Incr');

DROP TABLE Incr;

Table:
// Fetch only the rows changed since the last reload
SQL SELECT ID, ....
     ora_rowscn "ORACLE_SCN"
FROM Table
WHERE ora_rowscn > $(vSCN);

// Add back the unchanged rows from the previous full QVD
CONCATENATE (Table)
LOAD *
FROM $(vLocalDataPath)QVD\Table_Full.qvd (qvd)
WHERE Not Exists(ID);

STORE Table INTO $(vLocalDataPath)QVD\Table_Full.qvd;