We have one table in MySQL with 10 million records, and we reload the full table every day. I keep getting a read-failure error, even though I can run the same query manually. Are there any tricks for loading large tables?
How are you reloading the file? Publisher or . . . .
Stephen
Publisher at night, but I made a schema change, so I'm reloading manually right now. I tried locally, then I tried manually from the server to use more resources and improve stability. Both gave the same read error.
We have the same problem loading tables from a Microsoft Access database. We created a batch file (.bat) and scheduled it to run.
The batch file is:
psexec -u UserName -p Password "c:\Program Files (x86)\QlikView\qv.exe" -r "Q:\Path\QlikViewDocument.qvw"
The problem seems to be that Publisher has trouble reloading over an ODBC connection.
Stephen
Strange, ours seems to be the opposite: Publisher loads fine, but a manual reload over ODBC doesn't. So it doesn't seem to be a Publisher issue.
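For what it's worth, one generic trick for tables this size (not QlikView-specific, and the table name below is made up for illustration) is to split the reload into smaller LIMIT/OFFSET queries so that no single read runs long enough to hit a driver or network timeout. A minimal Python sketch of the idea:

```python
def chunked_queries(table, chunk_size, total_rows):
    """Yield SELECT statements that fetch `table` in pieces of `chunk_size`
    rows, so each individual query stays short and is less likely to hit
    a read timeout. Table/row counts here are hypothetical."""
    for offset in range(0, total_rows, chunk_size):
        yield f"SELECT * FROM {table} LIMIT {chunk_size} OFFSET {offset}"

# Example: reload a 10-million-row table in 1-million-row chunks.
for query in chunked_queries("big_table", 1_000_000, 10_000_000):
    print(query)  # in practice, execute each query and append the rows
```

One caveat: MySQL's OFFSET gets slower as the offset grows, so for very large tables keyset pagination (a `WHERE id > last_seen_id ORDER BY id LIMIT n` loop on an indexed column) usually performs better than plain LIMIT/OFFSET.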