Hello,
I needed to read a whole table from the database (around 80,000 rows). I used DBS_READ with maxRows set to -1. The system returned one row with NULL values and retCode 35. I figured out that there is a restriction on the maximum number of rows for DBS_READ, and I think for SQL_SELECT too. So I read the table in a loop with DBS_READ and afterwards updated the rows I needed. I am just looking for advice: is there a better way to read that amount of data, and if the table contained even more rows, what would that do to the reading speed?
As the DBS_READ documentation says:
The value of the parameter maxRows determines the maximum number of rows. If this value is set to -1, the system will attempt to read all rows matching the specified condition. The maximum number of rows is limited by the Database configuration parameter Maximum returned rows. If this number is exceeded, the first Maximum returned rows rows will be stored in the local structure localStructIdent and retCodeIdent_Int will be set to _ERR_DATABASE_ROWS_LIMIT.
So, the limit (10,000 rows by default) can be changed on the Database object. However, we recommend using the page-oriented functions instead, which read data from a table in pages (of a user-defined size, e.g. 1000 rows). This way you don’t have to wait for all the data to be available (and you don’t have to hold it all in the memory of the DbManager/Event processes) — you read and process it in chunks.
Please see the documentation for:
PG_CONNECT
PG_READ
PG_DISCONNECT
There is also an example of using the PG_* functions available in the documentation.
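To illustrate the idea, here is a rough sketch of the paging loop in ESL. Note that the exact parameters and order of the PG_CONNECT / PG_READ / PG_DISCONNECT actions, the structure definition SD.MyTableRow, the database table object DB.MyTable, and the error-handling details below are illustrative placeholders — please follow the reference documentation and the shipped example for the real syntax:

```
 ENTRY ReadTableByPages
   INT  _pageId    ; handle of the opened page cursor (illustrative)
   INT  _retCode
   RECORD (SD.MyTableRow) _rows   ; local structure matching the table's row definition

   ; open a paged cursor with a page size of 1000 rows
   ; (parameter names/order are placeholders — see PG_CONNECT docs)
   PG_CONNECT _pageId, DB.MyTable, 1000, _retCode
   IF _retCode # _ERR_NO_ERROR THEN
     ; connection failed — handle the error and leave
     RETURN
   ENDIF

   DO_LOOP
     ; read the next page of up to 1000 rows into _rows
     PG_READ _pageId, _rows, _retCode
     EXIT_LOOP _retCode # _ERR_NO_ERROR   ; no more pages or an error occurred

     ; ... process _rows here (e.g. update the rows you need) ...
   END_LOOP

   ; always release the cursor when done
   PG_DISCONNECT _pageId, _retCode
 END ReadTableByPages
```

Because each page is processed and then discarded, the script's memory footprint stays bounded by the page size rather than the table size, and processing can start as soon as the first page arrives.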