Recently I was involved in a few models where GAMS was generating a lot of results, exceeding what we could easily store in Excel or Access. Here is an alternative: store the results in SQL Server.
Here is a small two-line example script that shows how we can efficiently import a very large parameter (8 million records) residing in a GDX file into a SQL Server table.
$call gdx2txt.exe in=x.gdx id=p out=p2.txt tab
The first line exports the parameter from the GDX file to a tab-delimited text file. The gdx2txt tool is a little more convenient than using GAMS PUT statements, and it is also faster (at the same precision, more than 5 times as fast).
The second line calls the BCP utility to import this text file. The hints that set the batch size and request a table lock improve performance.
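The second line itself is not shown above; a sketch of what such a BCP call could look like is below. The database, table, and server names, and the exact batch size, are assumptions; the `-b` (batch size) and `-h "TABLOCK"` (table-lock hint) options correspond to the hints mentioned in the text.

```
* Sketch only: server (-S .), table name and batch size are assumptions
$call bcp testdb.dbo.result in p2.txt -S . -T -c -b 100000 -h "TABLOCK"
```

Here `-T` uses a trusted (Windows) connection, and `-c` tells BCP to read character data with the default tab field terminator, matching the output of gdx2txt.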
The log looks like:
--- Job gdx2txt3.gms Start 02/24/12 23:09:02 WEX-WEI 23.7.3 x86_64/MS Windows
GDX2TXT: 7800000 records (elapsed 11201 millisec)
8000000 rows copied.
The total turnaround time for this test data was 35 seconds: about 11 seconds for GDX2TXT and 24 seconds for BCP. So the database import takes only about twice as long as generating the text file. That is very fast indeed. The table had four columns: three GAMS indices (converted to type NCHAR(4)) and a value (type REAL). No (SQL) indexes or keys were defined on the table, so the inserts could proceed at optimal speed. Also, the database files were pre-sized so they did not need to grow during the inserts. In practice, the BCP timings may therefore be a bit slower than shown here.
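For reference, a table matching this description could be created as follows. The table and column names are assumptions; only the column types (NCHAR(4) for the indices, REAL for the value) follow from the text.

```sql
-- Sketch: names are assumptions, types follow the text above
CREATE TABLE testdb.dbo.result (
    i1    NCHAR(4) NOT NULL,  -- first GAMS index
    i2    NCHAR(4) NOT NULL,  -- second GAMS index
    i3    NCHAR(4) NOT NULL,  -- third GAMS index
    value REAL     NOT NULL   -- parameter value
);
-- Deliberately no PRIMARY KEY or indexes: they would slow down the bulk insert.
```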
Note: dropping all rows with “DELETE FROM testdb.dbo.result” can take a long time (1.5 minutes on my machine). “TRUNCATE TABLE testdb.dbo.result” is much faster.