https://groups.google.com/forum/#!topic/gurobi/zdO8cqJIIv4.
My usual rule of thumb is that up to 10 million equations is doable on normal PC hardware. This problem is 15,000 times that size. I am using the US notion of a billion here (i.e. \(10^9\) rather than \(10^{12}\)).
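Spelled out, the ratio is just: \[ \frac{1.5\times 10^{11}}{10^{7}} = 1.5\times 10^{4} = 15{,}000. \]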
In some cases I feel models are overly detailed. When results don't match expectations, the answer is often: "we need to add more detail." I am convinced this is not always the correct conclusion. Of course, building and solving large models is sexier than working with small ones, but much insight can be gained from really small models.
A problem with \(1.5 \times 10^{11}\) equations probably needs a ton of memory, say on the order of hundreds of terabytes. Is there a machine with several hundred terabytes of memory? The Titan supercomputer has about 500 terabytes of memory (the link expresses memory in tebibytes; I had to look that up).
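To get a feel for where "hundreds of terabytes" comes from, here is a rough back-of-envelope sketch. The density and per-nonzero storage figures are assumptions I picked purely for illustration; real models can be much sparser or denser, and the modeling system and solver add their own copies and overhead on top.

```python
# Back-of-envelope memory estimate for a model with 1.5e11 constraints.
# The per-row figures are assumptions for illustration only, not numbers
# from the forum post.

rows = 1.5e11              # constraints (equations)
nonzeros_per_row = 100     # assumed average density of the constraint matrix
bytes_per_nonzero = 12     # assumed: 8-byte double value + 4-byte column index

matrix_bytes = rows * nonzeros_per_row * bytes_per_nonzero
print(f"constraint matrix alone: {matrix_bytes / 1e12:.0f} TB")
# -> roughly 180 TB, before counting variables, bounds, the modeling
#    system's own copy of the data, presolve, or factorization fill-in.
```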