Thursday, May 6, 2010

Large optimization models in Excel?

I was asked to give my opinion on implementing a fairly large and complex optimization model in Excel. Excel is an extremely popular tool in business environments, and there are often people around with the skills and experience to do amazing things with it. Even in some academic circles the idea of doing serious optimization modeling in Excel is put forward: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1088420.

In my experience it is often a good idea to use Excel where it really excels: data entry and reporting. The more complicated data manipulation and the implementation of the model equations are often better delegated to specialized modeling software.

I often use Excel as a front-end for optimization models, but I try to move the data as quickly as possible to the modeling system, where data preparation (and checking) can be done in a more systematic way. Similarly, I try to prepare the output data in the modeling system and pass it back to Excel as late as possible for reporting (display and graphing).
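
As an illustration of this round trip, here is a minimal sketch in Python, assuming pandas for the Excel I/O and SciPy's linprog as a stand-in for the modeling system; the workbook, sheet, and column names are hypothetical and not taken from any actual model.

```python
# Sketch only: Excel as front-end, the model and the checks live in code.
# Hypothetical layout: "model_data.xlsx" has a sheet "products" with columns
# "product", "profit", "machine_hours", and a sheet "capacity" with a column "hours".
import pandas as pd
from scipy.optimize import linprog

# 1. Move the data out of Excel as early as possible.
products = pd.read_excel("model_data.xlsx", sheet_name="products")
capacity = pd.read_excel("model_data.xlsx", sheet_name="capacity")["hours"].iloc[0]

# 2. Do the data checks systematically, not in spreadsheet formulas.
assert (products["machine_hours"] > 0).all(), "non-positive machine hours"
assert products["product"].is_unique, "duplicate product names"

# 3. Solve a small LP: maximize profit subject to one capacity constraint.
#    linprog minimizes, so the objective is negated.
res = linprog(
    c=-products["profit"].to_numpy(),
    A_ub=[products["machine_hours"].to_numpy()],
    b_ub=[capacity],
    bounds=[(0, None)] * len(products),
)
assert res.success, res.message

# 4. Prepare the results here and hand them back to Excel only for reporting.
report = products.assign(production=res.x)
report.to_excel("model_results.xlsx", sheet_name="plan", index=False)
```

The point is not these particular libraries but the division of labor: Excel only holds the input data and the final report, while the checks and the model itself live outside the spreadsheet.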

Of course, where to draw the boundary between Excel and the modeling system depends on several criteria. Sometimes it is dictated by technology: a modeling system that does not allow you to do data manipulation forces you to do more in Excel, while more capable systems don't have this restriction.

An interesting book about Excel is Professional Excel Development. It takes a somewhat different approach from traditional programming books.

2 comments:

  1. I tend to agree with your assessment completely. Excel for serious optimization is a terrible, terrible joke. Excel modeling has no notion of abstract model frameworks, branching, and looping over data. As a result, the models are frail, rigid, inflexible, and aren't able to scale.

    On the other hand, I do like Excel for outputting results. It can highlight data and put it into an aesthetically pleasing form very easily. In my opinion, outputting the data in other formats would often be more work: setting up a connection to reportlab or LaTeX would take quite a bit of effort, would be something of a one-off job that couldn't be reused, and you couldn't use the resulting .pdf for back-of-the-envelope calculations (e.g. calculating a ratio from reported numbers in a spare cell).
  2. Having taught optimization modeling to MBAs using Excel, and having graded the homework, I can say that auditing small models in Excel is tedious but acceptable. Auditing realistically large models would be quite difficult (and hence would probably not happen). Putting Excel between the user and the actual modeling system makes sense in many contexts, but actually doing anything beyond small models in Excel is dicey.