Wayyyyy back in April I attended the INFORMS Conference on Business Analytics and Operations Research in Chicago. This conference was renamed from the "INFORMS Practice Conference on Operations Research" to broaden its appeal: operations research is essentially synonymous with prescriptive analytics. Apparently the rebranding worked: the conference had its highest attendance ever (729 attendees: 75% industry, 20% academic, 5% military/government).
There are basically four things I like to do at INFORMS: software workshops, exhibits, the Edelman awards, and presentations. Here are my notes.
Please note: this post is for informational purposes only and does not endorse any particular product. If this post is interpreted that way then I will have to take it down!
My first workshop was by Palisade, makers of the @Risk Excel add-in for Monte Carlo simulation. The format was a walkthrough of @Risk’s features. @Risk is an add-in that provides a ribbon interface for defining, running, and reporting on simulations; custom Excel formulas for Monte Carlo simulation; and custom UI for visualizing results. Simulations are created from standard Excel spreadsheets. An input cell can be associated with a probability distribution by clicking the appropriate button in the ribbon. Output cells can be turned into simulation outputs (forecasts) in a similar fashion. These actions modify the cells by wrapping the original formulas inside custom Palisade formulas.
Running a simulation amounts to recalculating the spreadsheet multiple times. The results are collected by @Risk and surfaced to the user in various ways. If you’re familiar with other simulation tools such as Crystal Ball or Risk Solver, none of this is new. Some notes about @Risk:
- It’s easy to use. For example, the standard output is a histogram chart with summary statistics on the side. The histogram has two sliders that divide it into three vertical slices. At the top are the percentages of scenarios that fall in each slice. So if I want to see what percentage of the time my portfolio is underwater, I just drag a slider to 0. If I want to compare the results of multiple simulations, I click a button and see the histograms overlaid.
- Choosing the right number of trials for a simulation can be tricky. @Risk has an "automatic" option that simply tests certain summary statistics every 100 iterations and stops when they stop bouncing around. Not foolproof, but simple.
- Simulations run through the Excel calc chain, so calculation inconsistencies are avoided.
- @Risk provides simple but useful reporting capabilities. Click a button and the relevant histograms, scatter plots, and summary statistics are written to a worksheet, with the print area formatted to fit on a single page.
As the presenter himself said, the promise is not "deep statistics" but ease of use and flexibility.
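The loop described above — sample the input distributions, recalculate the "spreadsheet," stop when the statistics settle, and read percentages off the histogram slices — can be sketched in a few lines of Python. This is my own toy two-asset portfolio with made-up numbers, not anything from Palisade; the stopping rule mimics @Risk's "automatic" trials option, and the final percentage is what the histogram slider at 0 would show.

```python
import random
import statistics

def portfolio_return(rng):
    # Toy "spreadsheet": two uncertain input cells feeding one output cell.
    # (Hypothetical distributions and weights, chosen for illustration.)
    equities = rng.gauss(0.07, 0.15)  # mean 7%, stdev 15%
    bonds = rng.gauss(0.03, 0.05)     # mean 3%, stdev 5%
    return 0.6 * equities + 0.4 * bonds

def simulate(seed=42, batch=100, tol=1e-4, max_trials=100_000):
    """Run trials in batches of 100; stop when the running mean stops
    bouncing around, in the spirit of @Risk's automatic stopping option."""
    rng = random.Random(seed)
    results = []
    prev_mean = None
    while len(results) < max_trials:
        results.extend(portfolio_return(rng) for _ in range(batch))
        mean = statistics.fmean(results)
        if prev_mean is not None and abs(mean - prev_mean) < tol:
            break
        prev_mean = mean
    return results

results = simulate()
underwater = sum(r < 0 for r in results) / len(results)
print(f"{len(results)} trials, P(return < 0) = {underwater:.1%}")
```

A real tool would of course monitor more than the mean (percentiles, standard deviation) and report far richer output, but the skeleton is the same: repeated recalculation plus bookkeeping.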
My second workshop was by Gurobi Optimization. Since MIP is the most important category in the world of solvers, I always like to keep up-to-date on what’s going on with Gurobi. (Gurobi supplies the default MIP solver for Solver Foundation.)
Some stats from the session: over 6,600 free academic licenses are in use, 60% outside the US and 50% outside the OR community.
Gurobi Cloud has been around for a couple of years. They have discovered a few distinct types of cloud customers (quoting):
- "People dedicated to cloud computing"
- "People who want access to powerful computers at a moment’s notice"
- "People who wanted a cheaper way to get access to Gurobi, especially when they only need it occasionally"
So they’re introducing "pay-by-the-day" licenses: you pay only for the days you need Gurobi, at $200/day. The installation procedure is the same as for the non-cloud version: purchase a license key and drop it in a folder.
My third workshop was by Ziena Optimization, who build the Knitro suite of solvers for nonlinear (differentiable) optimization. In brief:
- It was a practical, practitioner-oriented session that showed all of the ins and outs of working with Knitro.
- It was great to see the presentation on the Solver Foundation connector for Knitro.
- Attendees want second derivative (Hessian) support in MSF. The MSF team received the feedback!
- Another attendee would like Solver Foundation to have better presolve (linear and nonlinear). I agree this is a shortcoming.
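To see why solvers (and their users) care about exact second derivatives, here is a generic textbook sketch — my own example, not Knitro or MSF code. Newton's method uses the exact second derivative to take curvature-aware steps, which is what makes it converge so quickly near a minimum; without Hessian support, a framework has to fall back on approximations.

```python
import math

def newton_minimize(f1, f2, x0, tol=1e-10, max_iter=50):
    """Newton's method for 1-D minimization, given the exact first
    derivative f1 and exact second derivative f2."""
    x = x0
    for _ in range(max_iter):
        step = f1(x) / f2(x)  # curvature-scaled step
        x -= step
        if abs(step) < tol:
            break
    return x

# Minimize f(x) = e^x - 2x, so f'(x) = e^x - 2 and f''(x) = e^x.
# The exact minimizer is x = ln 2.
xstar = newton_minimize(lambda x: math.exp(x) - 2,
                        lambda x: math.exp(x),
                        x0=0.0)
print(xstar)  # ≈ 0.6931 (ln 2)
```

Starting from x0 = 0, this converges to ln 2 to machine precision in a handful of iterations — the quadratic convergence that exact second-derivative information buys you.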
I visited nearly all of the vendor booths and I have some marketing and technical materials for many of them in my office. Here I will only make note of a few:
IBM had a huge, three-panel booth. IBM has a broad-based pitch aimed squarely at businesses that require analytics.
NAG is a UK-based company that builds numerical libraries of various kinds, including optimization solvers.
Frontline Systems (makers of the Excel Solver as well as Risk Solver Platform) had a large booth and a strong presence.
Finally, a few notes on the presentations:
- 18 talks on supply chain management.
- The "Analytics Process" track was the most popular.
- There was a great Walmart talk that described how they use analytics in all phases of the business (including HR). The most interesting part was a side discussion about R. The Walmart people hate R and use SAS instead. They flatly refuse to use R because they don’t fully trust its results, and they don’t know where to turn for support. IMHO they don’t get the value of R, but on the other side, the R people don’t always understand the requirements of business: support, accountability, and so on.
There were six finalists for the Edelman Award, presented on Monday night. The organizers deem the Edelmans the "Super Bowl of Operations Research". In each case, the organization implementing the solution was assisted by another company who supplied modeling and software expertise.
Each of these entries is an outstanding example of the importance and impact of operations research. The dollar amounts (in realized revenue or cost savings) run to many, many millions.
I refer you to https://live.blueskybroadcast.com/bsb/client/CL_DEFAULT.asp?Client=569807&PCAT=3074&CAT=3075 for descriptions and video.