Why don’t mapping services use robust optimization?

It sure would be great if mapping services like Google Maps and Bing Maps would make better use of robust optimization. I know from being “on the inside” that the techniques used to find the best route from A to B are incredibly sophisticated.  It’s far more complicated than simply solving a shortest path problem; they prune silly solutions using a variety of techniques. Mapping services do an outstanding job with directions most of the time, but of course that won’t stop me from complaining. It may very well be that Google/Bing/etc already do this kind of thing and my assumptions are wrong – in which case this blog post will look pretty silly. I guess that’s a risk I will have to take.

Suppose you’re planning a trip from Evanston, Illinois to my home town of Cedar Falls, Iowa. Here’s what pops up for Google (and Bing) for the first segment of the trip. Google reports the driving time as 5 hours, 1 minute.

[Figure: Directions from Evanston using the interstate]

I’m being asked to hop on 94 and hook up with 90 W. (The remaining steps are left as an exercise for the reader.) Now it turns out that if you avoid 94 and take Golf Road instead, the estimated time is 5 hours, 5 minutes (4 more minutes):

[Figure: Evanston to Iowa using Golf Road]

I don’t know for sure, but my guess is that there is more variance on the 94–90 route than on the Golf Road route because of traffic considerations. Big interstates running through big cities get jammed up. Yes, you have to contend with traffic lights on the Golf Road route, but that’s probably more predictable than I-94 traffic. So while the best or average case may be slightly better for 94–90, there is probably a larger percentage of scenarios with significant delays. If we compared the two routes over a range of plausible traffic scenarios, I bet the expected time for the Golf Road route, taken over all scenarios, would be shorter. (This is a key principle in stochastic programming: when uncertainty is involved, taking the “average solution” is not always your best move – see for example Jeff Linderoth’s lecture notes from a few years ago.)

The moral of the story: if two routes have similar average durations, give me the one with lower variance.
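
To make this concrete, here is a back-of-the-envelope simulation in Python. The travel-time models are invented for illustration – I have no actual data for either route – but the point is just that a route whose typical day looks slightly worse can still win once you average over scenarios, and with far less spread:

    import random
    import statistics

    # Hypothetical travel-time models in minutes; all numbers are made up.
    # The interstate is faster on a typical day but has a heavy right
    # tail: some fraction of the time you hit a serious big-city jam.
    def interstate_route():
        time = random.gauss(301, 10)        # ~5h 1m on a typical day
        if random.random() < 0.15:          # occasional serious backup
            time += random.uniform(20, 90)
        return time

    # Golf Road: slower on a typical day, but traffic lights are predictable.
    def golf_road_route():
        return random.gauss(305, 5)         # ~5h 5m, tight spread

    def summarize(route, n=100_000):
        times = [route() for _ in range(n)]
        return statistics.mean(times), statistics.stdev(times)

    random.seed(0)
    for name, route in [("I-94/I-90", interstate_route),
                        ("Golf Road", golf_road_route)]:
        mean, std = summarize(route)
        print(f"{name}: mean {mean:.1f} min, std dev {std:.1f} min")

On a typical day the interstate really is faster – presumably what the 5:01 estimate reflects – but averaged over these scenarios Golf Road comes out ahead, and with a fraction of the spread.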


13 thoughts on “Why don’t mapping services use robust optimization?”

  1. I’ve done a lot of portfolio optimization in my time. One of the inputs is “risk aversion”, used to calibrate the risk penalty against the return benefit. Given a predictive model for each of return and risk, one can create a variety of portfolios along the efficient frontier just by varying the risk aversion. Then you match the portfolio to the investor.

    How would you calibrate individual risk aversion into a mapping optimization? I bet I’m more risk averse than most when it comes to travelling.

    1. I love that idea. The concept of an efficient frontier carries over to a number of areas besides portfolio optimization, and this is definitely one of them. When building a service for mass consumption there is always a tradeoff between additional features and simplicity. I would imagine that if a mapping service chose to take this on, it would simply end up as an additional “dropdown” option. Even something like a slider (to manage the tradeoff between risk and return) may be deemed too complicated.
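
      For what it’s worth, here is a minimal sketch of how such a slider might score routes, borrowing the mean-risk objective directly from portfolio optimization. The travel-time samples are invented; a real service would use historical observations:

          import statistics

          # Invented travel-time samples in minutes for each candidate route.
          routes = {
              "I-94/I-90": [288, 290, 293, 295, 298, 360],
              "Golf Road": [303, 305, 306, 307, 308, 310],
          }

          def best_route(risk_aversion):
              # Minimize mean + risk_aversion * std dev, the classic
              # mean-risk objective from portfolio optimization.
              def score(times):
                  return statistics.mean(times) + risk_aversion * statistics.stdev(times)
              return min(routes, key=lambda name: score(routes[name]))

          for lam in [0.0, 0.5, 1.0, 2.0]:    # slider positions
              print(f"risk aversion {lam}: {best_route(lam)}")

      With these numbers a risk-neutral traveler (slider at zero) gets the interstate; anyone even mildly risk-averse gets Golf Road. Sweeping the slider traces out the efficient frontier you describe.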

  2. Thanks for the shout out! It’s what Sam Savage calls the Flaw of Averages. http://flawofaverages.com/ Did you hear the one about the statistician who drowned crossing a river that was only three feet deep on average?

    And given my experience driving between Madison and Evanston, I would propose a possibly “more optimal” solution — Take the Tri-State down to 90.

    1. Anytime 😉 Yes! The Flaw of Averages is a great book – entertainingly written and insightful.

      As far as the trip goes, I actually have a multiobjective problem since I like to avoid big interstates and toll roads as much as I can. Another simple thing that might help in this regard would be the ability to click on a step in the directions and say “don’t take this road”.

  3. Google uses current traffic conditions to optimize its route, so it shouldn’t have so much variance. My android phone has told me to take some crazy routes that I thought were bugs, only to find out it was saving me from getting stuck in serious traffic! Of course, you have to either have an android phone or print your map immediately before leaving to get the benefit of this.

    1. There are multiple issues at play here – my post is only addressing one of them. Yes, mapping services can and do account for traffic. Whether or not traffic is accounted for, the next question is whether the mapping service is simply using the average time for the route given current conditions, or whether it is considering a range of possible outcomes.

      My point restated is that if there are two routes given current conditions whose average durations are pretty close to each other, I want the one with a narrower range of possible durations.

      Two short notes related to incorporating traffic: 1. In defense of Microsoft, Bing has this too (click “view route based on traffic”). 2. In my scenario I am planning a trip for this weekend, so I’d want to plan based on expected traffic on Friday.

  4. Two thoughts … First, I would be surprised if the mapping services have the data necessary to estimate variances. Second, I’m less concerned about variance than about specific, predictable phenomena (rush hour traffic jams, trains at crossings, sports events …).

    1. 1) If they don’t now, they soon will with the proliferation of mobile devices. Big data!

      2) Yeah, it would be great if not only current traffic conditions could be accounted for, but also estimated conditions in the future. In that case the variance should be reduced, and what I was talking about is less of a concern. But I would still like to express my intent that I want a route where I am less likely to get screwed (especially since traffic congestion is a really hard thing to model, so filtering the time series for predictable phenomena still leaves a lot of room for unwelcome surprises).

      1. To avoid getting “screwed”, you might be better off with estimates of key fractiles of the travel time distribution rather than the range or variance. I don’t think the distribution is symmetric — physics and traffic laws impose a fairly tight lower bound, but the upper bound is, well, scary.
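
        Concretely, comparing 90th percentiles instead of variances might look like this (invented, deliberately skewed samples; statistics.quantiles needs Python 3.8+):

            import statistics

            # Invented samples in minutes: a tight lower bound, but a
            # long upper tail on the interstate route.
            samples = {
                "I-94/I-90": [288, 290, 293, 295, 298, 305, 320, 365],
                "Golf Road": [303, 304, 305, 306, 307, 308, 310, 313],
            }

            for name, times in samples.items():
                # quantiles(..., n=10) returns the nine deciles;
                # index 8 is the 90th percentile.
                p90 = statistics.quantiles(times, n=10)[8]
                print(f"{name}: median {statistics.median(times):.1f} min,"
                      f" 90th percentile {p90:.1f} min")

        The interstate wins on the median but loses badly at the 90th percentile, which is exactly the asymmetry I have in mind.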

  5. I’ve recently been using INRIX (http://www.inrix.com/) to get improved traffic-aware mapping for my Houston-Austin commutes. When you ask them to calculate a route, you give them not just your origin and destination but also what time you’re leaving, and the route suggestions do indeed change based on that. I’ll bet they have enough data to do crude robust optimization.

  6. Given the objective, I agree. Yet why the hurry? Unless it’s a time-critical trip, I would pick a more scenic road and find a nice diner. Instead of getting obsessed with saving time (for what, more time reading papers?), I would rather spend it pleasantly. I would appreciate a map service offering such route alternatives.
