CompSust09 Impressions, Part I
I just attended Computational Sustainability 2009 (CompSust09), an interesting conference about directing computational methods toward the pursuit of sustainability. Presentations covered a number of computational approaches (such as machine learning and simulation) and a range of sustainability issues (such as energy and atmospheric modeling). But the dominant topic was species modeling; considerably more talks and panels covered it than anything else. And among several of these, the big buzzwords were “POMDP” (partially observable Markov decision process) and the “curse of dimensionality”. Ecologists face a large degree of uncertainty in both their observations and their model structure. The huge numbers of states in their systems, combined with that uncertainty, make it difficult to get very far when the models are played out over time. So, as I understood it, these ecologists were looking to their computer scientist colleagues for help in making the model-solving more tractable. It could be a good opportunity for operations researchers as well. I think OR can help not by finding solution methods for these huge models, which may not make all that much sense to do, but by focusing on which variables/states/uncertainties matter most, simplifying the models, and developing heuristics that come up with good solutions quickly and can adapt dynamically to new data. Stay tuned for a possible guest post with more on this soon.
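To make the POMDP talk concrete: the heart of a POMDP is tracking a belief (a probability distribution over hidden states) as observations arrive. Here is a minimal sketch for a made-up two-state “species absent/present” model; all the probabilities are invented for illustration, not from any talk. The curse of dimensionality shows up once you track, say, k habitat patches jointly: the belief then lives over 2^k states.

```python
# Minimal POMDP belief update for a hypothetical 2-state species model.
# States: 0 = species absent, 1 = species present. Numbers are illustrative.

def belief_update(belief, transition, observation_prob, obs):
    """b'(s') is proportional to O(obs | s') * sum_s T(s -> s') * b(s)."""
    n = len(belief)
    new_belief = []
    for s2 in range(n):
        predicted = sum(transition[s][s2] * belief[s] for s in range(n))
        new_belief.append(observation_prob[s2][obs] * predicted)
    total = sum(new_belief)
    return [b / total for b in new_belief]

# Transition model: a present population persists with prob 0.9;
# an empty patch is recolonized with prob 0.1 (made-up numbers).
T = [[0.9, 0.1],
     [0.1, 0.9]]
# Observation model: rows are states, columns are (no detection, detection).
# Surveys detect a present species with prob 0.7; false positives at 0.05.
O = [[0.95, 0.05],
     [0.30, 0.70]]

b = [0.5, 0.5]                      # uniform prior belief
b = belief_update(b, T, O, obs=1)   # a survey detects the species
print(b)                            # belief shifts strongly toward "present"
```

With k patches tracked jointly, `belief` would have 2^k entries and `T` would be 2^k by 2^k, which is exactly the explosion the ecologists were complaining about.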
OR came up in talks a number of times. Warren Powell of Princeton’s OR & Financial Engineering group spoke about approximate dynamic programming (ADP) applied to modeling energy systems (including renewables) under uncertainty. He discussed how ADP avoids the so-called “curse of dimensionality” associated with classical dynamic programming. So it seems like this might be of use to some of the species modelers mentioned above. Go to the CASTLE Laboratory site and scroll down to energy for more info.
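The core ADP idea, as I understand it: instead of a backward sweep over every possible state, step forward along simulated sample paths and update value estimates only at the states you actually visit. Powell’s real work uses much more machinery (post-decision states, value function approximations); below is only a toy sketch of the forward-pass idea on an invented battery-storage problem, with all numbers made up.

```python
import random

# Toy ADP sketch: a battery of capacity C buys/sells energy at a random
# price. Rather than a full backward DP sweep over all (state, price)
# combinations, we simulate forward sample paths and smooth value
# estimates at visited states -- the trick that sidesteps enumerating
# the whole state space. All parameters are invented for illustration.

random.seed(0)
C = 5                      # battery capacity (discrete charge levels 0..C)
ACTIONS = (-1, 0, 1)       # discharge, hold, charge
V = [0.0] * (C + 1)        # value-to-go estimate per charge level
alpha = 0.1                # stochastic-approximation step size

def sample_price():
    return random.choice([1.0, 2.0, 3.0])   # toy exogenous price process

for _ in range(2000):
    s = random.randint(0, C)        # start each sample path at a random state
    for _ in range(20):             # one forward pass along a sample path
        p = sample_price()
        best_q, best_s2 = None, s
        for a in ACTIONS:
            s2 = s + a
            if not 0 <= s2 <= C:
                continue
            q = -a * p + 0.95 * V[s2]   # selling (a=-1) earns p; buying costs p
            if best_q is None or q > best_q:
                best_q, best_s2 = q, s2
        V[s] = (1 - alpha) * V[s] + alpha * best_q   # smoothed value update
        s = best_s2

print([round(v, 1) for v in V])   # stored charge should be worth more
```

The learned values end up increasing in the charge level, reflecting that energy bought cheap can later be sold dear, without ever enumerating the full state-price space in a backward pass.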
David Shmoys of Cornell’s OR group linked a telecom network design problem to habitat protection. In the survivable network design problem, each pair of nodes must have a specified number (rij) of edge-disjoint paths between them. (If rij = 1 for all pairs of nodes, we have the minimum spanning tree problem.) By Menger’s theorem, the number of edge-disjoint paths between two nodes equals the max flow between them with unit edge capacities, so the requirement can be viewed through a max flow lens. In the habitat protection problem, the goal is to acquire regions over time for the protection of a species; he used the example of a woodpecker. Nodes represent the set of regions over a sequence of time steps (e.g. 3 regions over 4 time steps would require 12 nodes). Edges represent woodpeckers persisting in a region or dispersing to another region, and are directed forward in time. As in the telecom problem, the goal here can be viewed as maximizing flow across the network. The model turns out to be stochastic because it involves probabilities of woodpeckers from a region occupying the other regions in consecutive time steps (thanks, Adam E, for the clarification below). Shmoys coupled simulation with an LP-based heuristic to find quality solutions in a reasonable amount of time for the problems being considered.
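To illustrate the time-expanded network idea (this is my own reconstruction, not Shmoys’s actual model, and the capacities are invented): nodes are (region, time step) pairs, edges point forward in time for birds persisting or dispersing, and max flow measures how much “population” can be carried from the first time step to the last.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp max flow on a dict-of-dicts capacity graph."""
    flow = 0
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual.get(u, {}).items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Collect the path, find its bottleneck, and push flow along it.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= push
            residual.setdefault(v, {})[u] = residual.get(v, {}).get(u, 0) + push
        flow += push

# Hypothetical instance: 3 regions over 3 time steps; region 2 starts empty.
regions, steps = [0, 1, 2], 3
adjacent = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}   # persistence + dispersal
cap = {}
for t in range(steps - 1):
    for r in regions:
        cap.setdefault((r, t), {})
        for r2 in adjacent[r]:
            # Invented capacities: persisting in place carries more than dispersing.
            cap[(r, t)][(r2, t + 1)] = 2 if r2 == r else 1
# Super source feeds the initially occupied regions; sink drains the last step.
cap["s"] = {(0, 0): 3, (1, 0): 3}
for r in regions:
    cap.setdefault((r, steps - 1), {})["t"] = 10
print(max_flow(cap, "s", "t"))
```

In the deterministic version above the answer is a single number; the stochastic version Shmoys described would replace these fixed capacities with probabilistic dispersal, which is where the simulation plus LP-based heuristic comes in.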
There was more going on at the conference worth mentioning (maximum entropy modeling, species reserve knapsack problem) that I will try to get to next time…