Thursday, December 23, 2010

Shortest Slowest path

A few weeks ago, I was presented with a question: "You are given an n × n matrix. Each cell contains a non-negative integer that denotes the cost of landing on that cell. Starting from the top-left corner, compute the cost of the least-cost path to the bottom-right corner, i.e. the path that minimizes the sum of all costs incurred during the traversal. You are allowed to move only down or right."

Of course, this is just a slight variation of the shortest path problem, but I was surprised at the first solution that came to mind. I was trying really hard not to use Dijkstra's shortest path algorithm to solve this problem. I ended up linearizing all the n² cells into n² vertices of a graph and applying the Floyd-Warshall algorithm on the resulting adjacency matrix (of side n²). Hence, the total complexity shot up to a whopping O(n⁶) (n being the side of the original matrix). Had I simply run Dijkstra's algorithm on the grid itself, I could have solved it with a runtime complexity of O(n² log n).
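
For comparison, here is a minimal sketch of that Dijkstra-on-the-grid approach. This is illustrative code written for this post after the fact, not the solution I originally produced; the function name min_cost_path is mine:

    import heapq

    def min_cost_path(matrix):
        """Dijkstra's algorithm run directly on the grid.

        Each cell is a vertex; from (r, c) you may move to (r, c + 1)
        or (r + 1, c). Runs in O(n^2 log n) for an n x n matrix.
        """
        n = len(matrix)
        INF = float('inf')
        dist = [[INF] * n for _ in range(n)]
        dist[0][0] = matrix[0][0]
        heap = [(matrix[0][0], 0, 0)]              # (cost so far, row, col)
        while heap:
            cost, r, c = heapq.heappop(heap)
            if (r, c) == (n - 1, n - 1):
                return cost
            if cost > dist[r][c]:                  # stale heap entry
                continue
            for nr, nc in ((r, c + 1), (r + 1, c)):    # right, down
                if nr < n and nc < n:
                    new_cost = cost + matrix[nr][nc]
                    if new_cost < dist[nr][nc]:
                        dist[nr][nc] = new_cost
                        heapq.heappush(heap, (new_cost, nr, nc))
        return dist[n - 1][n - 1]

On a toy input like [[1, 3, 1], [1, 5, 1], [4, 2, 1]], min_cost_path returns 7 (the path 1 → 3 → 1 → 1 → 1).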

However, this was a very interesting exercise in functional programming since I was able to piece together a solution by chaining solutions to existing problems. The composition looks something like this:
Solution(matrix) = FloydWarshall(Linearize(matrix))
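
To make that composition concrete, here is a rough sketch of what the two pieces might look like. Again, this is code written for this post to illustrate the idea, not what I originally wrote, and the helper names are mine:

    def linearize(matrix):
        """Map cell (r, c) of the n x n grid to vertex r * n + c and build
        the n^2 x n^2 edge-weight matrix of the resulting graph. An edge's
        weight is the cost of landing on the destination cell."""
        n = len(matrix)
        INF = float('inf')
        size = n * n
        dist = [[INF] * size for _ in range(size)]
        for v in range(size):
            dist[v][v] = 0
        for r in range(n):
            for c in range(n):
                u = r * n + c
                if c + 1 < n:                      # edge for moving right
                    dist[u][u + 1] = matrix[r][c + 1]
                if r + 1 < n:                      # edge for moving down
                    dist[u][u + n] = matrix[r + 1][c]
        return dist

    def floyd_warshall(dist):
        """All-pairs shortest paths in place: O(V^3), i.e. O(n^6) here."""
        size = len(dist)
        for k in range(size):
            for i in range(size):
                for j in range(size):
                    if dist[i][k] + dist[k][j] < dist[i][j]:
                        dist[i][j] = dist[i][k] + dist[k][j]
        return dist

    def solution(matrix):
        n = len(matrix)
        # The start cell's cost is paid on landing there, plus the shortest
        # path from vertex 0 (top-left) to vertex n*n - 1 (bottom-right).
        return matrix[0][0] + floyd_warshall(linearize(matrix))[0][n * n - 1]

The n² × n² adjacency matrix is exactly what makes the O(n⁶) so painful: every vertex has at most two outgoing edges, so almost all of that matrix is infinity.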

1 comment:

Tarik's Blog said...

Have you considered using dynamic programming?