We will look at the single-source shortest path problem using the Bellman-Ford algorithm, but first let's understand the concept of dynamic programming.
Dynamic programming, in computer science, refers to a method of solving a problem by breaking it into sub-problems, solving those sub-problems recursively, and combining their solutions into a solution to the original problem.
There are two attributes a problem should have in order for dynamic programming to solve it correctly:
- Optimal substructure: a solution to the problem can be built by combining the solutions to its sub-problems. For example, to get to the 10th floor of a building, you have to find a way to the first floor, then the second floor, then the third, all the way up to the 10th; you reach the 10th floor by climbing the floors below it, assuming you are not a superhero.
- Overlapping sub-problems: the same sub-problems recur, so each one can be solved once and its solution reused. In the example above, the sub-problems are the floors of the building, and each one is solved the same way: by climbing it.
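A classic illustration of both attributes (my example, not part of the text above) is the Fibonacci sequence. A minimal sketch of the naive recursion, with the two attributes pointed out in comments:

```python
def fib(n):
    # Optimal substructure: fib(n) is built by combining the
    # solutions to its two sub-problems, fib(n - 1) and fib(n - 2).
    if n < 2:
        return n
    # Overlapping sub-problems: fib(n - 1) and fib(n - 2) both end up
    # recomputing the same smaller calls (fib(n - 3), fib(n - 4), ...),
    # which is exactly what dynamic programming avoids.
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```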
Steps of Dynamic Programming
- Define sub-problems
- Guess part of the solution
- Relate sub-problems to solution
- Store solutions to sub-problems
- Solve original problem
Let's talk about the fourth step: storing solutions to sub-problems.
There are two ways to store the solutions to sub-problems:
- Build a table, i.e. the bottom-up approach
- Memoization, i.e. the top-down approach
Memoization simply means storing the results of expensive function calls and returning the cached result when the same call occurs again, in order to optimize and speed up programs. I.e. when we first encounter a particular sub-problem, we solve it and store the solution in a data structure; when we encounter the same sub-problem again, we return the stored solution instead of recomputing it.
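Memoization can be sketched with the Fibonacci example (a dictionary standing in for the "data structure" above):

```python
def fib_memo(n, memo=None):
    # memo maps a sub-problem (n) to its stored solution.
    if memo is None:
        memo = {}
    if n in memo:
        return memo[n]  # same sub-problem seen before: return the cached result
    if n < 2:
        return n
    # First encounter: solve the sub-problem and store its solution.
    memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

print(fib_memo(50))  # 12586269025
```

Without the memo, `fib(50)` would take billions of recursive calls; with it, each sub-problem is solved exactly once.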
Memoization only works when the sub-problem dependencies form a directed acyclic graph (DAG): if a sub-problem ultimately depends on itself, it would be called again before its solution is stored in the memo, and the recursion would never terminate.
The bottom-up approach means we start solving from the lowest level of sub-problems and accumulate solutions until we reach the top, i.e. the original problem. After dividing the problem into sub-problems, and those sub-problems into smaller ones, we start solving from the smallest sub-problem, reusing solutions to sub-problems to solve larger ones. Because solving starts from the base case, i.e. the lowest level of sub-problems, the order in which the sub-problems are solved must be determined ahead of time.
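The bottom-up approach for the same Fibonacci example can be sketched as follows; note that the loop order is fixed ahead of time, so every sub-problem is solved before it is needed:

```python
def fib_table(n):
    # Build a table from the base cases upward (bottom-up).
    table = [0] * (n + 1)
    if n > 0:
        table[1] = 1  # base cases: table[0] = 0, table[1] = 1
    for i in range(2, n + 1):
        # Each entry reuses the two already-solved sub-problems below it.
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_table(50))  # 12586269025
```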
The Bellman-Ford algorithm follows the principles of dynamic programming to solve the single-source shortest path problem. In fact, Richard Bellman developed the dynamic programming concept.
Given a graph with "n" nodes, the Bellman-Ford algorithm relaxes all the edges "n - 1" times. We relax the edges "one less than the number of nodes in the graph" times because a shortest path can contain at most "n - 1" edges, and each full pass over the edges guarantees that shortest paths using one more edge have been found. We start at our source and relax the edges leading to neighbouring nodes, then we pick those neighbouring nodes and do the same for them. After relaxing the edges "n - 1" times, we recover the shortest path to any node by following the predecessor pointers back to the source.
Relaxing the edges for “n - 1” times – overlapping sub-problems.
If implemented correctly, the sub-paths of a shortest path are themselves shortest paths, hence we follow the shortest sub-paths to get the shortest overall path to our destination – optimal substructure.
```
function BellmanFord(list vertices, list edges, vertex source) is
    distance := list of size n
    predecessor := list of size n

    // Step 1: initialize graph
    for each vertex v in vertices do
        distance[v] := infinity    // set distance of all vertices to infinity
        predecessor[v] := null     // and a null predecessor
    distance[source] := 0          // set distance of source to 0

    // Step 2: relax edges repeatedly
    repeat |V| - 1 times:
        for each edge (u, v) with weight w in edges do
            if distance[u] + w < distance[v] then
                distance[v] := distance[u] + w
                predecessor[v] := u

    // Step 3: check for negative-weight cycles
    for each edge (u, v) with weight w in edges do
        if distance[u] + w < distance[v] then
            error "Graph contains a negative-weight cycle"

    return distance, predecessor
```
Although Bellman-Ford has an advantage over Dijkstra's algorithm in handling graphs with negative edge weights, Bellman-Ford fails to produce shortest paths if the graph contains a negative-weight cycle, i.e. a cycle whose total edge weight is negative. However, it can detect and report that such a cycle exists.