1. *Fibonacci Sequence*: The classic example where the naive recursive solution has exponential complexity. By storing previously computed values (memoization), the complexity can be reduced to linear.
2. *Coin Change Problem*: Given different denominations of coins and a total amount, finding the number of ways to make the change. The naive approach is exponential, but dynamic programming reduces it to polynomial complexity.
3. *Knapsack Problem*: Particularly the 0/1 Knapsack problem, where items with given weights and values must be placed in a knapsack of a fixed capacity to maximize total value. The naive exponential solution can be optimized using dynamic programming.
4. *Matrix Chain Multiplication*: Determining the most efficient way to multiply a chain of matrices. The problem can be solved in exponential time using a naive approach but becomes much more efficient with dynamic programming.
5. *Longest Common Subsequence*: Finding the longest subsequence common to two sequences. A classic dynamic programming problem that can be solved in polynomial time.
6. *Longest Increasing Subsequence*: Finding the length of the longest subsequence of a given sequence such that all elements of the subsequence are sorted in increasing order.
7. *Shortest Path Problems*: Like the Floyd-Warshall algorithm for finding all-pairs shortest paths in a weighted graph with positive or negative edge weights (as long as there are no negative cycles).
8. *Edit Distance (Levenshtein Distance)*: Finding the minimum number of edits (insertions, deletions, substitutions) needed to change one word into another.
9. *Longest Path in DAGs*: Finding a longest path in a directed acyclic graph.
10. *Weighted Independent Set on a Path*: Given an array of integers, compute the maximum sum of elements subject to the constraint that you may not take two consecutive cells.
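For item 1, a minimal memoized sketch of the exponential-to-linear speedup:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # memoization: each fib(k) is computed once
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

Without the cache this recursion revisits the same subproblems exponentially often; with it, each of the n distinct subproblems is solved once.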
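For item 2 (counting the ways to make change), one common bottom-up sketch iterates coins in the outer loop so each combination is counted once regardless of order:

```python
def coin_change_ways(coins, amount):
    """Number of distinct coin combinations summing to `amount`."""
    ways = [1] + [0] * amount  # ways[0] = 1: one way to make zero
    for c in coins:            # outer loop over coins avoids double-counting orderings
        for a in range(c, amount + 1):
            ways[a] += ways[a - c]
    return ways[amount]
```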
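Item 10 is a nice one because the DP state collapses to two running values (best sum taking the current cell vs. skipping it), so it runs in O(n) time and O(1) space:

```python
def max_nonadjacent_sum(nums):
    """Maximum sum of elements with no two chosen cells adjacent."""
    take, skip = 0, 0  # best sums ending with/without the previous cell taken
    for x in nums:
        take, skip = skip + x, max(take, skip)
    return max(take, skip)
```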
I am a big fan of the Knight Dialer. I wrote an article on it and how you can actually use graph theory to reduce it to an incredibly efficient 4x4 matrix. It was super fun.
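A sketch of that symmetry reduction (the class names here are mine, not necessarily the article's): by the keypad's reflective symmetries the digits collapse into four classes, A = {1,3,7,9}, B = {2,8}, C = {4,6}, D = {0}, with 5 a dead end. The per-class recurrence is a 4x4 transition matrix, which repeated squaring raises to the (n-1)th power in O(log n) matrix multiplications:

```python
def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_pow(M, p):
    R = [[int(i == j) for j in range(4)] for i in range(4)]  # identity
    while p:
        if p & 1:
            R = mat_mul(R, M)
        M = mat_mul(M, M)
        p >>= 1
    return R

def knight_dialer(n):
    """Count distinct n-digit numbers a chess knight can dial."""
    if n == 1:
        return 10
    # Rows/cols ordered A, B, C, D; M[i][j] = knight moves from class i into class j.
    M = [[0, 1, 1, 0],   # from 1: -> 8 (B), -> 6 (C)
         [2, 0, 0, 0],   # from 2: -> 7, 9 (both A)
         [2, 0, 0, 1],   # from 4: -> 3, 9 (A), -> 0 (D)
         [0, 0, 2, 0]]   # from 0: -> 4, 6 (both C)
    P = mat_pow(M, n - 1)
    # Sequences of length n starting in class i = row sum of P;
    # weight by class sizes (|A|, |B|, |C|, |D|) = (4, 2, 2, 1).
    sizes = [4, 2, 2, 1]
    return sum(sizes[i] * sum(P[i]) for i in range(4))
```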