The structure of an algorithm often directly tells us its time complexity, but sometimes a straightforward analysis does not give a true picture of the efficiency. Amortized analysis can be used to analyze a sequence of operations whose time complexity varies. The idea is to estimate the total time used in all such operations during the algorithm, instead of focusing on individual operations.
Two Pointers Method:
In the two pointers method, two pointers walk through an array. Both pointers move in one direction only, which ensures that the algorithm works efficiently. As a first example of how to apply the technique, consider a problem where we are given an array of n positive integers and a target sum x, and we want to find a subarray whose sum is x or report that there is no such subarray.
The problem can be solved in O(n) time by using the two pointers method. The idea is to maintain pointers that point to the first and last value of a subarray. On each turn, the left pointer moves one step to the right, and the right pointer moves to the right as long as the resulting subarray sum is at most x. If the sum becomes exactly x, a solution has been found.
For example, consider how the algorithm processes an array when the target sum is x = 8. The initial subarray contains the values 1, 3, and 2, whose sum is 6. Then, the left pointer moves one step right, and the right pointer does not move, because otherwise the sum would exceed x. Finally, the left pointer moves one step right, and the right pointer moves two steps right. The sum of the subarray is 2 + 5 + 1 = 8, so the desired subarray has been found.
The running time of the algorithm depends on the number of steps the right pointer moves. While there is no useful upper bound on how many steps the pointer can move on a single turn, we know that the pointer moves a total of O(n) steps during the algorithm, because it only moves to the right. Since both the left and right pointers move O(n) steps, the algorithm works in O(n) time.
2SUM Problem:
Another problem that can be solved using the two pointers method is the 2SUM problem: given an array of n numbers and a target sum x, find two array values such that their sum is x, or report that no such values exist.
To solve the problem, we first sort the array values in increasing order. After that, we iterate through the array using two pointers. The left pointer starts at the first value and moves one step to the right on each turn. The right pointer starts at the last value and always moves to the left until the sum of the left and right values is at most x. If the sum is exactly x, a solution has been found.
For example, Fig. 8.4 shows how the algorithm processes an array when the target sum is x = 12. In the initial position, the sum of the values is 1 + 10 = 11, which is smaller than x. Then the left pointer moves one step right, and the right pointer moves three steps left, and the sum becomes 4 + 7 = 11. After this, the left pointer moves one step right again. The right pointer does not move, and a solution 5 + 7 = 12 has been found.
The running time of the algorithm is O(n log n), because it first sorts the array in O(n log n) time, and then both pointers move O(n) steps.
Note that it is also possible to solve the problem in another way in O(n log n) time using binary search. In such a solution, we first sort the array and then iterate through the array values and for each value binary search for another value that yields the sum x. In fact, many problems that can be solved using the two pointers method can also be solved using sorting or set structures, sometimes with an additional logarithmic factor.
The more general kSUM problem is also interesting. In this problem we have to find k elements such that their sum is x. It turns out that we can solve the 3SUM problem in O(n²) time by extending the above 2SUM algorithm. Can you see how we can do it? For a long time, it was actually thought that O(n²) would be the best possible time complexity for the 3SUM problem. However, in 2014, Grønlund and Pettie showed that this is not the case.
Nearest Smaller Elements:
Amortized analysis is often used to estimate the number of operations performed on a data structure. The operations may be distributed unevenly so that most operations occur during a certain phase of the algorithm, but the total number of operations is limited.
As an example, suppose that we want to find for each array element the nearest smaller element, i.e., the first smaller element that precedes the element in the array.
It is possible that no such element exists, in which case the algorithm should report this. Next we will efficiently solve the problem using a stack structure.
We go through the array from left to right and maintain a stack of array elements. At each array position, we remove elements from the stack until the top element is smaller than the current element, or the stack is empty. Then, we report that the top element is the nearest smaller element of the current element, or if the stack is empty, there is no such element. Finally, we add the current element to the stack.
For example, consider how the algorithm processes an array. First, the element 1 is added to the stack. Since it is the first element in the array, it clearly does not have a nearest smaller element. After this, the elements 3 and 4 are added to the stack. The nearest smaller element of 4 is 3, and the nearest smaller element of 3 is 1.
Then, the next element 2 is smaller than the two top elements in the stack, so the elements 3 and 4 are removed from the stack. Thus, the nearest smaller element of 2 is 1. After this, the element 2 is added to the stack. The algorithm continues like this, until the entire array has been processed.
This procedure finds the nearest smaller elements in linear time using a stack.
The efficiency of the algorithm depends on the total number of stack operations. If the current element is larger than the top element in the stack, it is directly added to the stack, which is efficient. However, sometimes the stack can contain several larger elements and it takes time to remove them. Still, each element is added exactly once to the stack and removed at most once from the stack. Thus, each element causes O(1) stack operations, and the algorithm works in O(n) time.
Sliding Window Minimum:
A sliding window is a constant-size subarray that moves from left to right through an array. At each window position, we want to calculate some information about the elements inside the window. Next we will focus on the problem of maintaining the sliding window minimum, which means that we want to report the smallest value inside each window.
The sliding window minima can be calculated using an idea similar to the one we used to calculate the nearest smaller elements. This time we maintain a queue where each element is larger than the previous element, and the first element always corresponds to the minimum element inside the window. After each window move, we remove elements from the end of the queue until the last queue element is smaller than the new window element, or the queue becomes empty. We also remove the first queue element if it is not inside the window anymore. Finally, we add the new window element to the end of the queue.
For example, consider how the algorithm processes an array when the sliding window size is 4. At the first window position, the smallest value is 1. Then the window moves one step right. The new element 3 is smaller than the elements 4 and 5 in the queue, so the elements 4 and 5 are removed from the queue and the element 3 is added to the queue. The smallest value is still 1. After this, the window moves again, and the smallest element 1 does not belong to the window anymore. Thus, it is removed from the queue, and the smallest value is now 3. Also the new element 4 is added to the queue. The next new element 1 is smaller than all elements in the queue, so all elements are removed from the queue, and it only contains the element 1. Finally, the window reaches its last position. The element 2 is added to the queue, but the smallest value inside the window is still 1.
Since each array element is added to the queue exactly once and removed from the queue at most once, the algorithm works in O(n) time.