But "amortized" is irrelevant in the context of quicksort, as it is not an amortized algorithm. You may hit an unlucky streak and get the worst case of O(n^2) behaviour on every call. An amortized algorithm on the other had would guarantee that while each individual operation on a data structure may hit the worst case, it will always (and not just in the average case) average out over many operations on the same data structure.
Doh, screw-up on my part. You are completely right. An amortized algorithm would be something like a dynamic array (std::vector or ArrayList) in this case, whose backing storage is doubled every time it hits capacity: a single append can cost O(n) when it triggers a resize, but any sequence of n appends does O(n) work in total, so each append is amortized O(1).
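For what it's worth, here's a minimal sketch of that doubling scheme (a hand-rolled IntVec standing in for std::vector, ints only, no exception safety, just to make the amortized argument concrete):

    #include <cstddef>

    // Minimal dynamic array of ints with capacity doubling.
    // A push_back that triggers a resize copies all size_ elements (O(n)),
    // but doubling means each element is copied at most a constant number
    // of times across all resizes, so n pushes cost O(n) total:
    // amortized O(1) per push.
    class IntVec {
        int*        data_ = nullptr;
        std::size_t size_ = 0;
        std::size_t cap_  = 0;

    public:
        ~IntVec() { delete[] data_; }

        void push_back(int value) {
            if (size_ == cap_) {
                // Grow geometrically (start at 1, then double).
                std::size_t new_cap = cap_ == 0 ? 1 : cap_ * 2;
                int* new_data = new int[new_cap];
                for (std::size_t i = 0; i < size_; ++i)
                    new_data[i] = data_[i];   // the O(n) copy, paid rarely
                delete[] data_;
                data_ = new_data;
                cap_  = new_cap;
            }
            data_[size_++] = value;  // the common O(1) path
        }

        std::size_t size() const { return size_; }
        int operator[](std::size_t i) const { return data_[i]; }
    };

The geometric growth is the whole trick: if you instead grew the capacity by a fixed constant each time, n pushes would trigger O(n) resizes of O(n) cost each, and the amortized guarantee would collapse to O(n) per push.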