> Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity.
What I described is the same thing in layman's terms. "Worst case" is the colloquial word for an upper bound, and in that example n was approaching 1000.
If you want to be a purist, the only fault I see in my definition is that instead of using a generic function I assumed a linear one, but that was deliberate, to explain the colloquial use.
It's just an asymptotic upper bound (though often used to imply tight bounds).
It's most commonly applied to worst-case running time, but it's often applied to expected running time ("hash table insertions run in O(1)"), space complexity, communication overhead, numerical accuracy, and any number of other metrics.
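To make "asymptotic upper bound" concrete: f is O(g) if there exist constants c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0. Here's a minimal Python spot-check of that definition for the linear function discussed above (the witness constants c = 4 and n0 = 5 are my own choices for illustration):

```python
def is_bounded(f, g, c, n0, upto=10_000):
    """Spot-check the big-O condition f(n) <= c * g(n) for n0 <= n < upto.

    A finite check, not a proof, but it shows what the definition asks for.
    """
    return all(f(n) <= c * g(n) for n in range(n0, upto))

f = lambda n: 3 * n + 5   # a concrete linear function
g = lambda n: n           # the claim: f is O(n)

# 3n + 5 <= 4n holds exactly when n >= 5, so c=4, n0=5 are valid witnesses.
print(is_bounded(f, g, c=4, n0=5))  # True
```

Note that the definition says nothing about *which* quantity f measures, which is why the same notation works for running time, space, or any other metric.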