
>In this case, "the Lipschitz property" means that the gradient of the distance value is bounded

This is total nonsense. The point of Lipschitz continuity is that it is more than continuity and less than differentiability. If you assert that it is differentiable the concept loses all meaning. It is specifically interesting because you do not have to assert differentiability.



(OP here)

This is taken directly from the paper's introduction, which admittedly uses the more specific terminology of "1-Lipschitz signed distance bounds".

The paper cites the original Hart '96 paper on sphere tracing; quoth Hart, "a function is Lipschitz if and only if the magnitude of its derivative remains bounded".

https://graphics.stanford.edu/courses/cs348b-20-spring-conte...

I wonder if there's a terminology schism here between computer graphics and numerical analysis folks.
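For anyone who hasn't seen the algorithm, here is roughly why the bound matters, as a minimal sphere-tracing sketch (my own Python illustration, not code from the paper): a true SDF never overestimates the distance to the nearest surface, so along any ray you can always step forward by the current distance value without overshooting.

  # Minimal sphere tracer for a single sphere SDF (illustration only).
  import math

  def sdf_sphere(p, center=(0.0, 0.0, 5.0), radius=1.0):
      return math.dist(p, center) - radius      # a true SDF, hence 1-Lipschitz

  def sphere_trace(origin, direction, sdf, max_steps=128, eps=1e-4):
      t = 0.0
      for _ in range(max_steps):
          p = tuple(o + t * d for o, d in zip(origin, direction))
          d = sdf(p)
          if d < eps:
              return t      # close enough: report a hit at distance t
          t += d            # safe step: the surface is at least d away
      return None           # no hit within max_steps

  print(sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sdf_sphere))   # ~4.0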


The concept of a Lipschitz function comes from mathematical analysis; neither computer graphics nor numerical analysis. It's straightforward to find the definition of a Lipschitz function online, and it is not in terms of its derivative. If a function is differentiable, then your quote applies; but again, it isn't the definition of a Lipschitz function.

I'd say this is a little pedantic, save for the fact that your function of interest (an SDF) isn't a differentiable function! It has a big, crucially important subset of points (the caustic sets) where it fails to be differentiable.


>I wonder if there's a terminology schism here between computer graphics and numerical analysis folks.

The first group just pretends every function has a derivative (even when it clearly does not), the other doesn't.

The linked Wikipedia article gets it exactly right; I do not know why you would link to something which straight up says your definition is incorrect.

There is no point in talking about Lipschitz continuity when assuming that there is a derivative; you assume that it is Lipschitz because it is a weaker assumption. The key reason Lipschitz continuity is interesting is that it allows you to talk about functions without a derivative almost as if they had one. That is the actual thing which makes any of this work.


There are functions that are differentiable, but not Lipschitz, e.g. sin(1/x) on (0, ∞). In this context the Lipschitz property allows you to do things that mere differentiability doesn't, and, in the context of assuming that your function is differentiable, the definition given for the Lipschitz property is correct. It's true that there are also functions that are Lipschitz-continuous but not differentiable, but in this context they aren't interesting.


> It's true that there are also functions that are Lipschitz-continuous but not differentiable, but in this context they aren't interesting.

They absolutely are, even in the article. E.g. the max function or the 1-norm are non-differentiable, but Lipschitz continuous.

There is literally no point in talking about Lipschitz continuity for differentiable functions, because it is equivalent to the derivative being bounded.
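A concrete instance, as a quick Python check (just for illustration): the absolute value is 1-Lipschitz everywhere by the reverse triangle inequality, yet its one-sided difference quotients at 0 disagree, so it has no derivative there.

  import random

  f = abs                                # 1-Lipschitz, not differentiable at 0
  for _ in range(1000):
      x, y = random.uniform(-1, 1), random.uniform(-1, 1)
      assert abs(f(x) - f(y)) <= abs(x - y)      # the K = 1 bound always holds

  h = 1e-8
  print((f(h) - f(0)) / h, (f(-h) - f(0)) / (-h))   # 1.0 and -1.0: no derivative at 0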


Hmm, I guess you're right. But those functions (highly relevant here!) are differentiable almost everywhere; is that enough for Lipschitz continuity to be equivalent to the derivative being bounded?

I still think there's a point in talking about Lipschitz continuity for differentiable functions because the algorithms they're talking about depend on the Lipschitz property, not the differentiability.


>because the algorithms they're talking about depend on the Lipschitz property, not the differentiability.

That is why the definition should not contain the differentiability condition. It is violated by the examples and irrelevant.

My point is that the Lipschitz property, as defined e.g. in the Wikipedia article, is the important part. Also assuming differentiability is unhelpful, because it misses the point and is even violated by the rest of the article.


You seem to have sedulously avoided answering my question.


The article is talking about functions whose changes over a displacement dx are not just bounded by K |dx| for some K, but specifically by the value K=1. So it's not the same thing as the Lipschitz property in analysis, although I guess it's inspired by it--but as a result it's concrete and useful for algorithms.


>The article is talking about functions whose changes over a displacement dx are not just bounded by K |dx| for some K,

It is also talking about functions which do not have a derivative at all and therefore could not possibly have that property. The point of Lipschitz functions is that they need not have a derivative.

>So it's not the same thing as the Lipschitz property in analysis, although I guess it's inspired by it--but as a result it's concrete and useful for algorithms.

That is just an arbitrary scaling, only there to make the math look nicer. E.g. if f is Lipschitz for L=10, then f/10 is Lipschitz for L=1.
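Spelled out, a trivial Python check of the rescaling (assuming you actually know the constant):

  f = lambda x: 10 * abs(x)        # Lipschitz with L = 10
  g = lambda x: f(x) / 10          # the rescaled function is 1-Lipschitz
  x, y = 0.3, -0.7
  print(abs(f(x) - f(y)) <= 10 * abs(x - y))   # True
  print(abs(g(x) - g(y)) <= 1.0 * abs(x - y))  # True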


> It is also talking about functions which do not have a derivative at all and therefore could not possibly have that property [that their changes over a displacement dx are bounded by K |dx| for some K]

Functions that do not have a derivative at all can certainly have that property. Perhaps you were confused by the notation "dx", suggesting a differential, for the displacement, rather than the conventional Δx, but the sentence makes no sense if we try to read it as a statement about differentials. Possibly your correspondent is not a Greek Keyboard Achiever. Moreover, the functions they are discussing, while indeed not differentiable, are continuous and differentiable almost everywhere, so you can also prove Lipschitz continuity if you can prove a bound on their gradients at the points where they do exist.

> That is just some arbitrary scaling. Only relevant to have the math look nicer. E.g. if f is Lipschitz for L=10, then f/10 is Lipschitz for L=1.

It's a relevant difference if you don't know the Lipschitz constant L or a bound on it, because then you don't know what to divide by.


>Functions that do not have a derivative at all can certainly have that property. Perhaps you were confused by the notation "dx", suggesting a differential, for the displacement, rather than the conventional Δx, but the sentence makes no sense if we try to read it as a statement about differentials. Possibly your correspondent is not a Greek Keyboard Achiever. Moreover, the functions they are discussing, while indeed not differentiable, are continuous and differentiable almost everywhere, so you can also prove Lipschitz continuity if you can prove a bound on their gradients at the points where they do exist.

The Lipschitz property is not a property of functions with a derivative. I really do dislike this kind of drive-by commenting, where you ignore an entire conversation and pretend someone does not understand what a gradient is.


Functions with a derivative can definitely have the Lipschitz property. The “examples” section of https://en.m.wikipedia.org/wiki/Lipschitz_continuity begins with a subsection entitled, “Lipschitz continuous functions that are everywhere differentiable”. Most of the examples given there are differentiable in varying ways. (Unqualified “differentiable” normally means “everywhere differentiable”.)

You're also contradicting your previous comment at https://news.ycombinator.com/item?id=44145078.

I was trying very hard to ascribe some sort of sensible interpretation to your comments, but due to your aggression, now I regret having done so.


You seem really confused about something. The article is talking about SDFs that do not have that property, which it is then trying to force to have that property via a hack of sorts (f/|del f| apparently), because once the property holds it is very useful for making the algorithm far more efficient.

The arbitrary scaling is not arbitrary if the whole algorithm is defined around K=1. The Lipschitz property in analysis is "bounded by any K". The algorithm is specifically using "bounded by K=1".

Also: for the purpose of numerical methods, it does not matter whether a function "has a derivative" in a classical sense or not. Certainly distributions, step functions, etc. are fine. Numerical methods cannot, after all, tell whether a function is discontinuous. The article is using some known closed-form functions as examples, but it is trying to illustrate the general case, which is not going to be a closed form but just an arbitrary SDF that you constructed from somewhere---and on which it will be very useful to enforce the K=1 Lipschitz condition.
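To make that concrete, here is roughly the kind of normalization I mean, as a rough Python sketch using finite differences (my own illustration, not necessarily the paper's exact construction):

  import math

  def grad_norm(f, p, h=1e-4):
      # central-difference estimate of |grad f| at point p
      return math.sqrt(sum(
          ((f(tuple(x + h * (i == k) for k, x in enumerate(p)))
            - f(tuple(x - h * (i == k) for k, x in enumerate(p)))) / (2 * h)) ** 2
          for i in range(len(p))))

  def normalized_bound(f, p):
      g = grad_norm(f, p)
      return f(p) / g if g > 0 else f(p)   # first-order distance estimate

  # Example: f = 3 * (true distance to the unit circle) is 3-Lipschitz;
  # dividing by |grad f| = 3 recovers the unit-speed (1-Lipschitz) bound.
  f = lambda p: 3.0 * (math.hypot(*p) - 1.0)
  print(normalized_bound(f, (2.0, 0.0)))   # ~1.0, the true distance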


I think there are probably different populations commenting here with different backgrounds, and consequently different assumptions and different blind spots. For example, for Newton's Method, it does very much matter whether the function has a derivative in the classical sense everywhere in the relevant interval; the quadratic convergence proof depends on it. Discontinuous functions can break even binary chop.


This is about differentiability.

>which it is then trying to force to have that property via a hack of sorts (f/|del f| apparently)

The functions he talks about do not have a derivative. So that hack is nonsensical.


Max, min, and the L1 norm? Those have derivatives (gradients) almost everywhere. The derivatives are undefined at a set of points of measure zero, and the functions are continuous across those gaps.


And that means, by definition, that they do not have a derivative.

If you don't believe me open a real analysis textbook.


You seem to be under the mistaken and misguided impression that the only kind of derivative is the one in a real analysis textbook.

I bet you think delta functions aren't real either, lol.


Are Dirac deltas hyperreal? Asking for a friend.


Evidently they do? Maybe there is some domain knowledge you're missing here as you belligerently protest that it's nonsense.


If a function is Lipschitz, then it is differentiable almost everywhere by Rademacher's theorem. Moreover, if you want to prove things about Lipschitz functions, you can often (though not always) prove things about continuously differentiable functions with bounded gradients, and then lift to all Lipschitz functions via an appropriate compactness argument.


This is a great point... and Rademacher's theorem is one of my favorites. But this is in the context of a numerical algorithm which proposes to replace interval arithmetic with a direct evaluation of a gradient. For an SDF, the gradient fails to exist in many places; and since we're working with it on a computer and not doing math, this algorithm will eventually evaluate the gradient at a point where it is undefined (finite number of points, p > 0). On the other hand, it's straightforward to come up with a useful interval inclusion for the gradient which is defined even in these places (it should contain the subgradient of the function at each point). So, I am personally not convinced of the value of the proposed approach.
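For what I mean by an interval inclusion, here is a toy 1-D sketch in Python (my own illustration, not any particular library's API): for max(f, g) over an interval, take the hull of the derivative enclosures of every branch that can be active there; at a kink the hull contains the whole subdifferential, so it is defined everywhere.

  def hull(a, b):
      return (min(a[0], b[0]), max(a[1], b[1]))

  def dmax(f_rng, df_rng, g_rng, dg_rng):
      if f_rng[0] > g_rng[1]:        # f dominates on the whole interval
          return df_rng
      if g_rng[0] > f_rng[1]:        # g dominates on the whole interval
          return dg_rng
      return hull(df_rng, dg_rng)    # branches may switch: take the hull

  # d(x) = max(x, -x) = |x| on X = [-0.1, 0.1]: the enclosure is (-1, 1),
  # which contains the subdifferential [-1, 1] at the kink x = 0.
  print(dmax((-0.1, 0.1), (1, 1), (-0.1, 0.1), (-1, -1)))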


Yeah in context I somewhat agree, though the utility for graphics applications probably comes down to some more empirical aspects that I won't conjecture about. I imagine there is some stuff you could do in this setting by incorporating autograd derivatives from many slightly perturbed points in a neighborhood of the input point (which together act as a coarse approximation of the subdifferential set).


Exactly! That is why it is totally superfluous to define Lipschitz continuity in terms of differentiable functions.


> This is total nonsense. The point of Lipschitz continuity is that it is more than continuity and less than differentiability. If you assert that it is differentiable the concept loses all meaning.

How is Lipschitz less than differentiability? e^x is differentiable everywhere but not Lipschitz continuous everywhere, right?


>How is Lipschitz less than differentiability? e^x is differentiable everywhere but not Lipschitz continuous everywhere, right?

You should compare the local properties.


Nobody said local anywhere though, right? You just said the statement was nonsense where it clearly makes sense at least globally.

But now that we've moved the goalposts to local: what about f(x) = x^(3/2) sin(1/x), f(0) = 0? It's differentiable yet not locally Lipschitz-continuous on [0, 1]. (Taken straight from Wikipedia)
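You can see the failure by looking at the derivative f'(x) = (3/2)·sqrt(x)·sin(1/x) − cos(1/x)/sqrt(x): at the points x = 1/(2πn) it equals −sqrt(2πn), which is unbounded as n grows. A quick Python check, just for illustration:

  import math

  fprime = lambda x: 1.5 * math.sqrt(x) * math.sin(1/x) - math.cos(1/x) / math.sqrt(x)
  for n in (10, 1000, 100000):
      x = 1 / (2 * math.pi * n)
      print(n, fprime(x))    # roughly -sqrt(2*pi*n): grows without bound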


>Nobody said local anywhere though, right?

I was speaking in very general terms about the subject, I was not making mathematically precise statements. You are completely correct about the counterexample.

You actually need some stronger assumptions, namely that f is C^1, meaning it has a continuous derivative. In that case, for any such f defined on a compact set K, the continuous derivative f' attains its maximum on K, and that bound gives Lipschitz continuity (via the mean value theorem). Every C^1 function on K is Lipschitz, yet not every Lipschitz function is C^1 or even differentiable.

To be more specific and to give the reason Lipschitz functions are important, look at the following statement from Wikipedia. "A Lipschitz function g : R → R is absolutely continuous and therefore is differentiable almost everywhere, that is, differentiable at every point outside a set of Lebesgue measure zero."

Given this I hope that you can understand what I said about Lipschitz continuity being a weaker version of differentiability.


> This is total nonsense... If you assert that it is differentiable the concept loses all meaning.

Citation please.

Yes, Lipschitz functions need not be differentiable, but when they are, that bound holds and is very useful. Using that bound is bread and butter in convex analysis of differentiable convex functions.

Curb the snark.


>Citation please.

The article links to the Wikipedia page, which gets it right.

"An everywhere differentiable function g : R → R is Lipschitz continuous (with K = sup |g′(x)|) if and only if it has a bounded first derivative"

Lipschitz continuity for differentiable functions is just having a bounded derivative. The Lipschitz property suddenly becomes uninteresting, as it just falls out of the assumptions; the interesting fact, which is what lets you use non-differentiable functions, is that assuming Lipschitz continuity instead of differentiability is enough.

>Using that bound is bread and butter in convex analysis of differentiable convex functions.

It is also the bread and butter in the analysis of PDEs, precisely because Lipschitz continuity is a weaker property than differentiability. In the context of the article you want to talk about non-differentiable functions, e.g. max, which you couldn't if you assumed differentiability.

The reason this is important is that choosing Lipschitz over differentiable is what makes all this work.


Yes we know that. I object to 'loses all meaning' when the function is differentiable. It certainly doesn't.

It gives the special property that the derivative is bounded, a property that's not true for arbitrary differentiable functions.


My point is that the article is using the Lipschitz property to get its results. This makes it unnecessary and even wrong to introduce Lipschitz continuity only for functions with a derivative. Especially since the article actually uses functions which do not have a derivative.


For what it's worth, what you originally said was: "This is total nonsense." The points you're making are valid, but it isn't "total nonsense". Something not being exactly factually correct doesn't mean that it's "total nonsense". Different publications adhere to different levels of rigor. Just because it doesn't meet your own personal standard doesn't make it nonsense for the target audience of the article.


Yes.

I think the author meant that the rate of change is bounded [without the function needing to be differentiable] but used the term "gradient" anyway. I guess you shouldn't use the term "rate of change" either. Something like

  The Lipschitz property means the function's output doesn't change faster than linearly with input

is precise in that context.
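Spelled out, the precise statement is the usual Lipschitz inequality with the constant fixed at 1:

  |f(x) - f(y)| <= 1 * |x - y|   for all x, y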


He even gives the definition for the L=1 Lipschitz condition afterwards.

Lipschitz continuity means the difference quotients are globally bounded. Differentiability, by definition, means the difference quotient converges at every point.

These are somewhat related, but different things.



