Clifford Algebra: A visual introduction (2014) (slehar.wordpress.com)
136 points by MichaelAO on June 19, 2015 | hide | past | favorite | 40 comments



I've played with GA for a while, bought a couple of books, enjoyed the math, experimented with writing my own library. But, as TTProgram's comment says, I've yet to find an application where it clearly improves on high-school vectors.

I'm aware that Geomerics purportedly used it for global illumination calculations, but specifics about the actual implementation were (understandably) scant. So we don't know in what capacity it was used.

I've heard it praised for physics applications, but I haven't seen anyone walk through, say, a simple rigid body simulator and show its improvement.

I've heard it spoken of as the future of ray tracing, but again I've not seen anyone willing to put an implementation (even a toy) up against a ray tracer that you could write in an hour with Gibbs vectors.

On the contrary, when I've tried to use it, the complexity of GA just makes it impractical. The problem with 4- or 5-dimensional GA for doing work in 3D is that we end up dealing with 16 or 32 values per object (if you unify vectors, bivectors, etc.), or else having five or six specialised forms of object plus their combinations (if you keep them separate), with potentially hundreds of specialisations of basic operations. This isn't just me; the GA libraries around have this bloat. I've just not found an application in which that massive increase in storage, computation and implementation complexity is worth it.

Which makes sense. In a regular vector-based system, we might use vectors and quaternions (2 things) to model rigid bodies. We can do a lot with those. But it isn't clear what is difficult or inelegant to model with them that a trivector/scalar would be the right tool for. So the extra flexibility of being able to store such exotic mathematical objects is a waste for both computer and programmer.

I came to the conclusion that GA-advocacy at the moment is mostly a true-believer type thing. True believers will tell you over and over how important it is to modelling 3d space, but rarely show you a worked example of it being easier. I am not cynical enough to think it has no killer applications, but I've yet to find them. And I would dearly love to.


I do agree that GA is overhyped, but it's definitely not useless. GA brings many separate concepts into a common framework. Here are a couple of things that are just ugly:

- Cross product only works in 3d

- Why is it that the cross product seems mysteriously related to determinants?

- Multivariate calculus as typically taught is very ugly

- Complex numbers for 2d, quaternions for 3d, then what?

- Rotations/reflections using linear algebra

- Cauchy's integral formula, Stokes' theorem, Helmholtz decomposition, etc.

- Etcetera

GA just makes all of that much nicer and brings it together into a coherent framework.

Here's a simple problem that illustrates why GA is nice: given 3 points A,B,C in 2-d space with integer coordinates, determine whether B is on the line segment AC. Now do it in 3-d, and then in n-d.
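A sketch of the GA-flavored answer (the helper name `on_segment` is mine, not from the thread): B lies on the line AC exactly when the wedge product (B−A)∧(C−A) vanishes, i.e. every 2×2 minor is zero, and that condition reads the same in 2-d, 3-d, or n-d; a dot-product range check then confirms B sits between A and C. With integer coordinates the whole test is exact:

```python
def on_segment(a, b, c):
    """Return True if point B lies on the closed segment AC.

    Works in any dimension with exact integer arithmetic:
    B is on the *line* AC iff (B-A) ^ (C-A) = 0 (every 2x2 minor vanishes),
    and on the *segment* iff additionally 0 <= (B-A).(C-A) <= |C-A|^2.
    """
    u = [bi - ai for ai, bi in zip(a, b)]  # B - A
    v = [ci - ai for ai, ci in zip(a, c)]  # C - A
    n = len(u)
    collinear = all(u[i] * v[j] - u[j] * v[i] == 0
                    for i in range(n) for j in range(i + 1, n))
    if not collinear:
        return False
    dot = sum(x * y for x, y in zip(u, v))
    return 0 <= dot <= sum(x * x for x in v)
```

The cross-product version of this test only exists in 3-d; the minor-based wedge condition is what it generalizes to.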


The real problem is that it takes quite a bit of practical problem-solving experience to get used to a mathematical language, months or years of work, to really become fluent with it. The same is equally true of trigonometry, vectors, complex analysis, quaternions, calculus, matrices, spinors, group theory, differential forms, manifolds, etc. etc.

But with a mainstream bit of technology, you can go to any good university math department and sign up for a course on the subject. You can take your pick of 100 textbooks. You can read resources aimed at applying specific tools to any number of concrete practical uses. You can often assume that people with a math, science, or engineering degree are already familiar with the technology and understand how it works. You can go find open source implementations of tools built using that technology. Etc.

As a result, we don’t remember how hard all those tools were to learn, and how much work went into building up the conceptual ecosystem.

With a non-mainstream bit of technology, you get to choose from a small handful of resources, often aimed at an audience you aren’t properly a member of or aimed at uses you don’t care about and missing parts that you need. Often the best approaches for solving particular problems haven’t been worked out yet. From one source to another, notation conventions and choices of model details can vary, making it hard to compare sources, notice errors, or communicate ideas. Concrete code implementations have seen little scrutiny and so are likely to be incomplete, inefficient, and buggy.

This is a problem with any difficult new bit of technology. Sometimes it takes decades or even centuries for a piece of mathematical technology to go from introduction to wide adoption. Sometimes they don’t make it or sit abandoned on the shelf for a while.

As far as I can tell mathematicians haven’t adopted Geometric Algebra for a simple reason: mathematicians make their living proving new theorems, and GA is no more powerful (in a theoretical sense) than the abstractions they already have, so it isn’t particularly useful for their proofs. Its advantages are as a sort of “refactoring” of a bunch of existing technology, and it’s usually a career-limiting move in any field to spend too much of your time refactoring, however advantageous it might be for humanity in the long run.

If you’re interested in physics applications, you should check out Hestenes’s book New Foundations for Classical Mechanics, which is pretty good. I’m particularly impressed with the completeness of the treatment of rotations and rigid body mechanics in chapters 5 and 7. General relativity is a field where Geometric Algebra already sees a fair amount of use in the academic literature. Hestenes has also written papers which sketch promising approaches to electrodynamics, quantum mechanics, etc., and the guys at Cambridge have been working on those a bit, but it’ll probably take a large group of others to really flesh it out.

> On the contrary, when I've tried to use it, the complexity of GA just makes it impractical. The problem with 4- or 5-dimensional GA for doing work in 3D is that we end up dealing with 16 or 32 values per object (if you unify vectors, bivectors, etc.), or else having five or six specialised forms of object plus their combinations (if you keep them separate), with potentially hundreds of specialisations of basic operations. This isn't just me; the GA libraries around have this bloat.

There are two tricky problems here:

(1) In most cases Geometric Algebra solutions to problems don’t essentially require more computation than other approaches, but figuring out how to write efficient practical implementations (especially against our current hardware and software which is heavily optimized for solving matrix problems) is non-trivial and hasn’t really seen that much work yet.

(2) Existing programming languages really aren’t optimized syntactically for interacting with multidimensional objects in a natural way. Trying to cram a GA system into C++ or Java (e.g.) is just asking for a bloated and confusing syntactic nightmare. You see the same problem with trying to write code for doing matrix algebra in those languages, but at least we have a lot of experience with the domain and there are languages like Matlab or Julia or libraries like NumPy that make things much nicer. Nothing like those exist for Geometric Algebra, and it’s a non-trivial problem to solve.


I agree on all fronts. I think GA is a poor choice for practical programming for the same reasons you identify: nobody knows how to do it well, tools / architectures are not made for it, and nobody has publicly demonstrated its utility on a practical problem in my domain yet (I intended to, and failed).

The performance issue makes me slightly skeptical of the 'made with GA' claims about Geomerics' engine. I don't doubt there's a GA system in there doing something, but I wonder how much of the actual hard problems in global illumination it is handling, given that global illumination is primarily compute-bound.

As I said, I'd like to be proven wrong. The math does have nice features in the abstract (as you point out).


By the way, it’s totally off topic (I’d email but you don’t have an email listed in your HN profile), but on the subject of new mathematical technologies with practical application, I’ve really been enjoying learning about this thing for the last month or two:

http://www.chebfun.org

Here are a couple of hour-long lecture videos https://www.youtube.com/watch?v=cQp-kB9EOQs http://downloads.sms.cam.ac.uk/1160831/1160835.mp4 and the first 6 chapters of a book explaining the basics of the underlying theory are freely available online (the rest of the book is great too) http://www.chebfun.org/ATAP/

Basic concept is to approximate functions as high-degree polynomials, to within machine precision, and then do operations on those approximations instead of on the original functions, at each step chucking out any coefficients which are smaller than machine precision. This in many cases provides a dramatic speed boost over existing methods for taking integrals/derivatives, finding roots, solving differential equations, convolution, etc. etc., while staying extremely accurate, much better than discrete solutions.
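The core idea can be sketched in a few lines of plain Python (a toy illustration of the underlying math, not chebfun's actual implementation; the function names and the degree/tolerance choices are mine): sample f at Chebyshev points, compute the Chebyshev coefficients, drop the ones below a precision threshold, and evaluate with the Clenshaw recurrence.

```python
import math

def cheb_coeffs(f, N=64, reltol=1e-13):
    """Chebyshev coefficients of the degree-N interpolant of f on [-1, 1],
    sampled at the Chebyshev-Lobatto points x_k = cos(pi*k/N), with
    chebfun-style truncation of trailing near-zero coefficients."""
    xs = [math.cos(math.pi * k / N) for k in range(N + 1)]
    fs = [f(x) for x in xs]
    cs = []
    for j in range(N + 1):
        # DCT-I style sum: endpoints weighted by 1/2, as are c_0 and c_N
        s = sum((0.5 if k in (0, N) else 1.0) * fs[k] * math.cos(math.pi * j * k / N)
                for k in range(N + 1))
        cs.append((1.0 if 0 < j < N else 0.5) * 2.0 * s / N)
    # chuck out trailing coefficients that are negligible relative to the largest
    tol = reltol * max(abs(c) for c in cs)
    while len(cs) > 1 and abs(cs[-1]) < tol:
        cs.pop()
    return cs

def cheb_eval(cs, x):
    """Evaluate sum_j cs[j] * T_j(x) by the Clenshaw recurrence."""
    b1 = b2 = 0.0
    for c in reversed(cs[1:]):
        b1, b2 = 2.0 * x * b1 - b2 + c, b1
    return x * b1 - b2 + cs[0]

# Example: exp(x) on [-1, 1] needs only ~15 coefficients for high accuracy.
coeffs = cheb_coeffs(math.exp)
```

Once a function is reduced to a short coefficient list like this, derivatives, integrals, and roots all become cheap operations on the coefficients, which is the speed boost described above.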

It’s not universally applicable to all functions (for obvious reasons) or all problems, but the team responsible has really done an incredible amount of solid work on the project.

Apparently it’s 10+ years old now, but nobody I’ve asked has ever heard of it before. Which is kind of surprising, because it seems amazing to me so far.

Unfortunately stuck as a Matlab library for the moment (there are partial ports to NumPy and Julia but they are very feature poor compared to the Matlab version).


That sound? That was my weekend going poof! Thank you.

A worthwhile OT. Maybe you could submit it as a link?


It was submitted as a link a few times 2–4 years ago but got no traction: few votes and no comments https://news.ycombinator.com/item?id=3350754 https://news.ycombinator.com/item?id=9463006 https://news.ycombinator.com/item?id=6352216 https://news.ycombinator.com/item?id=5324400

I want to write some examples (as long blog posts or similar) in a few concrete science fields showing how to use these tools in practice and submit those. I feel like the chebfun guys haven’t really figured out some good introductory hooks yet, or done much marketing/outreach: most of their output is papers about approximation theory which takes some amount of effort to figure out and only get seen by academics working in the same subfield, and the practical examples on their site are pretty small pieces (and don’t have comparisons vs. alternative approaches) so it’s hard to see the big picture. Those lecture videos are pretty good, but an hour long lecture is a somewhat big commitment and hard to skim.

BTW, I’m serious about the email thing: shoot me a note (address in my profile) and I’d love to hear about your attempts to implement Geometric Algebra in code. One of my goals for the next few years is to at some point sit down and write a nice web-based introduction to GA filled with interactive diagrams.


I happened upon this presentation just a couple days ago - compile-time geometric algebra in C++ via metaprogramming. https://www.youtube.com/watch?v=W4p-e-g37tg

Not sure how useful it is in day-to-day reality, but at the very least it might make other GA implementations seem simple by comparison.


Compile time is the most efficient way (IMO) to deal with all the types that can appear (also in intermediate steps of a calculation). I first found this idea realized in this implementation: https://github.com/JaapSuter/Vital. According to the comments, the code is from 2004, so it could not make use of C++0x, which is why I adapted and rewrote it for C++11. This was motivated by an actual problem I had to solve but, sadly, it turned out GA wasn't fit for it.


I think GA allows for fewer symbols to express similar mathematical ideas, just as J or APL allow for more concise programs. The whole 'Notation as a Tool of Thought' thing by Ken Iverson. I believe new notations need time to sink in as tools for thinking, and then to get implemented in the things of the world. We are heading towards concurrent and parallel hardware, distributed systems, GPUs, array languages, and vector processing machines. Maybe a GA implementation will work at a more basic level when we step away from the archaic hardware architecture we have grown so accustomed to, and speak GA at a more native level. Lisp machines that speak native GA! Call me a convert or a 'gusher'. I like the beauty of GA as written, although I plead ignorance of any of the attempts to render it in today's software/hardware. I did read of a company trying to embed it into a graphics processor chip rather than a software library. I'll have to find the reference again.

[EDIT] Here's the link for the chip: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6...


GA is for comprehension, not computation. It is like a high-level functional language compared to Fortran. Compare how hard it is to compute complicated matrix arithmetic in Haskell vs Fortran with how hard it is to describe symmetries in each.


See also https://en.wikipedia.org/wiki/David_Hestenes#Work

"Hestenes ... is the prime mover behind the contemporary resurgence of interest in geometric algebras and in other offshoots of Clifford algebras as ways of formalizing theoretical physics."


Interestingly enough, his first book, Space-Time Algebra, has just been republished in a revised 2nd edition.

Description from ABE books:

Item Description: Gordon Breach Publishing Group, Switzerland, 2015. Hardback. Book Condition: New. 2nd Revised edition. 235 x 155 mm. Language: English Brand New Book. This small book started a profound revolution in the development of mathematical physics, one which has reached many working physicists already, and which stands poised to bring about far-reaching change in the future. At its heart is the use of Clifford algebra to unify otherwise disparate mathematical languages, particularly those of spinors, quaternions, tensors and differential forms. It provides a unified approach covering all these areas and thus leads to a very efficient ‘toolkit’ for use in physical problems including quantum mechanics, classical mechanics, electromagnetism and relativity (both special and general) – only one mathematical system needs to be learned and understood, and one can use it at levels which extend right through to current research topics in each of these areas. These same techniques, in the form of the ‘Geometric Algebra’, can be applied in many areas of engineering, robotics and computer science, with no changes necessary – it is the same underlying mathematics, and enables physicists to understand topics in engineering, and engineers to understand topics in physics (including aspects in frontier areas), in a way which no other single mathematical system could hope to make possible. There is another aspect to Geometric Algebra, which is less tangible, and goes beyond questions of mathematical power and range. This is the remarkable insight it gives to physical problems, and the way it constantly suggests new features of the physics itself, not just the mathematics. Examples of this are peppered throughout ‘Space-Time Algebra’, despite its short length, and some of them are effectively still research topics for the future. From the Foreword by Anthony Lasenby.


Would love to read Scott Aaronson on Clifford Algebras (and computation, of course).

There are a lot of pretty pictures in this article but I found it to be almost completely opaque. This is a subject that deserves a book, a class, a teacher.


Okay, I spoke too soon (and maybe a little unkindly). If you skip to "Historical Summary" it's pretty darn good actually.


My understanding is that all of this (wedge/alternating products, etc.) is considered a part of linear algebra, and has been established that way for quite a long time. It's hardly the first idea of translating between geometry and algebra (there's a whole field called "algebraic geometry" going back to antiquity, come on!), and it's not the "universal language of mathematics" by a long shot.

> This entire analysis of Clifford Algebra was based on my own foundational assumption that mathematics is not a human invention, but more of a discovery of the essential principles of computation in the brain.

And now I'm suspicious. I prefer an objective mathematical treatment and not one with some self-described "radical" agenda.


http://www.av8n.com/physics/clifford-intro.htm is a much better source. It looks like OP read av8n and that translated it into some spiritual interpretation.


After reading this I am thoroughly convinced that this is standard material for introductory graduate algebra. And they do it in far less space and with far less philosophy.


The equations of geometric algebra can be represented using matrices (perhaps with a non-euclidean metric for the inner product depending on application), so in that sense you don't gain the ability to do anything that you couldn't do with regular linear algebra.

The "big wins" are that the notation is simpler and easier to reason about, and everything is coordinate-free. As an example, there is a model of 3 dimensional euclidean space that can be represented using a 5-dimensional GA (with 4,1 metric signature). Within this model, you have lines, planes, circles, spheres, etc; but you also have rotations, reflections, translations; and all of these things are just standard elements of the mathematics with no special cases. So you can multiply a translation and a rotation and you get back the composition of those two things. Or you can multiply a sphere by a translation and you get back a sphere that has been translated by the specified amount. Or if you want to smoothly interpolate a rigid body motion (translation+rotation) between two points, you can take the logarithm of that motion. And again, this can all be done symbolically without resorting to coordinates - except at the very end, perhaps, if you want to map it onto an actual coordinate system :)
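The geometric product itself is simple enough to sketch in a few lines of plain Python (a toy with a Euclidean metric, not the 5-dimensional conformal model described above; all names here are mine). Basis blades multiply by concatenating their indices, flipping sign for each swap needed to sort them, and cancelling repeated indices; a rotor then rotates a vector v via the sandwich product R v R̃:

```python
import math
from itertools import product

def blade_mul(a, b):
    """Geometric product of two basis blades, each a sorted tuple of
    1-based basis indices, with Euclidean metric e_i e_i = +1.
    Returns (sign, blade)."""
    lst = list(a) + list(b)
    sign = 1
    for _ in range(len(lst)):          # bubble sort; each swap of distinct
        for j in range(len(lst) - 1):  # generators flips the sign
            if lst[j] > lst[j + 1]:
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
                sign = -sign
    out = []
    for e in lst:                      # contract repeated indices: e_i e_i = 1
        if out and out[-1] == e:
            out.pop()
        else:
            out.append(e)
    return sign, tuple(out)

def gp(u, v):
    """Geometric product of multivectors stored as {blade: coefficient} dicts."""
    w = {}
    for (ba, ca), (bb, cb) in product(u.items(), v.items()):
        s, blade = blade_mul(ba, bb)
        w[blade] = w.get(blade, 0.0) + s * ca * cb
    return {b: c for b, c in w.items() if abs(c) > 1e-12}

# Rotate e1 by 90 degrees in the e1^e2 plane with the rotor
# R = cos(t/2) - sin(t/2) e12, applied as R v R~ (R~ reverses the blades):
h = math.cos(math.pi / 4)
R, Rrev = {(): h, (1, 2): -h}, {(): h, (1, 2): h}
rotated = gp(gp(R, {(1,): 1.0}), Rrev)   # ≈ {(2,): 1.0}, i.e. e2
```

The point of the comment above is that in the conformal model the very same sandwich product also applies translations, reflections, and sphere transformations, with no special cases.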

In terms of people applying it to more "advanced" subjects like Minkowski space or differential geometry - I'm not so sure. I haven't spent too much time looking into that.


I've been exposed to geometric algebra a few times, but for me to be motivated to dig into it I would need to see a few examples of how it can drastically simplify certain calculations. Otherwise there's just so much math out there to learn with more common direct applications that it's hard to justify the effort.

For example, is there a way to use geometric algebra to better handle tensor manipulation (decomposition etc.)? That all gets pretty messy once you leave matrices and vectors.


You could say "Convince me" about any branch of mathematics.

But IMHO this is different.

http://c2.com/cgi/wiki?GeometricAlgebra


I just posted Hestenes Oersted lecture 2002 to HN:

https://news.ycombinator.com/item?id=9749356


You SHOULD say "Convince me" about any branch of mathematics. There's more things to study than time in a lifetime allows.


What's up with the kooky Penrose-esque metaphysics of the human mind at the end there? Lehar has taken the elegance of geometric algebra to a bizarre spiritual plane (er, bivector?)


Fleshing out the psychological argument regarding curved space that the author discusses is Heelan's "Space Perception and the Philosophy of Science" (1988).


The post makes it appear that using the cross product to express Faraday's law is somehow related to Clifford algebras.


The cross-product is a useful outer product, but it's only possible in three dimensions: It takes two vectors and makes a new vector orthogonal to both of them.

In two dimensions, that's impossible, because there's no other direction for that new vector to point. In more than three dimensions, that's impossible, because there's no single direction for that new vector to point.

It's a hacky way to express the more general concept of orthogonality and handedness: The magnitude of the product we want is proportional to how orthogonal the two vectors are, and the sign is due to a handedness convention we've adopted.

Well, if two vectors are orthogonal to any degree at all, they span a two-dimensional surface, so why not define a product which takes two one-dimensional vectors and makes a signed two-dimensional surface out of them, with an area proportional to their lengths and how orthogonal they are and a sign due to a handedness convention? That's the wedge product in Clifford algebra.
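That product is also trivial to compute: the wedge of two vectors is just the antisymmetric part of their outer product, and unlike the cross product the same code works in any dimension (a toy sketch, with the bivector's coordinates indexed by basis planes i < j):

```python
def wedge(u, v):
    """Wedge (exterior) product of two vectors in R^n: a bivector whose
    coordinate on the basis plane (i, j), i < j, is the 2x2 minor
    u_i v_j - u_j v_i. Unlike the cross product, n is arbitrary."""
    n = len(u)
    return {(i, j): u[i] * v[j] - u[j] * v[i]
            for i in range(n) for j in range(i + 1, n)}

# In 3D the three components reproduce the cross product (up to relabelling
# planes as axes); in 2D there is one component (the signed area), and in
# 4D and up it keeps working where the cross product does not exist.
```

Parallel vectors give the zero bivector, matching the "how orthogonal they are" description above.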

The fun comes when you realize that these signed areas are quaternions and can be used to rotate vectors in arbitrary-dimensional spaces simply by picking vectors which span the desired plane of rotation and choosing apt multiplication rules.

The real fun comes when you realize that choosing your metric cleverly gives you one plane where rotations are hyperbolic instead of Euclidean (circular), so you can model accelerations as rotations in the space-time plane. Which is Special Relativity. Which is neat.
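In geometric-algebra notation the hyperbolic behaviour falls out of the exponential: with a mixed-signature metric the unit bivector of a time-space plane squares to +1 instead of −1, so its exponential series produces cosh/sinh rather than cos/sin (sketching in the usual spacetime-algebra conventions):

```latex
% With e_t^2 = +1 and e_x^2 = -1, the time-space bivector squares to +1,
% so exponentiating it gives a hyperbolic rotation (a Lorentz boost):
(e_t e_x)^2 = -\,e_t^2\, e_x^2 = +1
\qquad\Longrightarrow\qquad
e^{\varphi\, e_t e_x} = \cosh\varphi + e_t e_x \sinh\varphi
```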

This lays it all out nicely with pictures and everything: http://www.av8n.com/physics/clifford-intro.htm

And here is SR done as hyperbolic rotations in a space-time plane: http://www.av8n.com/physics/spacetime-welcome.htm


There is actually a 7-dimensional "cross-product" (-esque operation): https://en.wikipedia.org/?title=Seven-dimensional_cross_prod...

The relationship between Quaternions and the 3-dimensional cross-product corresponds to the relationship between the octonions and this 7-dimensional cross-product. Normed Division Algebras are a fascinating and worthwhile subject to look into.


Av8n is a much better source for this material. slehar took pictures out of context from av8n and Wikipedia, and slathered a layer of woo on top.


Does it? It looks to me as if it's saying that Clifford algebras give a better way of writing Maxwell's equations (all four of them) as a single equation with no cross products (as such) at all.

You combine the current j and charge density rho into a single four-vector current J; you combine the electric field E and magnetic field B into a single bivector F. (In spacetime, not just space, so the space of bivectors has dimension (4 choose 2) = 6 = 3+3, which is the right amount for representing E and B.) Then the Clifford-algebra analogue of the "del" operator from conventional vector calculus does div-like things and curl-like things in the right combination to encode all of the Maxwell equations.
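In Hestenes's spacetime-algebra notation this collapses all four equations into a single one (sketched here in natural units, following the standard STA conventions):

```latex
% All four Maxwell equations in one: \nabla is the vector derivative,
% F = E + IB the electromagnetic field bivector, J the four-current.
\nabla F = J,
\qquad
\nabla = \gamma^\mu \partial_\mu,
\qquad
F = \vec{E} + I\vec{B},
\qquad
J = (\rho + \vec{j})\,\gamma_0
```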

Of course, you can do more or less the same thing without calling it Clifford algebra (or geometric algebra) and you can find it done in physics textbooks. Usually this means a whole lot of tensor formulae; see e.g. https://en.wikipedia.org/wiki/Covariant_formulation_of_class.... Geometric algebra gives a nice framework for writing it all down, though.


I apologize, my comment was too vague. My complaint is that when the connection between the Maxwell equations and Clifford algebras is first raised, there is an accompanying picture that shows the cross product in explicit coordinates.


Oh, yes, I see, and I agree: that's a completely inappropriate and misleading picture.


Interesting article that conveyed something new, so it was a worthwhile read. But did anyone else feel that they had to filter out the gushing / "crankish" bits in there? Particularly the gripes about the impossibility of inversion.


If I want to compute a derivative of a function expressed on a clifford algebra, does that make sense and is it possible?



In the end the Clifford algebra is just a vector space, so that shouldn't be a problem. However it might become problematic that the dimension of the Clifford algebra is 2^n, where n is the dimension of the original space.


Very nice presentation! I do wish this were taught in schools instead of the dreadful Gibbs vector algebra.


Can someone comment on what you gain by choosing Clifford Algebras over Exterior Algebras?


Division.


Our whole system has it internally. Very, very useful for least-squares systems and covariance propagation, etc.



