Logic Programming is Underrated (swannodette.github.io)
119 points by llambda on June 8, 2013 | 27 comments



I think both are wrong ;-)

Even programming languages like Prolog need the problem expressed in a form that is not exactly how we think in logical terms. You can't directly translate a first-order logic expression to Prolog; you need to add extra thinking before doing that.

On the other hand, generic combinatorics libraries don't scale to large sets.

Welcome to SMT (satisfiability modulo theories), where there are state-of-the-art optimizations for specific fields. If you want to see something impressive, look at how to solve Sudoku with Z3 (http://z3.codeplex.com/); it's just a matter of expressing the problem in logic terms: http://rise4fun.com/Z3Py/tutorialcontent/guide#h210
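For a flavor of it, here's a minimal Z3Py sketch in the spirit of that tutorial (my own variable names; entering the clues is elided): one integer per cell, bounded to 1..9, with Distinct constraints over rows, columns, and 3x3 boxes.

    # Minimal Z3Py Sudoku sketch; assumes the z3-solver package.
    from z3 import Int, Solver, And, Distinct, sat

    cells = [[Int("c_%d_%d" % (r, c)) for c in range(9)] for r in range(9)]
    s = Solver()
    s.add([And(1 <= cells[r][c], cells[r][c] <= 9)
           for r in range(9) for c in range(9)])
    s.add([Distinct(row) for row in cells])            # rows
    s.add([Distinct([cells[r][c] for r in range(9)])   # columns
           for c in range(9)])
    s.add([Distinct([cells[3*br + r][3*bc + c]         # 3x3 boxes
                     for r in range(3) for c in range(3)])
           for br in range(3) for bc in range(3)])
    # ... assert the given clues as equalities (e.g. cells[0][0] == 5) ...
    if s.check() == sat:
        m = s.model()
        print([[m.evaluate(cells[r][c]) for c in range(9)] for r in range(9)])

That's the whole "program": the constraints are the specification, and the solver does the search.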


In case anyone is looking for other novel ways to solve Sudoku: I wrote a CP language a while back that does pretty fast solving for the right kind of problem [1]. Folks might also find this Sudoku benchmarking blog post interesting [2]; it includes links to solutions in a bunch of languages.

1) http://sabrlang.org/sudoku

2) http://attractivechaos.wordpress.com/2011/06/19/an-incomplet...


The SMT bit you mentioned shines a light on one of the gripes I have with logic programming: most often the examples people give are for the sort of AI search problem that might look nice when specified, but that you can't hope to solve in practice with a generic depth-first search.

That said, one thing I always felt was unique about LP was that one of the primitive operations is data structure unification, and that some pieces of code can be run in both "directions" if you are careful. However, I still haven't been able to find a neat example of what sort of problem really benefits from this.
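For what it's worth, unification itself is easy to demonstrate outside any logic language. Here's a toy Python sketch (the representation is made up for illustration: variables are strings starting with "?", compound terms are tuples); note that the same call binds variables on whichever side they appear, which is the "both directions" property.

    # Toy structural unification: returns a substitution dict or None.
    # No occurs check and single-step variable lookup; a sketch, not a
    # real implementation.
    def unify(a, b, subst=None):
        subst = {} if subst is None else subst
        if isinstance(a, str):
            a = subst.get(a, a)
        if isinstance(b, str):
            b = subst.get(b, b)
        if a == b:
            return subst
        if isinstance(a, str) and a.startswith("?"):
            return {**subst, a: b}
        if isinstance(b, str) and b.startswith("?"):
            return {**subst, b: a}
        if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
            for x, y in zip(a, b):
                subst = unify(x, y, subst)
                if subst is None:
                    return None
            return subst
        return None

    print(unify(("parent", "?x", "bob"), ("parent", "alice", "?y")))
    # -> {'?x': 'alice', '?y': 'bob'}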


At UBS I know they have a small business rules system written in core.logic; in their case, I believe, being able to run the rules backwards allows them to uncover redundancies, like "which rules match this result".


> I still haven't been able to find a neat example of what sort of problem really benefits from this.

I think one of the clear benefits is having a tool to aid you in your research. Speaking of chess, it's like what Kasparov envisioned for advanced chess: http://en.wikipedia.org/wiki/Advanced_Chess


One place where this has potential is for type checking & type inference.


I think this is also missing the point a bit. SMT solvers are great, but as far as I can tell it's not really SMT vs. CP or LP; they have different strengths and weaknesses. As for the Sudoku example, while neat, it's pretty much identical in approach and expressiveness to the core.logic version I linked to.


It's true that they are similar in their approach to expressiveness, and in the end core.logic could have an SMT backend, making them identical. But when we add SMT we are talking about a different beast.

For example, imagine that you want to implement a professional chess program. One romantic approach would be to find the best and nicest algorithm(s) to do that. SMT, by contrast, adds some "ugliness": it tries to reach a conclusion using every heuristic available. It's an algorithm too, but it doesn't aim for expressive elegance; it bakes every available heuristic into the resolution process.


Smart programmers are underrated.

The reason logic programming is not widespread is twofold:

1) It is a new paradigm, just as functional and object-oriented programming were once new paradigms. That makes it much harder to learn; it is not like going from Java to C#.

Look at http://dtai.cs.kuleuven.be/ppcbook for solutions to problems in Prolog. The solutions are amazingly short and concise; they are beautiful in how succinct they are. But few programmers could produce them: it is just too hard to make incremental updates and to debug logic programs.

2) It is not really that useful in the real world as a general-purpose language, and certainly not for lack of trying. It was supposed to be the fifth-generation language in Japan, the future; money and brainpower were thrown at it and ... not much happened. People wanted to serve data over networks, render games, search databases, sort data, and multiply matrices, and in not many of those domains does logic programming scream "this is the most obvious paradigm". Because it isn't.


I'd hardly call logic programming new. At 40+ years old, it's one of the oldest paradigms in computer science!


It is new for every person learning it. Every paradigm is a new paradigm for the person learning it. When someone learns logic programming, they usually have already learned structured or object-oriented programming. Then logic is _new_ for them, and _them_ here is almost every programmer out there.

And the fact that it is 40+ years old actually supports my point: it is that old, yet it hasn't caught on. Maybe, just maybe, it is waiting for its time in the limelight and it hasn't come yet... Or is that Bananarama playing in the background while my Sony Walkman runs out of batteries... ;-)


> I'd hardly call logic programming new. At 40+ years old, it's one of the oldest paradigms in computer science!

Isn't logic programming one of the newest programming paradigms out there at about 40 years of age?

There are examples of imperative, functional, object-oriented, and other declarative programming languages that predate the first logic programming systems of the early 1970s.

When I took a logic programming course at uni a few years ago, my prof started with an anecdote. He had been teaching the logic and constraint programming course since the years before the "AI winter" of the 1980s, and when he first started, it truly was the newest paradigm out there, the sexy new entrant to the field. Thirty years have passed and no major new paradigms have emerged since (arguably), so he still begins his course the same way. Take it with a grain of salt, of course...


I have been playing around with Mercury [0] for a while now; it's a quite practical logic/functional language. It has a very small user base, but it's worth giving it a try.

[0] http://mercurylang.org


If we are discussing the benefits of logic programming, I think it's important to include other declarative languages like SQL.

The idea of specifying directly what is computed, and not how it is computed, is very useful, and not limited to abstruse mathematical problems.
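To make the contrast concrete, here's a small sketch using Python's built-in sqlite3 (table and data invented for illustration): the SQL states what result is wanted, while the loop below it spells out how to compute the same thing.

    # "What" vs. "how": the same aggregation, declarative then imperative.
    import sqlite3

    rows = [("alice", 10.0), ("bob", 25.0), ("alice", 5.0)]
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (customer TEXT, total REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", rows)

    # Declarative: describe the result; the engine picks the strategy.
    declarative = con.execute(
        "SELECT customer, SUM(total) FROM orders GROUP BY customer").fetchall()

    # Imperative: spell out every step yourself.
    imperative = {}
    for customer, total in rows:
        imperative[customer] = imperative.get(customer, 0.0) + total

    print(declarative)  # e.g. [('alice', 15.0), ('bob', 25.0)]
    print(imperative)   # {'alice': 15.0, 'bob': 25.0}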


That's my hypothesis for why logic programming's popularity faded somewhat from its initial promise. When it was new, it was one of the few widely available ways of doing declarative programming, and it contrasted strongly with procedural programming. But in the years since, lots of other declarative approaches have chipped away at the monopoly of logic programming proper over declarative approaches to programming.

I wrote a bit about that here: http://www.kmjn.org/notes/prolog_lost_steam.html


I can't understand how you can be happy using ~84,000,000 clock cycles to solve that.

Logic programming is a wonderful tool to know about, and armed with that knowledge you can adapt your programming style to the problem in question. http://norvig.com/sudoku.html


I love that Norvig post; it's what inspired me to get the finite domain functionality into core.logic. I don't really understand your point, though. Why solve finite domain problems by hand if a library does it for you? The Sudoku problem can be solved in many fewer lines of code than Norvig's (http://norvig.com/sudopy.shtml) with a library, and it's just as efficient if not more so: http://gist.github.com/swannodette/5736688


Look at my comment linking to the Z3 sudoku solver: https://news.ycombinator.com/item?id=5846589


The beauty of logic programming is that you can basically apply your problem solving skills very directly.

However, some parts of, say, Prolog are a little unusual. The typical way of writing OR as multiple clauses directly violates DRY, so I usually use the alternative if-then-else notation (->). I also use logical loops (basically foreach)... at the end of the day it's kind of a paradigm mix (arithmetic and write side effects, too).

Grammars and constraints are pretty fun, though. It's a very good tool to have in your toolbox. Prolog is certainly the language that expanded my mind the most, after having worked with Java/Python before.


"The attractiveness of logic programming, when it was first launched, was that it had a ready-made declarative reading, which was expected to make a big difference for programming. Unfortunately, Prolog also had a bad procedural reading that people struggled with. So, in practice, Prolog programmers spent almost all of their effort on the procedural meaning, and the declarative meaning went out of the window. In the end, Prolog was a good dream, but a bad reality. Coming to classical logic per se, I believe it is ill-fit for describing computer programs or "processes" as you call them. First of all, classical logic doesn't have any well- developed notion of time or dynamics, and it has a nasty existential quantifier which assumes that all things exist once and for all. In computer systems, new things come into being all the time, well beyond the time of the original development or deployment. We know how to write computer programs that deal with that. Classical logic doesn't. (LEJ Brouwer made this criticism a long time ago in the context of mathematics. There it might have been just a philosophical problem. But in Computer Science, it is a practical problem.)

I believe logicians of philosophical inclination are prone to be enamored with what they have and lose sight of what they don't have. For a good part of two millennia, they kept debating Aristotelian syllogisms, without realizing that classical logic was yet to be discovered. Finally, it was left to the mathematicians to formulate classical logic. The logicians of today are similarly enamored with classical logic without much of an understanding of what it lacks. We would be ill-advised to listen to them. Or, we would be stuck for another two millennia, God forbid. [By the way, the Wikipedia page on Classical Logic is in a pitiful state. I hope somebody will pay attention to it.]

Brilliant new things are happening in Logic.

- Mathematicians have formulated Toposes (a generalization of Kripke models), which give us a great new variety of models for intuitionistic logic. There are deep mathematical facts buried in them that continue to be discovered. Toposes and intuitionistic logic are appropriate for modeling computer programs, which live in a growing, dynamic world rather than a static one.

- Girard has formulated Linear Logic, which broadens our idea of what kind of "things" a logic can talk about. David Pym and Peter O'Hearn invented Bunched Implication Logic, extending Linear Logic with a beautiful model-theoretic basis. These logics applied to imperative programming (which go by the name of "Separation Logic") are revolutionizing the development of technology for imperative programs.

It is time to leave classical logic behind. In fact, we should have done it a long time ago."

- Prof Uday Reddy on the types mailing list.

http://lists.seas.upenn.edu/pipermail/types-list/2013/001684...


The rest of that discussion thread is fairly interesting as well: http://www.cs.utexas.edu/users/vl/tag/declarative

Fwiw, answer-set programming is a logic-programming language with a Prolog-derived syntax that has only a declarative reading, and no procedural reading.


Dataflow programming is underrated.

A few years ago, I had to help our EEs write some testing software in LabVIEW. The first thing that shocked me was that the system was immune to bad data; the second was how elegantly it utilized multiprocessing and multithreading hardware.

I wish researchers would work on graphical, signal-based, synchronous languages like this. LabVIEW gracefully solves some of the biggest issues with functional and object-oriented languages. Why hasn't more work been done in this area?

Has anyone else had a similar experience?


I think it has a lot to offer also. Came across this recently: http://bergie.iki.fi/blog/noflo-two-years/

Check out also the original FBP (flow-based programming) the article refers to. It seems they had a lot of success with it on the enterprise server side. Seems like it could be ideal for web back-end stuff.


Having taught logic programming a few times, I have come to the conclusion that it is a worthwhile thing, but it should not be thought of primarily as a programming paradigm. Rather, it is a technique that a programmer should be able to make use of within the framework of a larger program.

An analogy: a good programmer should be able to write a state machine. And these are occasionally useful. However, just because a state machine plus a large bidirectional sequential-access data store can do any computation whatsoever[1], does not mean that it is a good idea to write all our programs this way.

Similarly, logic programming and the associated unification operation make up a Turing-complete system, but ... so what? I think that logic programming ought to be available as a library in every programming language. But I don't see the point in making it the only functionality there is.
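For what it's worth, that library-first approach already exists in a few languages. One example is Python's third-party kanren package (a miniKanren port, formerly logpy), which gives you goals and logic variables inside ordinary Python:

    # Logic programming as a library rather than a whole language.
    # Assumes the third-party kanren package is installed.
    from kanren import run, var, membero

    x = var()
    # Ask: which x is a member of both sequences?
    print(run(0, x, membero(x, (1, 2, 3)), membero(x, (2, 3, 4))))
    # -> (2, 3)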

And, of course, core.logic is an attempt at doing just this for Clojure. (And then there is Pyke...) However, as this article -- along with the article it references -- notes, we have not really figured out how to integrate logic programming with more mainstream programming methods. This, I think, is a problem well worth solving.

[1] Think "Turing Machine".


I don't know much about Clojure other than "it's Lispy, Hickey is a cool cat, built on top of the JVM".

Since it's built on the JVM, wouldn't it be easier to use one of the existing Prologs and their Java bridges/connectors?

SWI has JPL, GNU Prolog has something IIRC, and commercial ones like Sicstus also provide these (Jasper for Sicstus).


core.logic is extensible in a way that Prologs traditionally are not, and it integrates quite deeply with Clojure: for example, you can extend unification to your own data structures and stream "facts" from your own data sources.


'Logic variables' and unification provide a lot of value, even without predicate databases and backtracking/search. Peter Van Roy's book (Concepts, Techniques, and Models of Computer Programming) goes into some detail.



