Who killed Prolog? (vanemden.wordpress.com)
93 points by adamo on Aug 31, 2010 | hide | past | favorite | 55 comments



Prolog isn't dead, it just doesn't seem to get a whole lot of attention these days. I use SWI-Prolog for prototyping and exploratory programming, and I didn't need a shovel and a lantern to install it.

I don't think Prolog is practical as a fully standalone language, though. Like SQL, it has a model that strongly skews it towards certain kinds of problems, and which makes I/O and side effects awkward. I think it would be best to have a Prolog-like language as a library (like Lua or SQLite), to embed it for rules & scripting. (Writing one is on my project TODO list. I've been reading quite a bit about Prolog implementation.)

To me, the single most interesting aspect of Prolog is computing with partial information via logic variables: variables that are constant, but don't necessarily have to be bound immediately. Conceptually, the variables always have some specific value; you just don't know what it is until they're bound. You can pass around structures with holes in them to use as templates in pattern matching, append lists by binding the "hole" at the end of one list to another's head ("difference lists"), etc., while still keeping the advantages of immutability. Most languages descended from Prolog (Mercury, Erlang) drop logic variables, though Oz keeps them.
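To make that concrete, here's a minimal sketch (in Python rather than Prolog, and with names I made up) of how logic variables and unification behave: an unbound variable stands for "some value we don't know yet", and unifying a template against a term fills in the holes.

```python
class Var:
    """A logic variable: a constant whose value isn't known yet."""
    pass

def walk(term, subst):
    """Chase bindings to see what a term currently stands for."""
    while isinstance(term, Var) and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst):
    """Return an extended substitution if a and b unify, else None."""
    a, b = walk(a, subst), walk(b, subst)
    if isinstance(a, Var):
        return subst if a is b else {**subst, a: b}
    if isinstance(b, Var):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return subst if a == b else None

# A structure with a "hole", used as a template in pattern matching:
hole = Var()
s = unify(("point", 1, hole), ("point", 1, 2), {})
print(walk(hole, s))  # prints 2: the hole was filled by unification
```

Note the substitution is extended, never mutated, so the "structure with holes" itself stays immutable; a real Prolog does the same trick with a trail so bindings can be undone on backtracking.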

Another kind of computing with partial information is constraint programming, which makes Prolog much, much less dumb. Generate-and-test is great for prototyping, but quickly shows its limits with combinatorial problems - where constraint programming excels. Most major Prolog implementations come with constraint programming extensions, and they address many of its weak points.

There's also Erlang. Erlang seems like a very modern, practical evolution of the declarative + concurrent direction that some logic programming research was going. If you remove backtracking (which makes handling concurrency far more difficult), then I/O also becomes feasible. Erlang keeps much of Prolog's declarative-ness, and adds several powerful features of its own.


Re: Prolog implementation, in case you haven't already found it, the book Warren's Abstract Machine: A Tutorial Reconstruction, by Hassan Aït-Kaci, is pure gold for understanding the WAM.

http://web.archive.org/web/20030213072337/http://www.vanx.or...


Yes - I have a printed copy of that, actually. It's not the end of Prolog evolution, but it's a good snapshot of common implementation techniques, and it definitely clarified things.

Here's a (rather snarky) LtU comment by Peter Van Roy pointing to other designs improving on the WAM - http://lambda-the-ultimate.org/classic/message1618.html#1108...



  > I think it would be best to have a Prolog-like language 
  > as a library (like Lua or SQLite), to embed it for rules 
  > & scripting. 
You may be interested to know that there are various Prolog modules on CPAN, particularly AI::Prolog and Language::Prolog::Yaswi .


Perl isn't really my thing*, but I'll take a look at how they're implemented. I was thinking of a C library with a C API that would make it easy to use from Lua, Python, Ruby, etc. Yaswi is probably just a wrapper to SWI-Prolog. There are a few of those, and I'm tempted to write one for Lua.

I think designing a Prolog-like language primarily for embedding, rather than wrapping a standalone Prolog, will have an interesting effect on the design though - Lua, for example, benefits tremendously from being able to leave certain issues to C. An embedded logic language could likewise focus on what it does best.

* Somebody is probably going to pop out of the woodwork and say that my impression of Perl is based on Perl 10 years ago. They're right, as that's roughly when I switched to Python. It'd be tough to convince me to switch back, though. I'm fine with awk for "tiny Perl script" hacks, and prefer other languages for larger stuff.


Check out js-prolog: http://ioctl.org/logic/jsprolog-history

It's a quick-and-dirty implementation of a Prolog interpreter, but it's a great resource for prototyping. JavaScript ports very well to Lua, so you should be able to port the code in a busy afternoon.

I made a couple of suggestions to increase the speed of the interpretation, but I'm not sure Jan ever integrated them into the code. The big one was using a trail to destructively modify environments rather than recreating them on choice points; WAM and tabling would be great, but it's a non-trivial piece of work following Warren's tech report, even with the help of the (very good) tutorial.


Thanks! That looks fun.

I'm planning on doing a C VM-based implementation with tabling*. I'm not sure about coroutines, and the VM will not necessarily follow the WAM design verbatim - Peter Van Roy has several suggestions for improvements in his thesis.

* Prolog-ese for memoization
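Since tabling is basically memoization of goals, a toy analogy in Python (not a real tabled-resolution engine, just the caching idea): each "goal" is solved once and its answer reused from a table.

```python
from functools import lru_cache

def fib_naive(n):
    # Exponential: re-derives the same subgoals over and over,
    # like untabled Prolog re-proving the same goal on every branch.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_tabled(n):
    # Each subgoal's answer is recorded and reused, the way a tabled
    # predicate records answers in an answer table.
    return n if n < 2 else fib_tabled(n - 1) + fib_tabled(n - 2)

print(fib_tabled(50))  # prints 12586269025 instantly; fib_naive(50) would crawl
```

Real tabling in Prolog also has to handle variant goals and loops (so left-recursive predicates terminate), which is where most of the implementation work goes.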


The nicest embeddable Prologs I've found are, for whatever reason, in Java. I use Tuprolog (http://alice.unibo.it/xwiki/bin/view/Tuprolog/) a bit, which embeds a Prolog interpreter pretty cleanly. There's also a compiler, PrologCafe (http://kaminari.scitec.kobe-u.ac.jp/PrologCafe/), though I find it a little harder to use seamlessly.

I agree that something as easy to embed from C or languages with C-embedding hooks would be nice.


It's odd that there isn't a C one, really. I've done very little with Java and the JVM, but hopefully those will prove helpful to someone else.

I don't have a time frame for a C-embedded Prolog (I've already got several other projects I'm trying to finish), but it's high on my list.


I saw this one pass by a few weeks ago, it may interest you:

http://yieldprolog.sourceforge.net/

"Yield Prolog lets you embed Prolog programs directly in Python, C# [1] or Javascript [2] by using the yield keyword."


If it's implemented in C with a simple API, it should be easy to use from anything that has a C FFI. I'd rather not make it too language-specific - there's a huge duplication of effort as new languages appear.

Lua and SQLite are both great examples of libraries whose APIs make them straightforward to embed in other languages.


About Prolog-as-a-library, you might like to see how it was done for Lisp at http://norvig.com/paip/README.html

I guess for C the API would need to be very different like Lua's C API, but that's life.


I've got that book, too. :) (And I recommend it to people all the time...)

I'm thinking something like Lua's C API, both because I know it well and because its design does a particularly good job of bridging C and a dynamically-typed language.


What people wanted Prolog to do, it couldn't do. People wanted to step away from the control-flow specification of imperative languages and the message-passing of OOP, and let the compiler solve your problem for you by feeding it a specification of the constraints the completed system would satisfy. Prolog did not, and cannot, do this, because it is not strong AGI. When people realized this, they gave up on it, because it is easier to tell the computer how it should branch than to forcefully guide stupid heuristics around.

When there exists a large, universal library of "common sense", of the kind people build up from years of childhood sensory experience, declarative programming will come back. But not until then.


Very much true. During my education I was massively disappointed in Prolog, because the programming process was something like this:

1) write it declaratively

2) figure out that your program will take days to complete

3) add ugly procedural hacks (cuts) to make it more efficient

To an extent, you have this process in any language (write->measure->optimize), but typically not in a way that forces you at gunpoint to violate the paradigm you're working in.


An excellent point - Prolog fans (usually those who never actually programmed anything) would describe it as a declarative language.

However, any "real" Prolog program that I ever saw was really using it as a slightly odd procedural language.


It doesn't fail on all occasions. Our research group wrote a robust natural language parsing system in Prolog (with some C extensions). The (aptly named) unification grammar, the lexicon, and most of the productive lexicon are written in a declarative manner.

This really paid off in a new research project, whose goal is to write a sentence realizer for the same system. With nearly no modifications, we could reuse the grammar, lexicon, and productive lexicon. Both the parser and the sentence realizer use the same grammar and lexicon now.

I understand that this may seem somewhat trivial, as it may seem that the lexicon and grammar are plain data. However, this is not true:

- The grammar is written in a declarative manner, where goals are mostly operators that manipulate attribute-value structures. These rules are later compiled to plain Prolog terms via term expansion (DCG-like) for efficiency.

- You don't want to perform some unifications immediately, even when two terms are unified. Most Prolog implementations offer blocked goals, where a goal is blocked until a variable becomes instantiated.

However, I am the first to admit that Prolog makes some classes of problems trivial (unification grammars, parsers). There are also many things that you do not want to do in Prolog, because it is a waste of time, or very inefficient. For instance, in our system the following components are implemented in C or C++:

* Finite state automata for quick lookup of subcategorization frames.

* Part-of-speech tagger for restricting the number of frames for each word before parsing.

* N-gram models that are used as a feature in fluency estimation.

* Tokenization transducer.

* Bit arrays (comparable to Bloom filters) for excluding useless paths in parsing.

Conclusion: use the right tool for the job. Unification, structure sharing, and (some) pattern matching are cheap and easy to use in (WAM) Prolog. Most other things are prohibitively expensive and clumsy in Prolog.


Most programmers have a hard time learning to think in the Prolog way. That's why there is so much bad Prolog code around. But the same can be said of Lisp code.


Several Prologs have extensions for constraint programming, which give it much smarter heuristics. Instead of trying every possible combination of choices and failing/backtracking as early as possible (generate-and-test), it can use constraints to narrow the intervals / sets of possible choices - which usually sets off a chain reaction and further constrains the search space.

Once constraint propagation can't narrow things any further, it can copy the search space, use various search heuristics (such as splitting each copy of the narrowest interval in half), and see if that triggers further constraint propagation (or hits a dead end). It usually greatly reduces the amount of depth-first search. There are other ways of implementing constraints besides propagator networks and space copying, but that's the way I understand best.
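A rough Python sketch of the propagate-to-fixpoint step described above, on a toy problem of my own invention (X + Y == 7 with X < Y, domains as finite sets); a real solver would layer the copy-and-split search on top of this loop:

```python
def propagate(dx, dy):
    """Narrow both domains until nothing changes (a fixpoint)."""
    while True:
        # Keep only values that still have a support in the other domain.
        ndx = {x for x in dx if any(x + y == 7 and x < y for y in dy)}
        ndy = {y for y in dy if any(x + y == 7 and x < y for x in dx)}
        if (ndx, ndy) == (dx, dy):
            return dx, dy
        dx, dy = ndx, ndy

dx, dy = propagate(set(range(10)), set(range(10)))
print(sorted(dx), sorted(dy))  # prints [0, 1, 2, 3] [4, 5, 6, 7]
```

Propagation alone cut each domain from ten values to four without any backtracking; generate-and-test would have enumerated up to 100 pairs. Only after propagation stalls would a solver split a domain and recurse on each copy.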

Generate-and-test is great for prototyping, but doesn't scale up to larger problems. Constraint programming fares much better.


You mean something like Cyc or Open Cyc? http://www.cyc.com/opencyc/

I am not sure that I fully agree. Even without that, Prolog is very useful for many problems.


Cyc failed because it succeeded.

Doug Lenat was able to run a very good business (employing about 50 people for 10+ years) on roughly 50% government contracts and 50% sales to big companies that wanted better KB management tools than his competitors had.

Cyc didn't need to change the world in order to succeed, so it didn't change the world.

That said, the fundamental trouble in NLP is the lack of common-sense knowledge. Picking the proper word sense of "pen" in "the pig is in the pen" vs. "the ink is in the pen" is a matter of semantics, not syntax. You can do Noam Chomsky stuff until you're blue in the face and it will get you nowhere... But then some stupid Markov chain comes along that conflates syntax and semantics and beats it.


I'm not sure I'm convinced by the thesis that Prolog was killed by association with a Japanese project that was a failure. I thought Prolog died a natural death, for several reasons:

Firstly, Prolog was great at the deductive, expert-system type of AI, but in the '90s, a new generation of AI based on statistics (firstly fuzzy logic, and then proper statistical reasoning, particularly Bayesian approaches) appeared and showed that truth and failure just don't cut it any more.

Secondly, I don't know if anyone has done the research, but I wouldn't be surprised if Blub programmers are just better at thinking imperatively; Prolog tends to kind of warp the brain away from that, which would put off Blub programmers, but it doesn't provide the kind of killer power that makes expert programmers want to use it either.

Thirdly, Prolog is hard to optimise well, and hard to predict the performance characteristics of. In the '80s and '90s, when apps were getting written in C, this was a killer.


"Firstly, Prolog was great at the deductive, expert-system type of AI, but in the '90s, a new generation of AI based on statistics (firstly fuzzy logic, and then proper statistical reasoning, particularly Bayesian approaches) appeared and showed that truth and failure just don't cut it any more."

There are attempts to reconcile the two approaches, such as Markov Logic.

http://en.wikipedia.org/wiki/Markov_logic_network

I believe Statistical Relational Learning is the more general term for this idea.

http://en.wikipedia.org/wiki/Statistical_relational_learning


That was a fun read, especially the history of Admiral Inman setting up MCC. He was on the board of directors for my company and for years I had to pitch to him and a small committee to get IR&D funds. I also got a brief tour of MCC from him when it was getting set up. He was in a really good mood because he had just arranged the hire of Douglas Lenat.

As mentioned in the article, Feigenbaum's and McCorduck's book on the Japanese 5th generation project had caused quite a stir in the USA, and I think that it made it easier for me personally to get IR&D funding for anything AI-ish.

Perhaps off topic: I view Prolog as almost a scripting language, because most programs I wrote in Prolog were short and solved one little problem. The only long Prolog program I ever wrote was a quick one-week rewrite of a 5-week IR&D project, a prototype battlefield simulator that I had written in Lisp. I can't imagine writing a million-line system in Prolog.


> I view Prolog as almost a scripting language

Me too. I want to take this idea and run with it - it would probably work better as an embedded scripting language, like Lua or Tcl.


One of the first things that the FGCP did was discover that Prolog wasn't parallelizable the way they had hoped: very quickly they switched to a language called KL1 (very different from KL-ONE) that was parallelizable.

I'm amused at the general supposition that the FGCP was a failure. Yes, it failed to produce specific hardware that was commercially competitive, but the basic program of "applying parallelism to search and database tasks" behind it was the basis for Deep Blue, Google's web search and the infrastructure behind many a large scale system.


One missing bit, I think, is that the declarative-programming alternative that Prolog represented got somewhat incorporated, piecemeal and in non-logic form, into a lot of mainstream languages and frameworks, which reduced its uniqueness. For data-heavy applications, SQL gradually moved up the application stack from where it had once been, to where more and more "logic" was written declaratively in SQL, instead of it being a purely dumb butler of a language whose job was getting things out of cells for you. Parallel to that, SQL's also been gradually incorporating features from Prolog's descendent Datalog that push it towards a more full-featured language in which arbitrary logic can be encoded (recursive queries are one big 1990s addition).

Elsewhere, even in AI, production-rule systems gained popularity from the 1980s as a simplified, propositional form of declarative programming, which also lent itself more naturally to state maintenance (asserting/retracting facts, and updating what changes as a result, as the core interaction loop). The focus on expert systems in the 1980s, and the development of the efficient RETE algorithm in 1983, were probably factors in taking over an area that in the 1970s one would've done in Prolog, or hand-coded in Lisp. Even today lots of this kind of "logic" stuff gets done in Jess or Drools, especially for business logic, which is something of a success for declarative programming, though not for full-on logic programming.

I think a bunch of this kind of thing conspired to make Prolog not nearly as exciting even by the late 1980s as it was in the 70s. Academics moved on, and started focusing on other things. Datalog and other data-querying systems were a big focus for a while: if Prolog wasn't going to take over all programming, well, dammit, at least it was going to take over data querying, where it seemed like a clear win. That did have some impact (modern SQL cribs some features and even algorithms almost directly from Datalog), but it was more in the "influence" than "replace" sense.

Today much of the focus is on answer-set programming (http://en.wikipedia.org/wiki/Answer_set_programming), which targets neither large programs (like Prolog) nor large databases (like Datalog), but relatively small programs/databases with complex deduction and constraints (more in the style of solving combinatorial problems). That might be seen as something of a retreat, towards using logic programming for things that were more traditionally done in logic anyway, rather than as a general programming paradigm. There are some logic-programming-in-the-large descendents, like Mercury (http://en.wikipedia.org/wiki/Mercury_(programming_language)), but AFAIK their communities have always been small. It seems they're a little too hybrid to appeal to theorists, a little too academic-sounding to appeal to real-world types, live in Prolog's shadow, and lack a killer app.

One interesting angle is a very recent trend of pushing some declarative-programming ideas directly into mainstream programming languages. LINQ in C# is probably the most interesting one. It's billed as adding something like SQL into C#, but the way it's used as an actual core programming construct (not just a data-retrieval one), letting you write application logic in a declarative rather than either functional or imperative style, has a very logic-programming flavor. This route is, imo, the most promising angle for getting anything like logic programming used to build real, large-scale apps: the possibility that a successor to LINQ will add features that move it closer to logic programming seems much larger than the possibility that a Prolog descendent will break through.

[Note of course that the above is all my personal take. I'm a grad student using ASP and Prolog as tools in my thesis, so I clearly think they have merit, though I can also see why, when confronted with "hey, we can write this all in Prolog!", the world's answer might have been, "maybe we can just take some of the cool parts of Prolog instead?" If anyone's qualified to answer the "why did Prolog die?" question definitively, it sure ain't me. But I do think that some of the above factors played a role.]


Prolog is not dead, it just has a niche following. Similar to Lisp and Smalltalk, two other languages mentioned in the article. Prolog suffers even more because it requires programmers to think differently -- Lisp and Smalltalk programs can still be written in a procedural style.

Prolog offers a lot of the same things that Lisp does: symbolic processing and meta-programming, among others. One of the tricks of Lisp programmers is to implement Prolog in Lisp and use it in the areas where it is strong. Just read "Paradigms of Artificial Intelligence Programming" to see how this is done.


make (and its many variants) is probably the closest thing to a logic programming language most people have used. Like Prolog, some programmers just don't (and won't) ever learn to think natively in its model of computation. Trying to force make to act like a procedural language leads to convoluted makefiles.


Or the "Reasoned Schemer". Unfortunately, most such implementations are slow compared to Prolog AMs.


Agreed, that's why I think it makes sense to use the real thing. But commercial Lisp compilers have Prolog implementations that I suspect are better than the ones appearing on text books.


Not to nitpick overmuch, but if functional, object-oriented and logic are the other styles of programming language, then surely the correct adjective to describe Fortran is procedural? Object-oriented languages are also generally imperative—the contrasting term is declarative, which purely functional and logic programming languages are (to some extent).

Interestingly, given the conclusions reached in the article, both Sterling and Shapiro's The Art of Prolog and Lloyd's Foundations of Logic Programming mention the Japanese fifth generation project as an example of Prolog's impending breakthrough into the mainstream.


surely the correct adjective to describe Fortran is procedural

No, the original FORTRAN that the author is referring to wasn't procedural. FORTRAN II, a few years later, added subroutines to the language.


There is no real restriction that an object-oriented language be imperative and, Law of Demeter aside, I think it's been sort of agreed that always returning a value as a result of any given method invocation makes OO languages much nicer (even if side effects take place as well). However, procedural implies imperative, I believe.

In any case, I always thought language classifications could get a little fuzzy.


I spent the first few years of my professional life using Prolog to implement a statute law expert system shell. In the end we shifted away from it for performance reasons - it just couldn't compete with C/C++.


I doubt whether this is the actual reason that Prolog was killed off. AFAIK during the 90s there was still research going on with Prolog and parallelization. However, this seems to have been killed off during the end of the 90s because of lack of results.

IMO the thing that killed Prolog is that there has never been a good way to do parallelization without having to massively rewrite Prolog programs.


There's still some active research in Parallel Prolog implementation. See this fabulous survey paper of the implementation approaches: http://portal.acm.org/citation.cfm?id=504083.504085 Gupta, G., Pontelli, E., Ali, K. A., Carlsson, M., and Hermenegildo, M. V. 2001. Parallel execution of prolog programs: a survey. ACM Trans. Program. Lang. Syst. 23, 4 (Jul. 2001), 472-602.

Hermenegildo, in particular, continues active research in this area and is quite visible and active in the broad programming languages community.


Interesting read. I've only just become aware of the logic class of programming languages: imperative (1956, Fortran), functional (1959, Lisp), object-oriented (1972, Smalltalk), logic (1974, Prolog). Some Erlang syntax is borrowed from Prolog, so is Erlang a logic programming language or functional (I always thought Erlang was functional)? What other languages are in the logic class?


Erlang is a functional language (among other paradigms) but not a logic language. Logic languages are based on relations instead of functions. Datalog and Oz and Mercury are other examples of logic languages.


Erlang is a mixed-paradigm language, which borrows from languages like Miranda and Prolog. Initially it was written using Prolog syntax (relations), and its VM was implemented in Prolog too. Later its Prolog syntax was adapted to the current functional style (i.e. functions).


Erlang is the only object-oriented language.


Clearly you're looking for someone to say

Uh - what?!

So I'll do so.


I think there's something to that, though - "Object-Oriented" has been used to mean many things.

Alan Kay's definition, more or less: "OOP to me means only messaging, local retention and protection and hiding of state-process, and extreme late-binding of all things. It can be done in Smalltalk and in LISP. There are possibly other systems in which this is possible, but I'm not aware of them."

Erlang has processes with local state communicating via messages, and can do late-binding with pattern matching and hot code loading. It just disposes of classes and other such baggage.


Ok, but there are plenty of things that aren't message passing in Erlang, whereas, say, in Smalltalk, pretty much everything goes through messages, no?


Sure, I'm just saying fogus is making a real point (even though "only" sounds like trolling).


Yeah, I've heard or read Joe Armstrong talking about how Erlang processes are like "objects", but his brief cryptic comment needed something more. And I still disagree with it, because there's tons of interaction in Erlang that is not message passing.


I'm not saying Erlang is 100% OOP. Not at all. Just the good parts. :)


Have you looked at Reia at all?


I've glanced at it, but a language that sells itself as a Ruby-like language for the Erlang VM doesn't appeal to me. I really like Erlang, but don't care for Ruby. (In the procedural/OO/scripting niche, I strongly prefer Lua.)

I'll check it out again, though. If nothing else, it's an example of a real compiler written in Erlang, which is interesting in itself.


I tried learning prolog. I had a class on it in college and it never really clicked. Then a tutoring gig fell into my lap a few years later for the same class and I ended up getting "it". And even then it wasn't worth any additional exploration. I learned python and pay my bills.


Prolog was dynamically typed, and in the '80s that usually meant slow execution (sans Orbit/T, I am told). What made C so damn popular was the sheer speed of the produced code. When you have 7-20 MHz computers, a slow language hurts.


Types in Prolog could be added as system predicates, and I think some extensions do this. I know there was a version called Turbo Prolog that generated compiled code. So Prolog can be compiled or not; it is just that interpreting is far easier to do.


There's a de-facto standard bytecode and virtual machine design for byte-compiled Prolog, known as the Warren Abstract Machine. There are links concerning it and derived designs here: http://news.ycombinator.com/item?id=1649138

Also, wamcc (http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.45.1...) compiles Prolog to C.


I wonder if, and to what extent, the existence of reasoning engines and frameworks such as Jena has contributed to the alleged Prolog murder.



