
That readme is a massive intro that doesn't explain very much at all.

tl;dr: Define a model in XML, provide it a template script, and it'll output some text.

Sounds like XSLT? Yep. With a custom DSL taped on the side for scripting.
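For the unfamiliar, the whole thing is roughly this (a loose sketch adapted from the MOP intro linked downthread; file names, attributes and the animal model are made up):

    <?xml version = "1.0" ?>
    <!-- animals.xml: the "model" is just XML metadata.      -->
    <!-- The script attribute names the GSL template to run. -->
    <animals script = "render.gsl">
        <animal name = "cat" sound = "meow" />
        <animal name = "dog" sound = "woof" />
    </animals>

    .template 1
    .#  render.gsl: dot-lines are GSL script, plain lines are output
    .output "animals.txt"
    .for animal
    The $(name) says "$(sound)".
    .endfor

Run "gsl animals.xml" and you get animals.txt with one line per animal. That's the whole trick; everything else is scale.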

Also...

    Template-driven code generators that use symbolic insertion to inject
    meta-data into a template. This technology makes it much easier to
    write new ad-hoc templates. Typical examples are any technology that
    produces dynamic web page.

    Scripted code generators that use a high-level language to manipulate
    meta-data and then inject it into templates. This technology makes it
    much easier to write new ad-hoc code generators. Typical examples are
    XSLT, GSL and some other scripted code generation languages.

These are just arbitrary terms you're inventing.

Lots of template engines support the ability to manipulate the data before injecting it into a template.

Having a special, limited little DSL for scripting is an antipattern, not a good thing.

Lots of people use translation tools (CoffeeScript, TypeScript, Babel, Sass, etc.) that map from one language to another 'similar' language, even code, and it works well.

I think we can comfortably say that these days people are pretty happy with the idea of code generation, but I think few people will find anything in GSL that is not covered elsewhere, better.

Can you imagine modeling your TypeScript in XML so you can generate some JavaScript from it? Ugh.

Perhaps there's some use for this sort of stuff in protocol serialization, but you have to do so much of the heavy lifting yourself that I'm not sure why you'd bother over an existing solution like protocol buffers.




See my links in my main reply to OP to get the big picture. As far as XSLT goes, did you mean the 1985-1995 work he did was "just a 1998 technology"? Or the other way around? ;)

http://download.imatix.com/mop/introduction.html

https://news.ycombinator.com/item?id=11558465

Note: I used XML and XSLT back then when they first came out. I had to write custom tech in Perl, etc. to get them to work, and speed was not in the description. While they were garbage, his methods produced the Xitami web server, which kicked all kinds of butt. Too bad he switched to XML-based tech, as that was my only gripe with it.


Look, this can be hard to grasp. In zproject we write models of our library APIs. From these we get stuff as exotic as bindings in Java, Python, Ruby, and Node, plus all the scripts to build them.
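For a flavour, an API model is a small XML file per class, something like this (invented class and method names, trimmed right down from the real .api files):

    <!-- api/myclass.api: a cut-down, hypothetical zproject API model -->
    <class name = "myclass">
        Short description of what the class does.
        <constructor>
            Create a new myclass instance.
        </constructor>
        <destructor>
            Destroy a myclass instance.
        </destructor>
        <method name = "save">
            Save state to a file, return 0 on success.
            <argument name = "filename" type = "string" />
            <return type = "integer" />
        </method>
    </class>

Each language backend walks that same tree and emits its own binding code plus the build scripts for it.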


How does the system automatically manage the differences between the various languages when generating bindings?


Think of it as a generic templating library/system. It has no knowledge of the output or of the differences. The benefit is that you write one "unified" model, or some sort of input, and then write the translation to different languages (if you want, it can be just one), and have it generate everything for you automatically based on the rules you coded into the translation logic.
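As a sketch of what those rules can look like in a GSL backend (purely illustrative; real backends such as zproject's handle far more cases than this):

    .#  Illustrative only: decorate the model with Java type names,
    .#  then emit one line per argument using the decoration.
    .for class
    .   for method
    .       for argument
    .           if argument.type = "string"
    .               argument.java = "String"
    .           elsif argument.type = "integer"
    .               argument.java = "int"
    .           else
    .               argument.java = "Object"
    .           endif
    In Java, $(class.name).$(method.name) takes $(argument.java) $(argument.name)
    .       endfor
    .   endfor
    .endfor

A real backend does the same kind of thing for naming conventions, memory ownership, error handling and so on, once per target language.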


You write your own backends. These can range from trivial to very complex. See zproto or zproject for examples.
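A trivial one really can be a handful of lines, something in this spirit (invented names; the real zproject backends are the serious versions):

    .#  Invented example of a trivial backend: one Python stub per class.
    .for class
    .   output "$(class.name)_stub.py"
    # Generated from the model -- edit the model, not this file.
    class $(class.name):
        pass
    .endfor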


That's really the key question here.

As I understand it, you do it by hand.


"I'm not sure why you'd bother over an existing solution like protocol buffers."

For context, protocol buffers did not exist when GSL was devised in 1995 (3 years before Google was founded). So in fact GSL _was_ the existing solution when Google came up with protocol buffers.

That GSL is still going strong 21 years after inception is a pretty solid testament to its utility.


Protobufs generate code for one model. GSL lets you build infinite generators for infinite models. It's like comparing an apple with a fruit farm.


They exist now.

No one is denying this is a tool, and a useful one at that... I'm just not sure why you'd use it now, given the alternatives that do now exist.



