Hacker News
Transducers.js: A JavaScript Library for Transformation of Data (jlongster.com)
139 points by jlongster on Sept 18, 2014 | hide | past | favorite | 20 comments



The idea of declaratively describing transformations of data seems very exciting to me. I haven't experienced anything that lives up to the excitement though.

I feel a bit underwhelmed by these examples. It seems a bit too imperative, an enhancement of underscore. I want the declarations of how data representations relate to each other to be separate from the "how" of transforming from one format to another. Probably something inspired by lenses (1). "Lenses are the XPath of _everything_." (2).

(1) - https://www.fpcomplete.com/school/to-infinity-and-beyond/pic...

(2) - https://twitter.com/steveklabnik/status/463748643141349376

edit: Relevant HN comment about using Lenses for reaching into data objects: https://news.ycombinator.com/item?id=7704504


Just glancing quickly over it, it's difficult to draw big conclusions, but while the concept certainly feels promising, I'm not sure about the API: `transduce` requires 4 parameters, and it feels like you really need to know what happens inside to make use of it (that append function feels especially off). But maybe it's just me not being able to switch to the functional way of thinking, and it's simply a matter of spending some time with it (and doing some serious work with it).


It is very functional, so it will feel a little different if you aren't used to it.

You will rarely actually use `transduce`, though, just like you probably rarely use `reduce`. You use the other two functions, `into` and `sequence`, a lot more, and you don't really have to understand what's going on underneath (even if it is pretty simple):

    sequence(compose(map(x => x + 1),
                     filter(x => x < 10),
                     take(10)),
             [1, 2, 3]);
It is a little different than standard imperative code, but I think you will find that it's more expressive.
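For a sense of what's happening underneath, here's a minimal standalone sketch of the transducer plumbing (not the library's actual internals; all names here are illustrative): each transformer wraps a reducing step function, and `into` just seeds a reduce with a destination collection and an append step.

```javascript
// A transducer is a function from step function to step function.
const map = f => step => (acc, x) => step(acc, f(x));
const filter = pred => step => (acc, x) => (pred(x) ? step(acc, x) : acc);

// Right-to-left function composition; for transducers this means the
// element flows through map first, then filter.
const compose = (...fns) => x => fns.reduceRight((v, fn) => fn(v), x);

// `into` seeds the reduction with an existing collection and an append step.
const pushStep = (arr, x) => { arr.push(x); return arr; };
const into = (dest, xform, coll) => coll.reduce(xform(pushStep), dest);

into([], compose(map(x => x + 1), filter(x => x < 4)), [1, 2, 3, 4]);
// → [2, 3]
```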


Somebody more knowledgeable can correct me if I'm wrong, but taking the append function out of the reduce logic is intentional. It specifies how the results are put back together (in this case, for arrays/Vectors, append is used) in a data-structure-agnostic way.
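That's my understanding too. A standalone sketch of the idea (illustrative names, not the library's API): the same transformer, reduced with different "append" step functions, builds different output structures.

```javascript
// A transducer knows nothing about the output collection; only the
// step function it wraps does.
const mapT = f => step => (acc, x) => step(acc, f(x));
const double = mapT(x => x * 2);

// Three append steps targeting three different structures.
const appendArray = (arr, x) => { arr.push(x); return arr; };
const appendSet = (set, x) => set.add(x);
const appendString = (s, x) => s + x;

const src = [1, 2, 2, 3];
src.reduce(double(appendArray), []);      // → [2, 4, 4, 6]
src.reduce(double(appendSet), new Set()); // → Set {2, 4, 6}
src.reduce(double(appendString), "");     // → "2446"
```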


Reading that article inspired me to write a little generic recursion library. The idea is that you'd write a 'map' function for each recursive data structure and then generate recursive functions from non-recursive functions using 'induction' and 'coinduction'.

    // induction : ((t (Fix t), Fix t -> r) -> t r, t r -> r) -> Fix t -> r
    function induction(map, f) {
        // g : Fix t -> r
        function g(x) {
            return f(map(x.unfold(), g));
        }
        return g;
    }

    // coinduction : ((t s, s -> Fix t) -> t (Fix t), s -> t s) -> s -> Fix t
    function coinduction(map, f) {
        // g : s -> Fix t
        function g(s) {
            return {
                unfold: function() {
                    return map(f(s), g);
                }
            };
        }
        return g;
    }

    // fold : t (Fix t) -> Fix t
    function fold(w) {
        return {
            unfold: function() { return w; }
        };
    }


For some reason (maybe environmental factors), I understood this article much better than Rich Hickey's initial post. However, I suppose I should now go back and re-read it.

Thanks for the great library and explanation.



Practical examples go a long, long way when showcasing libraries such as this.


I think the name "transducers" is a little unfortunate only in that it sounds very foreign to most coders, leading the concept to potentially be ignored. Don't let the name "transducers" or the relation to Clojure scare you off; you don't have to know either thing to understand how transducers can be useful.


I really hope programmers don't refuse to give something even a passing glance just because it sounds foreign. New frameworks are going to do things in radically different ways, and we need to be willing to learn.

That said, I easily get turned off by something if it isn't obvious within 10 minutes how it works. This really isn't that complex. I think people who read my post can overcome the weird word once they see how simple it is.

There's a large group of people for whom it helps to have the Clojure connection; I'll keep posting practical examples of how it works and simply teaching. A name is not really that big of a deal.


Agreed, and kudos for writing so much, so well, about these new paradigms.


The name is not at all unfortunate.

But pandering to anti-intellectualism is extremely unfortunate. Irrational fear of new terminology is a major stumbling block to progress. There is a wealth of new things to discover that will require new modes of thinking, new concepts, and new terminology.

If devs wish to remain stagnant in their knowledge and their practice, that's fine. But in doing so, there's no reason why they should offer their opinions on matters they've chosen to ignore. I hope they'd have the intellectual humility to simply listen, rather than contribute noise to the discussion.


To anyone with hardware design experience transducers are hardware sensors. That meaning has been around for >60 years now.

So a lot of people are going to see 'transducers.js' and think this is a library for interfacing I2S or 1-wire sensors to Node, or something.

That's not anti-intellectualism, it's pointing out that the word already has an established meaning among engineers, and trying to redefine it is going to mislead a lot of people about what this library is trying to do.


I'd hope that the same community that takes "concatenation" and "list comprehension" and "mutability" to be household words would be able to get over some jargon weirdness so long as it's explained adequately.


I'd also add homoiconicity and referential transparency to those :P


Don't forget idempotent!


isomorphic -- the easiest of the bunch but no less foreign


I find the name "reduce" unfortunate, especially since the classic example,

    (defn sum [xs] (reduce + 0 xs))
(or whatever) only reinforces the naive assumption that it makes the thing smaller. It was only after playing with dozens of problems on 4clojure (a great site), that I grokked how reduce can make the input much bigger, or anything you want.
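A quick JavaScript illustration of that point (`repeatEach` is an illustrative name): reduce can just as easily grow the input as shrink it.

```javascript
// reduce that produces an output twice the size of its input:
// each element is appended twice to the accumulator.
const repeatEach = xs =>
    xs.reduce((acc, x) => acc.concat([x, x]), []);

repeatEach([1, 2, 3]); // → [1, 1, 2, 2, 3, 3]
```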

So in that respect, I think "transduce" better fits the concept of transformation, which is what this is all about.


Actually, I think it is unfortunate for a completely different reason: transducers, particularly finite-state transducers, already have a long and rich history in both applied and theoretical computer science, and are currently one of the fundamental building blocks of most NLP and ASR systems in industry and academia.

see http://www.openfst.org and many others.

Tacking on some random new etymology ("transducer is just a combination of transform and reduce") is going to lead to all sorts of unnecessary confusion, made worse by the inherent similarities...


I got excited to read this because of the name since earlier today I decided to dig into the general field of signal processing[1] and the first thing they talk about is transducers. With 2 minutes of transducer knowledge under my belt I think I'm fully qualified to say it seems a fitting name for this library.

[1] http://en.wikipedia.org/wiki/Signal_processing



