
have you considered offering it as a cloud platform? we're doing something along these lines, niche scientific software (biological modeling, bioinformatics) as a paid hosted service. still at the prototype stage! so I can't comment on how well the business model actually works yet lol

but the idea is our mathematician will be able to publish whatever novel math she develops, and we may eventually open source the math core as a reference impl, but we'll keep all the cluster management and other supporting infrastructure code proprietary. sort of a "if you want to run it on your desktop, go ahead! if you want to actually scale this up for big jobs, we've done all the legwork already so it's really in your best interests to just pay us." I think open source ideals are good and worthy but from a business perspective, you capture value by providing value that can't be got without you. relying on customer goodwill is particularly difficult because in any large org, the people who will feel goodwill toward you and the people who can authorize purchases are in two different departments

also fwiw I think if you wanted to do the model you described in the paper unchanged, gpl is a much better choice than mit. copyleft actually serves as a wonderful poison pill: you can try us out for free, but if you want to ship us, you need to pay for a proprietary license or legal will nail you to the wall. whereas mit, there's no stick. I've seen affero used by several projects for this express purpose: you have to buy a proprietary license because agpl is so onerous you just can't use the code for commercial purposes at all

interesting project btw, I love seeing stuff like this!


Thanks!

Yes, I've considered a cloud platform. There are several big difficulties with that.

First, data. It's easy to grab public data from PubChem, ChEMBL, and a few other projects, and make a service. But why would anyone pay for it given that PubChem, ChEMBL, ChemSpider, and others already provide free search services of that data?

There's search-as-improved-sales, like how Sigma-Aldrich lets people do a substructure search to find chemicals available for sale.

There's value-add data. eMolecules includes data from multiple vendors, to help those who want to purchase compounds more cheaply.

Or there's ZINC, which already provides search for their data.

So you can see there's plenty of competition for no-cost search. I don't have the ability to add significant new capabilities that people are willing to pay for.

Note also there's a non-trivial maintenance cost to keep the data sets up-to-date.

Second, the queries themselves may be proprietary. I talked with one of the eMolecules people. Pharmaceutical companies will block network access to public services to reduce the temptation for internal users to do a query using a potentially $1 billion molecular structure (or potentially $0 structure). eMolecules instead has NDAs with many pharmas which legally bind them. Managing these negotiations takes experience I don't have, and neither do I have the right contacts at those pharmas.

Sequences don't have quite the same connection between sequence and profit as molecules do.

BTW, part of the conclusion of my work is that people don't need a cluster for search - they can handle nearly all data sets on their laptop, so there shouldn't be a need to scale up any more. And small molecule data has a much smaller growth curve than sequence data, so Moore's Law is keeping up.

My first customer, who continues to be a customer, said outright that they would not buy if it were under GPL.

Since my paying customers are pharmaceutical companies who, as a near-rule, don't redistribute software, it doesn't really matter if they don't redistribute under MIT or don't redistribute under GPL.

I came into the project in part to see if FOSS could be self-supporting on its own. AGPL is often used as a stick to try to get people to use a commercial license - the implicit view of the two-license model is that FOSS is not sustainable. Which is now my conclusion, for this project and field.


not really into industry, but a) the pharma-companies using it are probably reluctant to give you their data and b) uni researchers are not overly fond of high-fee services and labor is cheap there.


honestly I think that's an artifact of how often he is imitated. having read more than my share of seed fund about pages in the past few months, there is a particular flavor you encounter over and over again that can only be described as "ah yes, I see you also have read zero to one." heavy emphasis on the transformative over the incremental, hard tech, disinterest in pedigree and traction given a shamanistic insistence that they are uniquely capable of identifying the visionaries of tomorrow, constant reminders they go against the grain and buck consensus, punchy copy talking about bold iconoclasts making the future of tomorrow yadda yadda

the three big aesthetic/intellectual strains of sv thought as I see them are truetype protagonist gleaming tech optimism a la early pmarca blog, folksy practicality a la pg, and dark horse contrarianism a la thiel. leonardo, donatello, raphael (turtles, not painters). but he did originate the kernel of it and can't really be faulted for the imitators

what actually makes him interesting tho is he's a disciple of rene girard, who argued among other things that the basis of socialization and social conflict is mimesis, which goes a long way to explaining his fixation on avoidance of imitation (and makes the raft of imitators that much funnier imo)


the way my partners and I look at this is as an infrastructure problem. our hope anyway is we can build tools and methods (better mathematical modeling, better bioinformatics software, faster/more accurate sequencing, eventually standardized gear for continuous process chemistry) that can serve startups of a bioengineering bent while also making it more practical for independent researchers to work outside academia. the biologist on our team is particularly interested in longevity research herself so partly we're trying to make the things that she'd need to do her own research 5-10 years down the line

I'm pessimistic about institutions but optimistic about people. I think the academy is an impediment to a lot of interesting and necessary projects, not just here but in general. risk-averse, prestige-obsessed, heavily bureaucratized. most of the institutions in our society are like this, hollowed out. there's a lot of basic research that needs to be done, so I think the highest leverage thing we can do is knock down barriers to more people being able to do it. route around the institutions


Hello Alice, off-topic but it's great to see you here.

I keep a binder at my desk with printed articles from the web that I find interesting enough to want to keep forever (and give to my children one day). In there is your take on 'playing to win' on the minecraft economy. Thanks for that.


hi! thanks for the kind words aha, always nice to hear people appreciate what I put out. trying to get a bit better than my current average of one post per year lol


>It's about social justice and corporate citizenship. Viewpoint heterogenity and other arguments of private benefit have always been secondary arguments.

it's more akin to a sales tactic

compare the shift in preeminence from "free software" to "open source." the pioneers make a moral/ideological argument why doing such-and-such thing is an imperative, get visibility and carve out a niche, but don't achieve mass adoption. bit later, the strivers show up and argue such-and-such actually just makes good business sense, and this proves interesting and palatable enough to get fully institutionalized

or, say, the shift from "gay liberation" to "gay rights." the argument in the 60s and 70s was that we're inherently subversive entities whose lifestyles by their very nature go against the status quo. by the 00s the dominant line is if only we had these and those legal rights, the final obstacles to our complete willing assimilation into the status quo!

as "diversity" continues its transition from ideal championed by believers to administrative function served by a professional class, it will continue to mutate in a way that benefits those who implement it


if they reintroduce user streams I'll be pleased, if they ease off the per-app user limit I'll be elated. it doesn't really mean anything if they promise such and such policies or initiatives otherwise we'd all be holding our breath for bluesky. what's important is what they actually end up doing, which I guess we'll find out next week


it's packaged for void, though I'm not sure what that entails. will say it works well enough I didn't find out about the snap mess until I tried to install it on another distro later


Debian has stricter rules regarding packaging and vendoring dependencies. Void and Arch package LXD in a similar fashion by separating out all the C dependencies. However, none of us actually separate out the different Go dependencies; they are all vendored inside the LXD package.

Debian separates all of these out into their own packages.


>I suspect lots of people go through the entire education system like this.

+1. it took me a couple years after getting kicked out of college to get my head sorted out to the point where I felt like I could "think" again

I think one of the most harmful things about schooling is the way it imposes a tracked structure on learning. it demarcates knowledge into discrete subjects and sets up a linear progression through them and says you need to master each step on the track before moving onto the next one. this is poisonous and borderline evil, and I've encountered many people who are crippled for life by it. a lot of people never pursue things they're really interested in and could become extremely passionate about because school has convinced them they need to stack up prerequisite knowledge before they're even allowed to touch it


Our school system is poisonous (I can speak for France), if not evil. It became crystal clear to me after reading Celine Alvarez. Not sure if she has been translated yet. In English, and older, there is also Alfie Kohn, but I haven't read him.

Reading Celine, one understands that children are natural-born learners, and no effort is needed to make them learn. Our school model is the industrial production of objects: thinking human machines. We are far more than that. Sadly, Pink Floyd's description of school still echoes in our modern schools. Some people don't feel that way about school. I don't really know why. Maybe they never imagined how much better it could have been, so they found it fine.


Thank you for the recommendation. Celine Alvarez does have a book translated to English:

The Natural Laws of Children: Why Children Thrive When We Understand How Their Brains Are Wired

> A powerful, neuroscience-based approach to revolutionize early childhood learning through natural creativity, strong human connections, spontaneous free play, and more.

She has a home page, with an English version of some of her articles.

https://www.celinealvarez.org/en/our-approach


Check out Célestin Freinet too (also untranslated into English AFAIK)


Reference appreciated - there is one translated book:

Freinet, C.: Education through work: a model for child centered learning; translated by John Sivell. Lewiston: Edwin Mellen Press, 1993. ISBN 0-7734-9303-4


> says you need to master each step on the track before moving onto the next one. this is poisonous and borderline evil

What’s wrong with it? You do need to understand calculus before classical mechanics, classical mechanics before quantum mechanics, quantum mechanics before quantum field theory, and quantum field theory before the Standard Model. I’ve seen tons of people disregard this and the result is always confused word salad. People waste years of their lives this way, going in circles without ever carefully building their understanding from the ground up. The order in school was chosen for a reason.


I suspect you and GP are talking about slightly different things. GP is probably more opposed to artificial compartmentalizing of things. As an example:

> You do need to understand calculus before classical mechanics

Yes, but how much calculus? Do you need all of Calc I, II and III before even attempting classical mechanics? And should calculus even be treated independently of classical mechanics?

There are various traditions when it comes to teaching these subjects, and the tradition in the US involves keeping a strict distinction between these things, in addition to a "theory first" approach. Other people have studied things in a different manner. Some of my physics professors from the UK had studied most of the math they knew only as needed when they would get to relevant topics in physics - including differential equations, all of analysis (complex or real), some of Calc III, etc.

Even amongst mathematicians, it was common in parts of Eastern Europe to focus on a problem, and learn whatever theory is needed to solve that problem. They didn't learn theory and apply to problems - they took a problem, and learned whatever theory is needed to solve them. I recall picking up a Kolmogorov textbook on analysis and being surprised by seeing this approach, along with the informality with which everything is discussed.

And just a minor quibble:

> classical mechanics before quantum mechanics,

You don't really need to know much except the basics. I think the classical mechanics we covered in our typical "Engineering Physics" courses was sufficient to dive into proper quantum mechanics. It's nice to have been exposed to Hamiltonians in classical physics prior to taking QM, but it's really not needed. There's a reason neither school I attended made the classical mechanics courses prereqs to QM. In fact, I would argue we should split things up a bit: Have a course to teach the very basics of energy, momentum, etc. Then make it a prerequisite to both classical mechanics and quantum mechanics.


>And should calculus even be treated independently of classical mechanics?

Thank you. The value of calculus didn't really click for me until the next year in physics, at which point I wished I had paid more attention.

It's really unfortunate that you first have to drill high-speed high-precision symbol manipulation, devoid of any meaning, for so long before getting any indication of why you'd want to be able to do it.

I like Lockhart's analogy: mastering musical notation, transposition, etc. before being allowed to play your first note.

https://www.maa.org/external_archive/devlin/LockhartsLament....


Some people like to work their way in reverse. I'd often pick a really complicated subject I'm after like "stellar fusion" and then work my way downwards and learn whatever I need to learn in order to understand it. If I had to start from differential mathematics, without knowing why I need it, I'd probably give up.


Are there any notable teachers who take or have taken this approach?

I can't think of any from my education, everything was bottom up building. But when I self-teach (as it naturally emerges from pursuing projects related to work or hobbies) it's often top down digging.


This is the big problem I had with engineering in school, so much was based on faith that you needed it. And just now as I'm writing this, I realized that's completely counter to my personality.

side note: I remember Einstein needed to learn particular math skills to help prove his ideas and sought out to learn it.


It didn't work for the GP. That makes it poisonous and evil.

I've seen this sentiment way too much on HN. X didn't work for me, therefore X is a scam, its perpetrators are evil sociopaths, and if it worked for you you're a cog in the machine, man.


I'm a believer in individual learning styles, and hard as it is for me to understand personally, I do recognize that lots of people may learn best by lecture and drill. I certainly don't think bad of them for it. but many more people are not suited for this, and forcing them through the education system as it exists does them great harm, and I empathize strongly with them


Lecture and drill has been out of vogue for decades. The hot thing is active learning and high-impact practices. And frankly it has the opposite problem of dragging the class down to the slowest common denominator, wasting the opportunity to teach more to the high-achievers.


calculus is required to understand classical mechanics, but intuitions about classical mechanics can inform and accelerate how you learn calculus. I actually tutored calc 1 a little bit in college and most of the time the people I was helping could do the calculations asked of them fine, but felt lost and confused because they didn't know what the output actually meant. everyone learns linear algebra before abstract algebra, but even a little background in the latter creates many opportunities to think "ohhh, this is just like X!" that make the former easier to pick up--and when you revisit the latter, you'll probably understand it better too because of the connections you made

is it better to learn python or java to build a cursory understanding of programming, and then c and x86 to see what "really" happens "under the hood?" or the other way around, starting from base operations and then layering abstractions on top? I don't think one is strictly better than the other. when I first learned sql, joins were intuitively obvious to me but bewildered the person I was learning with, despite the fact that he'd been in IT for years and I barely knew how to program, because I happened to know set theory. I wonder what it would be like to do the traditional algorithms and data structures stuff before ever learning to program. it might make picking up a lot of details easier! I wonder why we don't teach children logic gates and binary arithmetic before they learn decimal. it's actually simpler!
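fwiw the join-as-set-theory intuition can be sketched in a few lines (table names and contents entirely made up for illustration):

```python
# An inner join, seen through set theory: keep only the rows whose join
# keys fall in the intersection of both tables' key sets.
users = {1: "alice", 2: "bob", 3: "carol"}   # user_id -> name
orders = [(101, 1), (102, 3), (103, 3)]      # (order_id, user_id)

shared_keys = set(users) & {uid for _, uid in orders}  # key intersection
joined = [(oid, uid, users[uid]) for oid, uid in orders if uid in shared_keys]

print(joined)
# [(101, 1, 'alice'), (102, 3, 'carol'), (103, 3, 'carol')]
# roughly the same result as:
#   SELECT o.order_id, u.user_id, u.name
#   FROM orders o JOIN users u ON o.user_id = u.user_id;
```

once you see the join as "Cartesian product restricted to matching keys," left/right/full outer joins fall out naturally as questions about what to do with the keys outside the intersection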

in general I don't think there's a good reason why you must teach raw technique before practical application, or a concrete special case before a broader abstraction. even an imperfect understanding of something "higher" can give you hooks to attach information about something "lower." familiarity and comfort and an understanding of interrelationships between knowledge is so much more valuable than perfectly executing each step on a given track

when I want to get acquainted with a new field, I usually start by reading current research, barely understand any of it, and then work backwards through the author's previous papers and the papers he cites habitually. my cursory grasp on the latest version makes it easier to understand earlier sketchier versions, and at the same time the development history of the older shines light on the newer. sure you can always "start with the greeks" as it were and work your way up, but I don't think this is objectively better than going the opposite direction

really I think of knowledge as more a highly interconnected graph than a series of linear tracks. as long as you can anchor something somewhere it's valuable to pick it up. it can reinforce other things near it and you can strengthen your understanding of it over time. and getting into the practice of slotting arbitrary things together like this is good training for drawing novel inferences between seemingly disparate topics, which I think is the most valuable form of insight there is


While I completely agree with your general comments, I still want to emphasize how hard this ideal is to implement in practice.

Yes, every subject is linked to every other subject. If I'm tutoring introductory physics, I will use examples and analogies from math, computer science, engineering, biology, or more advanced physics, depending on the background and taste of the student, and it works fantastically. But if I'm lecturing to a crowd, this is impossible, because the students will differ. If I draw a link, some people will think it's enlightening, some will think it's totally irrelevant, some will think it's boring, some will think it's so obvious it's not worth saying, and most will just get more confused.

The same thing goes for the top-down "working backwards from cool results" approach; it's supposed to bottom out in something you know, but whenever you teach multiple people at once, everybody knows different things. The bottom-up linear approach is useful because it gives you a guarantee that you can draw a link. If I'm teaching quantum mechanics I expect to be able to lean on intuition from classical mechanics and linear algebra. If I didn't know the students had that, I would draw a lot fewer links, not more.

Similarly, "if people learned X in school, then Y would be easier to understand later" is true for almost any values of X and Y, because of the interconnectedness of knowledge. But if you ask any math teacher, they'll tell you the school curriculum is already bursting at the seams. You can't just add logic and set theory to existing school math without taking something out. In the 70s we tried taking out ordinary arithmetic to make room for that. It was called New Math, and everybody hated it.


>stack up prerequisite knowledge before they're even allowed to touch it

The way you phrased this reminded me of A Mathematician's Lament by Paul Lockhart[0]

[0] https://www.maa.org/external_archive/devlin/LockhartsLament....


seems like a lot of people are imagining a caricature of what school actually is and blaming it for problems they would have had anyway.

