
Your comment introduced me to the term permaculture; it looks interesting. Do you leverage numerical optimization (linear programming or integer programming, for example) in your work?


LPs (and MIPs) are used for scheduling, planning, and logistics. These are some of the earliest applications of LPs, dating to the 1930s and 1940s, and they are still relevant today. The Wikipedia page has a good overview of the history and utility of linear programming: https://en.wikipedia.org/wiki/Linear_programming
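
For anyone who hasn't seen one up close, here is a toy production-planning LP sketched with scipy.optimize.linprog; the two products, profits, and resource limits are invented purely for illustration:

  from scipy.optimize import linprog

  # Maximize profit 3*x1 + 5*x2 subject to resource limits.
  # linprog minimizes, so we negate the objective.
  c = [-3, -5]          # profit per unit of products 1 and 2
  A_ub = [[1, 2],       # machine-hours used per unit of each product
          [3, 1]]       # labor-hours used per unit of each product
  b_ub = [14, 15]       # machine-hours and labor-hours available
  res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
  print(res.x)          # optimal production plan, e.g. [3.2 5.4]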


Yes, I took a math course on it way back in college, but I haven't ever considered using it professionally. Heck, in college I did it by hand.

I've never had to, say, help design a $100 million server farm. Recently, though, I've had a desire to strive to be that level of professional.

My question was more about where this stuff is used in the HN world.


That's a clarifying response: I've applied (non)convex programming (LP, QP, MIP, etc.) to all of the above and a few more, all of which I'd classify as classical applications. Less traditional applications -- I'd like to explore these more -- include data envelopment analysis, which provides a framework for assessing the efficiency of processes based on one or more input metrics, and several ideas in papers published at NeurIPS and other conferences that integrate LPs into neural networks in various ways, including AUC maximization. I've also worked on first-order methods to solve LPs, and while I'd like to continue in that direction, the area is crowded: there are very good existing tools, new and emerging tools that are also often good, and very strong teams building on all of the above. One of the biggest challenges I see in the OR space is that leveraging the technology requires human expertise.


In a professional context, I use it to help a client (a chemicals company) optimize their deliveries to customers. They have > 100 production sites and thousands of customers, so an LP is used to allocate customers to production sites based on product cost, availability at each supply site, and trucking costs. We have evaluated multiple solvers (Gurobi, Llamasoft, GAMS, LocalSolver, etc.) for optimizing deliveries, as well as for evaluating the cost impact of changes to the delivery network.
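
The core of that allocation is a classic transportation LP. Here is a minimal sketch with scipy.optimize.linprog -- the two sites, three customers, and all costs, capacities, and demands are made-up numbers, not the client's:

  import numpy as np
  from scipy.optimize import linprog

  n_sites, n_customers = 2, 3
  # cost[i][j]: product cost at site i plus trucking cost to customer j
  cost = np.array([[4.0, 6.0, 9.0],
                   [5.0, 4.0, 7.0]])
  capacity = np.array([80.0, 120.0])     # tons available per site
  demand = np.array([30.0, 50.0, 60.0])  # tons required per customer

  c = cost.flatten()  # decision variables x[i][j], flattened row-major

  # Each customer's demand must be met exactly.
  A_eq = np.zeros((n_customers, n_sites * n_customers))
  for j in range(n_customers):
      for i in range(n_sites):
          A_eq[j, i * n_customers + j] = 1.0

  # No site may ship more than its capacity.
  A_ub = np.zeros((n_sites, n_sites * n_customers))
  for i in range(n_sites):
      A_ub[i, i * n_customers:(i + 1) * n_customers] = 1.0

  res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand,
                bounds=(0, None))
  print(res.x.reshape(n_sites, n_customers))  # tons shipped, site -> customer

At the scale described (>100 sites, thousands of customers) you'd reach for a commercial solver like the ones listed, but the model shape is the same.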


> An extremely useful law to remember....

Would you be willing to provide an example where you applied Little's Law?


Sure! In a professional capacity: our customers were dissatisfied with the length of time and the flakiness of a particular workload job [0], and I was party to the discussions on how to solve this problem. Knowing Little's Law allowed me to dissent against the prevailing opinion that we should customise our job processing queue to prioritise these jobs [1], arguing instead that we should provision more resources (i.e. beefier servers).

The decision was still made to alter the priority. The change went into production and was found to unacceptably degrade the performance of other jobs. Thankfully, one of the engineers who I had convinced that "processing time is the only factor that matters" had spent all their time optimizing the heavy task, to the point where it was no longer a heavy task, thus saving the day.

0. The job was some kind of "export CSV" form, and it somehow involved both 'traversing dozens of highly normalised tables' and 'digging into JSON blobs stored as text'.

1. I specifically remember one of the arguments was that if you have 3 heavy tasks A, B, and C, the best case is "in parallel", which takes max(A, B, C) time, whereas the worst case is "sequential", which takes (A) + (B + A) + (C + B + A) time; our current priority scheme approximated the "sequential" scenario, and the priority change would instead approximate the "parallel" scenario. I use scare quotes because I felt it was a resource issue (the "sequential" pattern was a byproduct of the most common way a heavy task got enough resources... which was that an earlier heavy task finished and freed up a lot of resources).
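
To put invented numbers on footnote [1] (a sketch only; the durations are made up):

  # Invented durations for three heavy tasks, in minutes.
  A, B, C = 5, 3, 4

  # "Parallel" best case: everything is done when the longest task finishes.
  parallel_makespan = max(A, B, C)              # 5 minutes

  # "Sequential" worst case: total time-in-system summed across the tasks,
  # since each task also waits for the ones ahead of it.
  sequential_total = A + (A + B) + (A + B + C)  # 5 + 8 + 12 = 25 task-minutes

  # Little's Law (L = lambda * W) is the capacity argument: with the
  # arrival rate lambda fixed, reprioritizing only shuffles W between job
  # classes; adding resources is what lowers W (and hence L) overall.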



Yes, that sounds a lot like the argument I remember hearing.


Thank you! -- and thank you to the others who shared!


What a predictable result. Changing to a priority queue in a heavily utilized system will ALWAYS increase wait time for other tasks if the tasks are independent.

The ONLY time that's not true is if the higher-priority tasks would eliminate the need for the other tasks.


Many electronic queueing systems in hospitals etc. show you how many customers are ahead of you. If you observe the time it takes for a few people to receive service, you can estimate how long you'll be stuck waiting. This is a number they rarely show you, because it's often not encouraging...

It's the most primitive application, but it's immediately useful to anyone.

I've used similar methods to estimate work lag: https://two-wrongs.com/estimating-work-lag.html

And to argue for concurrency limits in latency-sensitive backend services.
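
For the hospital-queue case above, the estimate is a one-liner. A minimal sketch -- the function name and the numbers are mine, purely for illustration:

  def estimate_wait_seconds(people_ahead, completions_seen, seconds_watched):
      # Little's Law flavour: wait ~= queue length / observed throughput.
      throughput = completions_seen / seconds_watched  # people per second
      return people_ahead / throughput

  # 12 people ahead of you, and you watched 3 people get served in 10 minutes:
  print(estimate_wait_seconds(12, 3, 600) / 60)  # -> 40.0 minutes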


Every time I'm in a queue at a store, I compute the arrival rate while I wait in line. (Little's Law is one of my very favorite pages in all of Wikipedia!)


Martin Gutmann's recent talk (https://www.youtube.com/watch?v=b0Z9IpTVfUg) supports your point. While Shackleton is celebrated for overcoming disaster, Amundsen reached the South Pole by avoiding disaster in the first place. Yet despite Amundsen being more effective, he is not celebrated to the same degree (at least in the US) as Shackleton.


"he is not celebrated to the same degree (at least in the US) as Shackleton."

As a Scandinavian, I am pretty surprised by this; I did not know that. But I guess drama sells in the US :)


You are missing the joke. The correct line would read:

  isHexadecimal("HADC0FFEE"); // false


The engineering hall statue! Thank you for posting your comment. I thought it was an abstract sculpture trying to look "engineery". But it's a realistic representation of a cultural wound, and it is a warning.

https://engineering.wisc.edu/wp-content/uploads/2022/10/Kids...


“No deed of honor is commemorated here”


Four thousand years from now, P-Cygnian explorers arrive to find a desolate Earth -- the evidence of mankind is spread far and wide, but no humans remain.

Archeologists are convinced that this planet was home to yet another civilization that killed itself fighting for access to energy ... or at least they were convinced, until they found The Artifact.

Every known civilization inevitably discovers the same design for fusion power or dies before it makes it that far: apparently it's the only one that works.

And there it is on earth, in statue form, at the bottom of an excavation pit recognizable to any student of physics.

What happened here?


This is such a prescient analysis from over 30 years ago, and he delivers it so well -- extemporaneously, at a whiteboard. The only error I see is in the timing of when the market opportunity for NeXT would match their product: he predicted 1993. The world got there, pretty much as he predicted, but it took a long time.


It's great to see research in this field; I know there is opportunity here, and I hope to someday benefit from the progress. But I skimmed the paper, and it doesn't appear to solve a problem that I have. From a practical standpoint, what I want from a time series tool includes:

1) a small set of simple levers that I can review and tune;

2) short training time for any input set of size O(10k) to O(100k) (this covers seconds/day, minutes/week, hours/year);

3) a train + forecast process that runs fine on CPUs (not GPUs) with low memory overhead;

4) decent out-of-the-box performance that basically passes the sniff test; and

5) a simple way to include regressors.

I've enough experience to have learned to be wary of fully automated tuning, benchmark performance metrics, elaborate models, etc.


One amazing feature of Lustron houses is that their enameled exteriors have aged remarkably well. I live near several of them, and all of them, without exception, appear to have survived seven decades in a harsh northern climate. The enamel appears to be colorfast, does not chip or stain, and the material must be nearly indestructible. It isn't obvious from the photos, but the roofs are also enameled, and I believe all the houses near me still have their original roofs. I'm guessing the individual panels are basically impossible to replace or repair, yet they have survived well in a very tough environment.


I don't know if it's mentioned in the article, but is living in these houses particularly noisy in rain or wind? How is sound transmission within them in general?


One of the strengths of Weber kettles (charcoal cookers) is that they are enameled, not painted, and unless chipped or dented they can last a long time.


Better real-time data may improve actionability, and the big audience reach may influence the overall narrative. Independent measurements are extremely important for science and for data validation. But methane monitoring has been going on for a few years now. NASA even has a portal:

https://methane.jpl.nasa.gov

https://www.esa.int/Applications/Observing_the_Earth/Coperni...


Yes, NASA (JPL) did some of the original work on remote-sensing measurements of methane emitters.

The bulk of the methane emission events displayed on the (excellent) portal you linked come from EMIT, an imaging spectrometer on the ISS.

EMIT was made to map ground mineral composition for climate studies, but it does gather a visible-to-IR spectrum across the mid-latitudes at 60 m resolution. This spectrum bears the imprint of any atmospheric methane, allowing researchers to back out a CH4 concentration from the spectrum as a side product, separate from the original mineral-mapping goals.

(Some history: https://news.ycombinator.com/item?id=33427748)

