cosmic_quanta's comments (Hacker News)

There's no time like the present. Feel free to reach out if I can help you along your journey.

Not the person you're replying to, but I'll bite:

I've written low thousands of lines of Haskell. Similar to mikojan, I love Haskell in theory, but ended up not enjoying it as much in practice.

1. The multitude of string-y types. I end up converting between String, Text, lazy Text, ByteString, lazy ByteString, and I forget what else. Each library wants me to pass in a specific string type, and each other library returns a different one. LLMs are good at this; for a while I also had a ton of helper functions to convert between any two string types. Still, perhaps there's a better way?

2. The error messages. I come from Elm, so I'm spoiled. Yes, I understand a language with HKTs will never have as nice error messages. Yes, LLMs are pretty good at explaining GHC error messages.

3. The stdlib. Haskell gets a lot of credit for safety, but a `head` blows up instead of returning a `Maybe`. I know there are other - safer - preludes, but I don't know how to choose between them. I don't know how using a different prelude would impact my projects.
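For what it's worth, points 1 and 3 are both fairly mechanical once laid out. A minimal sketch, assuming the `text` and `bytestring` packages; the library functions (`pack`, `fromStrict`, `encodeUtf8`) are the real API, but the helper names are mine:

```haskell
import qualified Data.Text as T
import qualified Data.Text.Lazy as TL
import qualified Data.Text.Encoding as TE
import qualified Data.ByteString as BS
import qualified Data.ByteString.Lazy as BL

-- Point 1: the conversions are mechanical, but someone has to write them.
toText :: String -> T.Text
toText = T.pack

toLazyText :: T.Text -> TL.Text
toLazyText = TL.fromStrict

-- Text <-> ByteString must go through an encoding, typically UTF-8.
toBytes :: T.Text -> BS.ByteString
toBytes = TE.encodeUtf8

toLazyBytes :: BS.ByteString -> BL.ByteString
toLazyBytes = BL.fromStrict

-- Point 3: a total `head` is two lines. Alternative preludes such as
-- `relude` export one, and GHC >= 9.8 warns when you use Prelude's.
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x

main :: IO ()
main = do
  print (toBytes (toText "hello"))
  print (safeHead "hello")        -- Just 'h'
  print (safeHead ([] :: [Int]))  -- Nothing
```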

I feel like my next step is either towards Idris, with its polished standard library (and dependent types baked into the language!), or towards something simpler and more Elm-like (Gleam perhaps, or Roc). But if you can sell me on Haskell, I'm all ears!


I'm not going to sell you on anything. All of the things you've mentioned are true. Loosely, the multitude of string types and the state of the standard library come from the same place: the language is 30+ years old! There are many warts to be found.

However, if you decide to start learning, the path is hard, especially if you come from a non-computer-science background like me. I attempted to learn Haskell twice; I bounced off the first time, quite hard, and didn't try again for years.

What worked for me is a combination of two things:

* Having a goal in mind that has nothing to do with the choice of language. For me, it was building a personal website

* The book Haskell Programming from First Principles [0]

and if you have more questions, reach out.

[0]: https://haskellbook.com/


> Having a goal in mind that has nothing to do with the choice of language

Yes, yes, that's exactly what my encounters with Haskell looked like. The last one is a ~1k-line backend for a personal project. I feel that's about as much as I could manage at this point.

> The book Haskell Programming from First Principles

That book gets recommended all the time! I'm concerned it may be a little too basic for me. (I understand monads and monad transformers, and have some notion of final tagless and free monads. Yet I get perpetually confused by various relatively simple things.)

I guess what I'm missing is haskell-language-server to help me a little. Here I'm confused about the interplay between `haskell-stack` (which is in Debian repos and which I think I'd like to use), ghcup, cabal, and haskell-language-server.


Hence GHC extensions? Doesn't OverloadedStrings help? It's been about 20 years since I wrote Haskell in production.

My favourite thing about Haskell concurrency is that there are no colored functions [0]. Writing code in IO, or Async, or the next big thing (asynchronous higher-order effect system of the future??) doesn't require language support the way Python or Rust do.

The one construct that unlocks this lack of colored functions, STM, did require runtime support (as opposed to language support), which at least is transparent to downstream developers.

[0]: https://journal.stuffwithstuff.com/2015/02/01/what-color-is-...
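To illustrate: concurrent Haskell is just ordinary IO code, with no `async`/`await` keywords to propagate through callers. A minimal sketch using only `base` (the delay and the value 42 are made up for the example):

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

main :: IO ()
main = do
  result <- newEmptyMVar
  -- forkIO takes a plain IO action; the callee needs no special
  -- annotation or calling convention.
  _ <- forkIO $ do
    threadDelay 10000          -- pretend to do 10 ms of slow work
    putMVar result (42 :: Int)
  -- takeMVar blocks this lightweight thread, not an OS thread;
  -- the runtime multiplexes green threads over OS threads.
  answer <- takeMVar result
  print answer
```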


Coloured functions are a feature, not a bug. Haskell is full of them, and they are exactly what makes STM safe in Haskell but abandonware in other languages that have tried.

  2. The way you call a function depends on its color.
`<-` or `>>=` vs `=`

  3. You can only call a red function from within another red function.
This should sound pretty familiar! You can only call an IO function from within another IO function. STM in this case makes a third colour:

  IO can call IO functions.
  IO can call STM functions. (*)
  IO can call pure functions.

  STM can call STM functions.
  STM can call pure functions.

  pure functions can call pure functions.
(*) calling into an STM block from IO is what makes it 'happen for real': it's `atomically`, which has type `STM a -> IO a`.

Having these coloured functions is what made STM achievable back in the mid-late 2000s, since the mechanism to prevent STM or pure functions from calling IO was already in-place.

Other languages either tried to figure out how to contain the side-effects and gave up, or just released STM and put the onus on the user not to use side effects.
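The call matrix above shows up directly in the types. `atomically` is the real `stm` API; the rest is an illustrative example of mine:

```haskell
import Control.Concurrent.STM

-- pure: callable from anywhere (pure, STM, or IO)
double :: Int -> Int
double = (* 2)

-- STM: may call STM and pure code, but the type system
-- prevents it from performing IO mid-transaction.
bump :: TVar Int -> STM Int
bump var = do
  x <- readTVar var
  writeTVar var (double x)
  readTVar var

main :: IO ()
main = do
  var <- newTVarIO 21
  -- atomically :: STM a -> IO a is the only bridge from STM to IO;
  -- it runs the whole transaction "for real", atomically.
  y <- atomically (bump var)
  print y
```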


It's a shame that the person you're answering is being downvoted. I also understand the importance of colouring functions, but look at the examples that person gave: Python and Rust. In those, calling a coloured function (at least the async-related ones) propagates up to the top of the program; that's a cost we have to internalize, but I'd be lying if I told you I wouldn't be happy with such behaviour. I do a lot of JS/TS and I would love to be able to "inline" an await without making my current scope async recursively up to the top of the program, the way it can be done in F# with the Async.StartAsTask operation.

This is also an advantage of blocking code. It’s just regular code. The async stuff is handled by the operating system.

> sounds like it would be hard to implement a web server handling 10k+ concurrent requests on commodity hardware?

In practice, it is not. The canonical Haskell compiler, GHC, is excellent at transforming operations on immutable data, as Haskell programs are written, into efficient mutations, at the runtime level. Also, since web development is quite popular in the Haskell community, lots of people have spent many hours optimizing this precise use-case.

In my experience, the real downside is that compilation times are a bit long -- the compiler is doing a LOT of work after all.
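For a concrete sense of how this looks: a Warp server runs each connection on a lightweight green thread, so a handler serving many thousands of concurrent requests is written as plain blocking-style code. A minimal sketch, assuming the `wai`, `warp`, and `http-types` packages:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Network.HTTP.Types (status200)
import Network.Wai (responseLBS)
import Network.Wai.Handler.Warp (run)

main :: IO ()
main = run 8080 $ \_request respond ->
  -- Each connection gets its own green thread; blocking here parks
  -- the green thread, not an OS thread, so 10k+ connections are cheap.
  respond (responseLBS status200 [("Content-Type", "text/plain")] "hello")
```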


> The canonical Haskell compiler, GHC, is excellent at transforming operations on immutable data, as Haskell programs are written, into efficient mutations, at the runtime level.

Yes, at the level of native machine code and memory cells, there's not that much of a difference between immutability + garbage collection, and higher level source code that mutates. Thanks to GC you are going to overwrite the same memory locations over and over again, too.


Programmers for some reason really don't understand that generational garbage collection provides locality. I am really surprised how often I see C/C++/Rust types not understand this.

I think that only applies to a moving GC. A conservative GC (like the Boehm GC for C) doesn't move any items around, and thus doesn't do anything for locality.

Of course, even a moving GC has limits; it won't turn a hashtable into something that has local accesses.


The author is thinking of updating the book to a second edition as well. Looking forward to it

noice

> (...) the apparent legitimacy is enhanced by the fact that I used a complicated computer program to make the fit. I understand this is the same process by which the top quark was discovered.

This is both hilarious and more common than you might think. In my field of expertise (ultrafast condensed matter physics), lots of noisy garbage was rationalized through "curve-fitting", without presenting the (I assume horrifyingly skewed) residuals, or any other goodness-of-fit test.


Electricity markets are completely inelastic in the absence of storage ability. Negative prices are an indication that the grid needs the ability to absorb surpluses from sources with effectively free fuel (solar, wind).

Note that "absorbing surpluses" does NOT require energy storage in the form of batteries, which is expensive and not necessarily green. Another option is grid-interactive buildings, which can harness energy surpluses in near-real-time as they arise [0]. Hopefully we see more of these buildings.

[0]: https://edoenergy.com/


We are going to see a toooooon of battery storage too. Building HVAC is great, water heaters are great, because thermal storage is easy and present in all existing infrastructure.

I hope that we can have enough vehicle chargers at workplaces that help absorb excess supply too.

It won't be long until during seasonal peaks we will have multiples of demand available to be dispatched on the grid, during the sunny hours.

I've been trying to think of applications that will benefit from this coming future for about a decade, but have not yet hit upon the sort of application where capex is low enough that this sort of big swing is easy to use.


Global BESS deployments soared 53% in 2024 - https://www.energy-storage.news/global-bess-deployments-soar... - January 14, 2025 ("Storage installations in 2024 beat expectations with 205GWh installed globally, a staggering y-o-y increase of 53%. The grid market has once again been the driver of growth, with more than 160GWh deployed globally, of which 98% was lithium-ion.")

China’s Batteries Are Now Cheap Enough to Power Huge Shifts - https://www.bloomberg.com/news/newsletters/2024-07-09/china-... | https://archive.today/DklaA - July 9, 2024

China Already Makes as Many Batteries as the Entire World Wants - https://www.bloomberg.com/news/newsletters/2024-04-12/china-... | https://archive.today/8Dy4D - April 12, 2024


The WWPG is the answer! World Wide Power Grid...


It's not that the market is inelastic. It is physics: you can't just close the pipe and electricity stops flowing. Deal with this. There are ways to handle the problem, like pumped-storage power plants, but they require very particular terrain (a mountain next to a big lake). "Grid-interactive buildings" are fun, but not at a scale that can make any substantial difference. So far we do not have effective, long-term energy storage, and no amount of hand-waving is going to change that. So, let's be realistic and build nuclear power plants.


> you can't just close the pipe and electricity stops flowing

You can if it's solar panels! You can just turn the inverter off! The surplus is not in itself a problem, only the dark winter months.


This.

There will be spikes for demand and supply, and the grid is a real time market. There are already spikes and drops in usage as humans wake and sleep.


> It is physics, you can't just close the pipe and electricity stops flowing.

Rooftop solar power plants are physically able to stop producing; this requires some firmware changes so that they stop putting power into the grid when someone orders them to. But there is no market or political will for such a solution.


> you can't just close the pipe and electricity stops flowing

We can. Solar panels can be disconnected. Wind turbines can be stopped. Dams can be stopped and dump water without energy production.


Some dams can store water in reservoirs. Actually, if there is a surplus of energy, some dams can use it to pump water up from their lower to their upper reservoir (a reservoir is effectively a less efficient battery).


Indeed, the inelasticity is because of the physics. Nonetheless, common terminology in the industry is that the market is inelastic.

Regarding nuclear power: it is a great technology for the base load (edit: I mean firm power), but there are always going to be fluctuations in consumption, which need to be met with fluctuations in generation. Grid-interactive buildings can help mitigate this fluctuation.


Really it's the other way round: all the generators have safety trips and can disappear from the grid in a short time if required, but it's uneconomic to do so. Which is why you get things like wind farm curtailment payouts; they're part of the weird set of compromises between spot market and long-term (necessarily central!) capacity planning.


Due to how much renewable generation many western grids have today, the traditional "baseload" is effectively zero nowadays.

As shown by nuclear plants bidding negative so they don’t have to shut down.


> the traditional ”baseload” is effectively zero nowadays

Baseload is about the electrical demand, not the generation: the minimum demand across a period of time.


Exactly. And then OP goes and talks about "baseload nuclear" like they can force us consumers to buy horrifically expensive nuclear power when renewables deliver said "baseload".

Which is why I said "traditional baseload". In other words: our need for 100% uptime coal or nuclear plants.


Right, I get it backwards all the time. I meant firm power.


Build nuclear like France, one of the four countries named in the article as having negative prices?

The nuclear that random posters here believe in is magical and mysterious and not at all related to the nuclear plants that exist in the real world.


> It is physics, you can't just close the pipe and electricity stops flowing. Deal with this.

This is so wrong I can't resist ... I turn off the electricity pipe multiple times a day with a light switch, and as far as I can tell in my 6 decades of existence the electricity has stopped flowing every time with no ill effects.

What you are perhaps trying to say is that you can't just shut off some sorts of power generation, like coal and nuclear. But as another poster pointed out, that's a limitation of those technologies. Wind, solar, gas, and even diesel generators can be turned off near-instantaneously.


For some figures on demand response: Kraken Energy manages devices across Europe and the US:

"And today, to give you the exact numbers, we manage close to 400,000 devices in real time. That's about 1.6 gigawatt of power that can be turned up or down at any moment in time and space. And that's where consumer devices become really powerful." [1]

1. https://www.volts.wtf/p/making-sure-smart-devices-can-talk


While true I suspect that the future of energy storage is overwhelmingly dominated by large scale battery storage. A lot of the alternatives fall in the "cute, but complicated, situational, and/or doesn't scale" category.


It depends on what you mean by future. The next couple of years certainly. But once you start looking for technology that can store weeks worth of energy over several months, batteries don’t look like the clear winner.


Depends how much need there is for long term energy storage.

Off grid users have this problem currently when building solar+battery systems. They can pretty reasonably afford to install enough battery to cover a couple of cloudy days in winter, but if you think about two weeks of solid cloud cover the battery planner goes nuts. The solution is to install a sane amount of battery, enough for 2 to 3 cloudy days, and a small backup fuel generator for those few days a year when the solar falls short.

Does this solution scale to the grid? It involves a fuel plant that sits idle for the vast majority of the year which is not great for paying off its construction costs. It would almost certainly have to be subsidized by the ratepayers, but since they're otherwise getting almost free solar power maybe this can still work. It's also bad news for the fossil fuel extraction industry. There isn't much need to drill baby drill when you're only burning fuel for like two weeks out of the year.


pumped water storage and compressed gas are also great


Best of luck (not being sarcastic). Hacker News celebrates the risk-taking of startups for a reason -- great leaps often require taking a risky jump. What you're doing is similar.

Please post about your experience! I'd love to read what happened.


Thanks! It's hard to describe, but ultimately I realized that neither academia nor industry is willing to invest in moonshot type ideas. Moreover, academics have to worry about funding and tenure (in addition to teaching and service). Funding is easiest when you propose research topics that people believe stand a chance of success. For wacky ideas it's difficult to make that case.

Anyway, I have enough money to fund myself for at least a couple years, so my goal is to make the best of that time and see if I can upend the dominant paradigm in NLP (humans don't need as much data to learn — I'm going to pursue ideas that allow computers to be as data efficient).


Chicago remains the hub for commodities


While tariffs are on everyone's mind, the key reason is:

> Amazon critics are also calling for a boycott following the closure of seven warehouses in Quebec

Amazon warehouses in the province had started unionizing. Incredible coincidence.


I created a Pandoc filter to render some code blocks into figures for my PhD thesis. This is nice because one does not have to manage text and figures separately.

https://github.com/LaurentRDC/pandoc-plot

