James Trunk, who is now Griffin's VP of Engineering, gave one of the clearest and most enjoyable technical introductions to Clojure I've seen. Recommended!
> The problem right now is that there’s no way to control the behavior of the underlying Java threading libraries, nio, and writing to disk.
Nor will there ever be, just by the nature of those systems.
You can't get deterministic execution from a system that uses non-deterministic request ordering, task scheduling, etc. -- which is what you get if you always use multiple OS threads, or spawn multiple discrete processes in tests, and so on.
I mean, I guess you _can_ make that stuff deterministic, but it would require you to build synchronization points into every transition in your application state machines, in a way that would be controllable by tests, which is a very tall order!
In practice, the only way to do this kind of thing is to design the core of your system to be fully synchronous, and to make it concurrent by adding concurrency at higher layers, expressed at run-time.
It is a tall order, but it's doable. The most important thing is to reduce the surface area of your application. Our business logic is almost entirely pure. The 'procs' have no side effects, except for things that happen on the other side of a Clojure protocol (Java interface). That means all side effects can be stubbed out during tests.
Our "user" code has no access to threading libraries. Threading happens in "kernel" code.
Never say never; I think it could maybe be done with modifications to missionary, a structured concurrency DSL implementing process supervision for Clojure/Script. We already instrument missionary flows for testing missionary itself (asserting its state transitions).
I guess you're suggesting that you can build an abstraction layer (I guess you're calling it missionary) which (a) can reliably intermediate every interaction between a user program and any potentially non-deterministic API, and (b) can be configured at runtime as synchronous/serial/deterministic or async/concurrent/non-deterministic. Is this accurate? (If so, I'm pretty confident that this is infeasible.)
I see that the third author of the book was Allen Rohner. I worked with him at Compass Labs; he's an incredibly talented developer! He also founded CircleCI.
I wouldn’t use a bank with this attitude. With a “tech company” label comes a lot of unjustified hubris based on an assumption that one is good at everything just because they write code.
My employer called themselves an education research company that happened to use software to commercialise their findings. It sat a lot better with me and made for better culture in my eyes.
I have been using Wells Fargo for almost thirty years, and while I don’t know if it’s still the case, as recently as four years ago their passwords were case insensitive. I’ll take a little sane fintech in my banking, please.
If your answer is: because 62^8 (^12, ^16) is much bigger than 36^8 (^12, ^16) then I assume you don't actually know how people crack passwords, which isn't brute-force.
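(For scale: 62^8 is about 2.2 x 10^14 and 36^8 is about 2.8 x 10^12 - a factor of roughly 77, or about 6 extra bits of entropy at length 8, and that delta only matters against raw brute force.)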
If your system is capable of being brute-forced, you already failed, because there's no reason to give someone enough guesses that a brute-force approach succeeds.
That particular detail is not dangerous in and of itself, but it is an embarrassing, public "code smell" because it indicates they are going cowboy mode and implementing weird ideas in their code even when it comes to authentication and cryptography.
Yeah, one of the things I've learned at fintechs: all that COBOL code banks run on... yes, it's from the past and hard to maintain, but it contains quite a bit of invaluable knowledge that you're bound to relearn if you rebuild. And relearning in banking can be very expensive.
Serious question, and I apologize for the strong language: why should I give a shit about the language a service I consume is written in? Why does it matter it’s in Clojure? I am professionally a Clojure dev so it’s cool to see something written in Clojure like this, but why should I care about that?
This is one of the things I really dislike about the community. Yes, Clojure is a powerful language, and I really enjoy working with it, but I often feel that the community has some kind of imposter syndrome that drives the need to tell others and justify the use of the language in some project, and it’s weird.
This is a blog post by one Clojure company, interviewing another Clojure company, with a tech stack focus. It’s not unusual within any language stack to have these kinds of blog posts - they’re obviously by and for people who find the technology interesting.
I don’t get your point. Is this somehow not appropriate behavior amongst civilized society?
Your remark comes across as not a little bit ignorant. Maybe you should just learn that if you “don’t give a shit”, you also don’t have to engage, and let the author write whatever they want.
I think you misunderstand my point. I will admit that it was vitriolic, somewhat intentionally so, and that it conflated my general annoyance at posts that seem to virtue-signal about the language they're using for their product with this happening specifically in the Clojure community. But my point is not that it's inappropriate, and I don't think I come across as ignorant here, so I disagree with that characterization. I also don't think it's fair to say I should just, paraphrasing, "sit down and shut up" if I have some criticism, because, taken further, that's a way to dismiss criticism as negative even when it is constructive.
To be clearer, my point is twofold:
1. Why should I care about the language that a specific service is written in, be it Clojure, Rust, TypeScript, etc.? As a language user it's cool, but as a buyer (which, of course, is not the target audience of this blog post) I don't think the language matters as long as the product works well. For a bank, I couldn't care less if it's written in COBOL, Java, JavaScript, Piet, as long as my money is safe and transactions are handled correctly.
2. It feels like there are a lot of Clojure posts that go into the territory of putting Clojure on a pedestal, making it out to be the best language, etc., when in reality it is a tool which excels at a lot of tasks, but not every task. While I may be biased, it sometimes feels like the community needs to justify the existence of the language, that there's some kind of imposter syndrome. Maybe I would be less annoyed about this if I saw many/any posts about the pitfalls and difficulties of using Clojure.
Typically it’s most important for languages which haven’t yet hit the level of acceptance that people still worry about being allowed to use them. I remember back in the 90s PHP and Python developers sharing examples like that because it answered business questions like “why aren’t we using Microsoft’s ASP?”
People think of Python as ubiquitous now, but forget how common it used to be for there to be articles titled things like "How to Convince Your Boss that Python is Good For Your Business" on HN and Slashdot back in the day
U.S. banking is surprisingly far behind the curve, and hasn’t led the world from a tech or innovation perspective for many decades.
The UK on the other hand has actively encouraged new banks and new tech. It’s had things like instant, free, payments between personal accounts for almost two decades. Contactless transactions for at least a decade, mobile banking for decades, and government mandated banking API for almost 5 years.
In short, the UK has a very active banking sector that’s been rapidly (for banks) innovating for many decades. So the environment and ecosystem are well developed for further and faster innovation.
In the U.S., it seems banks gave up on tech innovation decades ago, and decided that innovation in fees and punitive treatment of customers was their preferred approach. As a result there just isn’t an environment for new innovation, the incumbents find it much easier to crush competition, rather than compete. Why the same isn’t true in UK (which until recently had remarkably few distinct banks), is probably down to the nature of law and regulation, which gives customers lots of rights, and actively punishes banks that don’t uphold them.
> The UK on the other hand has actively encouraged new banks and new tech. It’s had things like instant, free, payments between personal accounts for almost two decades. Contactless transactions for at least a decade, mobile banking for decades, and government mandated banking API for almost 5 years.
I think most of those things are EU initiatives, which the UK was a member of at the time.
It’s a mix. Many of these innovations came into existence before their EU counterparts. Faster Payments, for example, is a completely different system to the instant SWIFT payments which most of the EU uses.
Having said that, Denmark is another notable country for its innovative financial systems, ahead of the U.S., many of its EU neighbours and in some areas ahead of the UK.
But the UK has always had a vibrant and innovative financial services sector. Something the EU always had a bit of love-hate relationship with, and something Brexit has seriously damaged.
> But the UK has always had a vibrant and innovative financial services sector. Something the EU always had a bit of love-hate relationship with, and something Brexit has seriously damaged.
To be fair, most of the regulations were EU ones, but they were driven by the UK as a member, and they supported them nationally.
Contrast Ireland, which has the same EU regs but is much, much further behind the curve because our regulators focus on different things.
Counter-anecdote: my brother is building a startup (cashflow forecasting tool, check out https://tailwindapp.eu/) on top of those banking APIs. Granted, he's spent many years in the industry, but as I understand, API access was by far not the hardest part of it.
What do you mean it's impossible to get access to?
If you're a start-up wanting to share your customers' financial data, you can make an API.
If you're a start-up wanting to get access to your (potential) customers' financial data from other institutions, you can get access through one of the many providers (Tink, GoCardless etc), or become a provider yourself with enough work.
I thought we were talking about PSD2, which the European Parliament adopted in 2015. I recall many startups were launched specifically to capitalise on new opportunities provided by PSD2, so I find your comment confusing.
> It’s had things like instant, free, payments between personal accounts for almost two decades.
Thanks for making me feel old. I was thinking "it hasn't been anywhere near that long, I remember the rollout like it was yesterday". Turns out that rollout was 15 years ago.
Not sure if anyone else can comment on prevalent software in banking, but it seems like IBM’s FTM is fairly common. The underlying technology at a glance seems dated (Java 8, a bunch of uncontainerized shell scripts running the show) and highly proprietary in the bad ways, e.g. the proprietary stack hasn’t seen innovation. Add to that the same teams deploying this software.
Getting a new banking charter in the US is hard, but it's straightforward to be a challenger bank in the UK. Jarvis moved (back) to London from SF to start Griffin.
The banks in the US that have APIs tend to focus on large fintech partnerships, so even simple APIs will be expensive compared to a regular bank account. Grasshopper Bank in the US (for example) is one of the few that will do APIs on top of regular commercial bank accounts.
(I work at Treasury Prime, which powers many US banks that do APIs.)
It’s hard in the UK too! Just not “infeasible” hard. (Plus, it’s easier to finance as you don’t have those pesky Bank Holding Company Act regs dinging your investors)
Do you mean neobanks that generally attach to existing banks through partnerships? The OCC has all but stopped granting de novo banking charters since 2008: https://crsreports.congress.gov/product/pdf/R/R47014 (see Figure 1 on page 21)
Nah, real banks, but I did some looking and it seems they have state charters rather than national ones from the OCC. What the implications of that would be are rather unclear to me, other than that state banks are regulated by both the state and the FDIC, and national banks are regulated by the OCC.
After the financial crash of 2008 and the subsequent succession of banking scandals, in which the taxpayer bailed out multiple prominent banks, the UK's government at the time introduced a series of measures to promote smaller banks. For individual personal accounts, Monzo and Starling are probably the most well-known of these so-called 'challenger banks'[1].
Furthermore, banks that were bailed out signed a deal to actively help those smaller banks by transitioning clients to them. I know that because my bank offered, a few times to the point of nagging, to switch my business accounts from them to one of the smaller ones. They offered to pay me a sum of a few thousand pounds to do that. I declined at the time because I liked my bank and appreciated having one app for all my business and personal accounts. When I pushed on why they were willing to pay me large sums to leave, they told me the above.
I learnt about this situation, as it happens, from the founder of Griffin, so was particularly interested in this blog post. It seems like the initial excitement of developing a banking system in Clojure might have transitioned to more bread-and-butter aspects of banking now, but immutable data is an ideal match for their mandatory audit trails.
I have to admit that I wouldn't mind being paid a large sum to switch to Griffin, but then again, my current provider is one of the more responsible ones which didn't have to be bailed out in the crisis :)
The UK has government-mandated API requirements for banks. In the USA it's left to the "free market", which in practice means having to trust unreliable 3rd parties like Yodlee or Plaid.
67m people in the UK, in a far more homogeneous market. The US is a lot more complicated, has a lot more legacy, and has more parties to get anything done.
See e.g. the sluggish move to chip and pin, sluggish death of checks, slow move to real-time payments. Hard to turn a supertanker with many states and thousands of banks.
The way our system works, the legal address of the end-user determines which bank the bank account is within -- this is largely due to regulatory requirements within the various banking systems.
There is one mechanism by which this could potentially be done from my understanding, but it will take us a few years to fully explore and integrate into that kind of model.
Another model we explored (but again, do not implement) is, instead of holding end-user funds in FDIC-insured bank accounts (in the US -- bank accounts in other countries are not FDIC insured), treating the funds as a security backed against deposits. That is significantly different, but it could allow non-US users to open US-based accounts -- though at that point they would not be bank accounts.
The US banking world seems to be firmly stuck in the 80s and banks need to be dragged kicking and screaming to implement anything recent, so this doesn't surprise me at all.
The UK is friendly to money-laundering, presenting the more respectable faces of an international network of questionable banks and financial institutions. API banking will allow a faster cycle of wash, rinse, repeat, an innovation in shady.
Perfect engineering is nearly impossible. In my experience systems always end up under- or over-engineered. The distinction also depends on what's on the backlog – if you have a feature that's going to be hard to implement because of inflexibility, you're under-engineered; if it's easy but other features require more work than they should, you're over-engineered.
I can't make a judgement call on whether they're over or under engineered, but it's good to pick which one you would rather be, write that into your engineering team's DNA, and then learn to work with it.
Exactly this. I see it like this: over- or under-engineering basically boils down to requirements and budget (which is itself a requirement, though many engineers don't seem to see it that way).
Over-engineering is anticipating future requirements that are never realized, or realized so far in the future that the upfront cost outweighed the cost of putting them off for later. Under-engineering is failing to anticipate future requirements that are realized before their cost outweighs what it would have cost to implement them upfront. Somewhere there is a line, and on both sides of it you wasted money. If you perfectly engineered something, you spent the money optimally; in this industry, the idea that you could get that exactly right is laughable. It's either over- or under-engineered.
Reading a blog post gives you fuck-all insight into a project or team's actual budget and requirements. Driving by and shouting "over-engineered" is a very clear sign of a lack of experience, and one of the most uninteresting things you can add to a conversation about a company _literally starting their own bank from nothing_.
that's just how FoundationDB is architected; it's not quite the same as building your own DB. the database provides primitives that are safe to build abstractions like this on top of.
it's also not quite the same as using an off-the-shelf RDBMS, but they do a pretty good job of explaining why they made that design choice.
if anything, viewed through that lens, I'd say it seems quite pragmatic.
“By law, fintechs must work with a bank to do these things, and right now that means a legacy high street bank with mainframes. Griffin is the bank plus technology platform that all future fintechs would build on.”
Did they write this in 2016? The market has moved on. Griffin looks neat but they’re years behind many others, a nice API for banking exists from well established players, like ClearBank. There’s room for more so it’s great to see Griffin join the market but I hope their pitch isn’t this weak.
There still isn’t a ton of choice in market - ClearBank is one of three other banks that actually have decent offerings. And having a good API is not enough; you need a holistic operating model that is aligned with your customer base, which is much harder to build for.
I'm confused by this statement. fintechs aren't just proxies for banks, they're financial products that people want to use. There are plenty of cool and innovative things you can do with just a banking API beyond just creating another bank
Sure the costs for any fintech using these products are the same, but they can offer a million different types of products which make them unique.
This is true for basically everything powered by an API.
I mean if you want to start your own bank and deal with compliance then sure do that! I don't see your point here, there is no sensible alternative to this problem.
Well, not yet. "The training wheels come off when we complete an audit, raise some more money, and finish writing the code. That should be Q3 or Q4 this year."
I absolutely hate when an article starts out with:
> IN A STARTUP, YOU SHOULD BE USING THE MOST POWERFUL LANGUAGE YOU CAN, AND THAT IS CLOJURE.
Says you. One opinion. In a startup you should use the language in which your team can build and launch your MVP the fastest, so you can get your first customers or funding. Hell, this could be a low-code or no-code platform (probably not in fintech, just speaking about startups in general).
I mean, I'd say Python is the most powerful language because of LLMs and machine learning, and generally I'm a PHP dev. Python is also about to get a ton more powerful with Mojo, which supposedly makes Python 36000x more performant.
However, I would never tell someone x is the most powerful language and only one to use for a startup because that's a total lie and opinion.
Of course it's an opinion. Anytime somebody says something, it's his/her opinion.
My opinion, for example, is that I agree with his opinion :-) My solo-founder business would not have been possible without Clojure and ClojureScript, which is a testament to the "power" of this language. I consider it "powerful", because it lets a single developer write and maintain a complex app over the years. It gives me power.
That catchline makes a lot more sense when you consider that JUXT is the most prominent company specialising in Clojure aside from Nubank itself, and has its logo at the foot of pretty much any conference remotely touching on Clojure topics.
Clojure has a fairly insular community that overlaps with other lisps more than it does something more popular such as Python or PHP, so the cliché (admittedly, one with some truth in it) that Clojure is the most powerful language might be surprising to someone who doesn't use it. We Clojure developers are used to it though, and stating our presumed superiority is almost a greeting at this stage!
> stating our presumed superiority is almost a greeting at this stage!
I think you're joking, but as another person in the Clojure community, please don't do this for real. The community already has issues with feeling "elite".
Rest assured, I don't lord it over others :) Plenty do though, and those are who I was poking fun at: those who ritually exchange platitudes extolling the virtues of Clojure, Datomic or even Rich Hickey before the start of every discussion.
I’ve seen a few other projects around here that boasted about using Clojure because “we are the smartest and Clojure is best and everything else is dogshit”.
Doesn’t make your insular group look inviting or sympathetic at all, to be honest…
I get it. I am a Clojurist that works for a fairly sizable company where Clojure is dominant (there actually are many, not just JUXT, you could also name Walmart). But I get annoyed when people substitute "we are the smartest and Rust is best and everything else is excrement" or especially "Typescript is best..." rofl
Python is very powerful indeed, especially in 2023. But for some reason I orchestrate my LLMs and diffusion models using Clojure. I also prefer to tame LLM output using Clojure. I still definitely write Python where it makes perfect sense.
>If you had a database and some separate messaging like ZeroMQ, Rabbit, Kafka, you’re going to get race conditions. There’s always going to be a potential race where you have two messages - one is going to the disk and one is going to the network. If someone else is listening and sees both of them, they could potentially get them in either order. So it’s much, much simpler to just have one path. Foundation is really fast, so it works great.
Having difficulty parsing this justification. How will client code listening for an event get two events - one from the network and one from disk - with any of the queuing systems mentioned here? I am also curious to know how FoundationDB acts as a messaging system rather than just a message store.
I think he's talking about how with FoundationDB, pub/sub notifications ("watches") are transactional – the notification is only sent if the transaction succeeds. Because FoundationDB can guarantee transactions across multiple key writes, you can write your state at the same time you write an event/message/job somewhere else, in one transaction, with the subscriber only being notified on success. I think this is impossible(?) to achieve without the message queue and the data store being the same service.
otherwise, you have a transaction to write to your database and a separate call to the message queue, which introduces the race condition.
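A rough sketch of that pattern from Clojure, using FoundationDB's Java client through interop (my own illustration, not Griffin's code; the key layout, serialization and names are invented):

    (import '(com.apple.foundationdb FDB Database Transaction)
            '(com.apple.foundationdb.tuple Tuple))

    (def db (.open (FDB/selectAPIVersion 710)))

    (defn- fdb-key [& parts]
      ;; helper: pack a tuple like ("account" id) or ("events" id) into a byte[] key
      (.pack (Tuple/from (into-array Object (vec parts)))))

    (defn write-state-and-event!
      "Write the new account state AND the event describing it in a single
      FoundationDB transaction: the two commit together or not at all."
      [^Database db account-id new-state event]
      (.run db
            (reify java.util.function.Function
              (apply [_ tr]
                (.set ^Transaction tr (fdb-key "account" account-id) (.getBytes (pr-str new-state)))
                (.set ^Transaction tr (fdb-key "events" account-id) (.getBytes (pr-str event)))
                nil))))

    (defn watch-events
      "Register a watch on the event key. The returned CompletableFuture only
      completes after a later transaction changing that key has committed, so a
      subscriber is never notified about an event whose state write didn't land."
      [^Database db account-id]
      (.run db
            (reify java.util.function.Function
              (apply [_ tr]
                (.watch ^Transaction tr (fdb-key "events" account-id))))))

With a separate database and message queue you'd have to do the state write and the publish as two operations, which is exactly the race described above.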
If you have access to an API to process BACS transfers, you might be able to avoid having to pay percentage VISA or MasterCard fees. For any non-trivial sums of money, Griffin's fixed-rate pricing is rather competitive. The only real downside of transfers at the moment is needing the human in the loop, whereas card payments have already had fully automated platforms such as PayPal, Stripe and Opayo for many years.
It's not clear there will be small banks in the US to use this tech in the future. Here's why:
1. Deposit flight. Small banks are struggling to attract deposits at a time when money markets pay considerably more than deposit accounts.
2. Investment scarcity. Small banks are struggling to find places to invest deposits where returns are safe on a reporting-adjusted basis. Meaning, small banks invest(ed) in US Treasuries, but as the Fed has rapidly hiked rates, the mark-to-market value of these investments has declined (though the funds are safe; rough worked example below). The same can be said of new lending, which has similar problems due to credit risk, primarily in commercial real estate.
3. Perceived risk. Rather than allow depositors to purchase additional FDIC insurance (above the $250k limit), which would generate revenue for the FDIC and banks, the USGOV has done nothing. If you want to insure excess deposits you must use an IntraFi account that spreads deposits among different banks, and/or brokered CDs, which do the same for that banking product. Both are higher friction, and both prevent instant access to funds - regardless of one's tolerance for penalties.
4. Fiserv missteps. Small banks rely on Fiserv and a few other vendors for back-office and retail banking systems. In Fiserv's case their software is well behind big-bank products, which provide more features and easier user interfaces. Younger banking customers prefer the big banks' systems.
5. Regulatory overreach. Complex issue, but in sum it's not clear bank regulators understand the banking business, or, if they do, that they have any flexibility. By way of a few examples, it's not clear to me that regulators understand the safety & utility of brokered deposits for small banks, the worthlessness of many bank capital-asset appraisals, or the immense risk that real-estate-heavy loan portfolios are facing.
6. Asset-based lending death. Pre-2008 one could borrow against many types of capital assets, allowing businesses to purchase those assets with, say, 20% down. The asset secured the loan, with little or no reliance on personal guarantees or recent cash-flow analysis. This practice led to problems in the 2008 GFC, but instead of tweaking the approach, lenders and regulators simply eliminated asset-based lending. When I write 'eliminated' I fully understand it's still advertised as a lending product, but underwriting now relies on cash flow, personal financial statements, and other factors that make 'asset-based lending' a marketing term, not reality. This was profitable business for banks and didn't need to die.
More threats lie ahead for small banks in the US. Folks in the know speculate that the USGOV prefers the Canadian model, where most deposits are held by several big banks. It might be the case that FedCoin opens new possibilities for small banks, reducing Fiserv and friends' hold on the small-banking business. One can always hope.
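To put rough numbers on point 2 (illustrative only, not any specific bank's book): a 10-year Treasury bought at par with a 2% coupon, repriced to a 5% market yield, is worth roughly 77 cents on the dollar today (2 x 7.72 annuity factor + 100 / 1.05^10 ≈ 76.8). The bank still collects par at maturity, so the funds are "safe", but if deposits leave and the bond has to be sold now, the mark-to-market loss becomes a realized one.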
We could in theory bank other banks, but that's not our focus. Our current customers are mostly regulated non-bank financial institutions, E-Money and Client money. As an American analogy, think Square Cash, Venmo, Paypal.
Longer term our aspiration is to also be the business bank for technology companies e.g. Apple or AirBnb or Uber. Currently Goldman is a big player in that space.
I've always been interested in these fintech APIs but wonder what competitive advantages the companies that build on this have compared to one another. What differentiates a company building on this from another company building on the same API? It seems like the execution would differentiate them the most? I feel like I am missing something here. Could someone enlighten me?
Every time I see ZeroMQ placed alongside RabbitMQ or Kafka as an example of a message queue system, it makes me wonder if the person who wrote that really knows what they are talking about. If they would RTFM, they’d understand that ZeroMQ is more of a transport library that doesn’t really overlap with the other projects mentioned in its feature set.
The issue is always the same: traditional banking is frictionless, very cheap and convenient for 99.9% of people out there.
Crypto isn't frictionless, it's not cheap, and it solves problems that only the 0.1% has, such as the need to transfer large amounts of "money", without tracing, in short times, across the globe - while it undoes the benefits that the 99.9% have and adds complexity.
Plus, nobody really wants a deflationary currency; they don't work. The incentive is to hoard and hold, not spend, which is why even Bitcoin's website dropped the currency narrative for the store-of-value one.
People don't buy Bitcoin to trade or pay with, but to sell at a higher price. The exceptions being drug dealers and a few lunatic anarcho-capitalists.
I'm tired of hearing the "works on my machine" line about banks.
Here's a few issues I've run into in the last few years:
- I dissolved a corporation in Korea and it took nearly 1.5 years and $8k USD in lawyer fees to get ~$80k USD out of the country. My other option would've been to go back to Korea and buy Rolexes.
- I moved to an Eastern European country and spent several weeks providing KYC/AML docs after attempting to send $30k from the US to a local bank account while my funds were in limbo. Ended up having to use ATMs to pay rent and other expenses.
- Bought a house in said Eastern European country and almost lost the deal because of several bank delays. Seller was pissed and had to involve lawyers to calm him down. Technically I should've had to pay extra because of the delays, because of the way real estate deals are structured in that country.
- I moved to Latin America and went to probably 6 car dealers, who all told me that purchasing a car would take up to 2 months because of the KYC/AML process involved in wiring money from the US. I ended up going to various ATMs 40 times over the course of several weeks, withdrawing $500 at a time, so I could buy a used car off Facebook Marketplace. Had to take over $20k in cash in a backpack to a lawyer's office to complete the deal. Spent way more time & money on my rental car than I wanted to.
- Earlier this year while visiting the US I went into a bank branch to withdraw $20k to purchase a used car for my dad and not only could the teller not help me, but it triggered a flag on my account that caused all of my funds to be frozen for several days until I was able to get it resolved. Ended up borrowing money from a friend so I could buy the car before I left the US.
- Sending even $500 to my bank account here in Latin America takes over a week with various questions from bank staff. Every time I worry if the funds will even arrive because it has to go through multiple intermediaries with various memos/notes to the intermediary receivers. From my US bank account to another bank in NY, credited to some European account holder to send to Europe with further instructions on how to get the money to me in Latin America. I bank with an institution that's run by the national government; I figured they'd have better international finance connections than any of the commercial banks, but apparently not.
- A couple weeks ago I decided I'd send $50k to my account here because I don't want to go through this process every time I want to pay bills. 2 weeks later and my $50k is still stuck. The bank keeps coming back with more questions and requesting more documents every few days. Nobody at the bank can tell me how long it will take to release the funds.
I've got more stories like this, and I know many expats with similar stories. I'm fortunate that I even have US financial access, and that I'm not living in a country like Lebanon, Turkey, Argentina, Venezuela, or even worse Russia, Iran, Cuba, etc. Things are even more difficult in many other countries.
By the way, Western Union works great here and is super fast, but they take huge commissions. Funny how that works.
The traditional financial system works great if you live in a first-world financial bubble and all of your friends, family, business partners, etc. also live in first-world financial bubbles. Meanwhile some of us are still on the financial equivalent of dial-up while the typical HN poster is on gigabit fiber talking about how great all of the bloated 20MB web apps are for them. I'm sweating bullets every time I deal with a bank, while people on HN say "this shit is great!"
I'll be beyond happy if the Bitcoiners are right and the whole world eventually switches to a global permissionless currency.
Bitcoin has existed throughout the last few years. So I have to ask why you haven't used it, given you've had these issues moving money via traditional banks. Are the transaction fees too high (like Western Union)?
I take part of my income in Bitcoin and spend it, both on-chain and Lightning Network, pretty frequently. I visit a number of local places that accept Bitcoin and use it for online purchases too. There are several Bitcoin ATMs in my city but they mostly have KYC with low limits and/or high fees.
For purchases like cars or real estate it seems like you'd have to be pretty lucky to find a seller unless you're not particular about what you're buying.
I've also started keeping more physical cash on-hand lately.
This is the real problem: if you did the same transactions in Bitcoin, that doesn't free you from the KYC/AML obligations. It just hides the fact that you're not checking.
- American banking is much worse than most of the world's (but not compared to crypto's risks of 'being your own bank'), especially for international transfers.
There is a perception that it's magic and untraceable. As you mention, there are obfuscation methods that seem sufficient, such that criminals have adopted crypto rather than "okay, you're gonna have to buy $100k in iTunes cards" to claim their ransomware bounties.
Unless financial institutions and laws can protect wallets and insure crypto, I will never use it. Currency is not just a number you move around; currency is deeply intertwined with how civilization works, and it requires a high amount of trust for people to use it.
Trust is a social concept rather than something you can really prove mathematically. It's collective trust. Unless you can DRASTICALLY reduce misuse with Bitcoin, no one will ever trust it without a lot of expensive protective measures.
A 1 hour lecture from a security professor on crypto currency:
I don't understand why this would be a serious impediment.
Suppose you go to an online exchange or Bitcoin ATM or local man with a crypto wallet, you hand over five $10 bills or transfer some money from your bank account, you get $50 in cryptocurrency.
Now you go to the adult bookstore's website and don't have to give them your name, you send $20 to your friend in a foreign country so she can get a cab to the airport without having to lose another $15+ to a wire transfer fee, or you just use it to pay for parking.
Even if you screw up entirely, the most you're out is $50. But then nobody is tying your literature preferences to your name, you can do your friend a favor more efficiently, and the credit card company doesn't have a database of where and when you park your car.
The trust issues come when you want to be a speculator holding a large amount of wealth in a form someone could steal by breaking into your computer, but what does that have to do with using ordinary amounts day to day as a currency?
$15 to transfer $20 to another country - the US really is another world sometimes. Transfers are often free or cheap - substantially lower than the costs of transferring crypto in most cases.
US banks commonly charge ~$40 for an outgoing international wire transfer, even for small amounts. There are less expensive alternatives but they may not allow you to withdraw the funds the same day or be available to the recipient in that country.
The average transaction fee for Bitcoin is currently less than $1. For Bitcoin Cash it's less than $0.01.
The seller does not need to be anonymous, and customers would be less inclined to trust them if they are. At which point if they're ripping people off they would be subject to arrest for fraud, not to mention have immediate bad reviews.
More than that, sellers like repeat business, especially in low variable cost digital products. It costs them a tenth of a cent worth of bandwidth to actually send you the book, which is the only way they're going to get your money for the next book.
They want to give you your book, because they want your money.
With the introduction of CBDCs, FedNow, and platforms like TFA, it's starting to look like TradFi is getting the second-mover advantage. Cryptocurrency introduced programmable money, which is great, but it also came with other features like self-custody, extreme transparency & privacy, and immutability that have ended up being more than average users are willing to accept as a bundle.
TradFi entities now have the ability to pick what they like out of the mix and offer that to customers while also benefiting from the convenience of trust assumptions, something cryptocurrency eschews. TradFi is building atop thousands of years worth of UX improvements in how people can come to trust each other. It's difficult for cryptocurrency to compete there.
I still love the developer convenience of blockchain since it nicely combines serverless with auth with payments. But for the most part, given the existence of trust, these benefits could also come from a system like in TFA having a Wasm runtime and maybe a dash of WebAuthn. Like a mashup of Cloudflare Workers and Stripe.
You can build features like reversibility on top of Ethereum. A draconian government chain that solves none of the important grievances is not needed. I do not need big brother freezing my funds, deciding what I can and can't buy, etc.
Yes, I completely agree. There's plenty of tech available to achieve many of the goals of functionality, affordability, and privacy a motivated team of developers could have. Just that it's often unnecessarily difficult to build and use. Probably things will be much better in a(nother) decade, but the whole thing is still a work in progress. In the meantime, why pay the cost 100% of the time for avoiding a bad thing that happens 1% of the time? Cryptocurrency has its utility, but much less when minimizing trust isn't a requirement.
I put 100k EUR of Bitcoin into a wallet, but then I lose the private key. There has to be a way for me to recover access to my funds. The only way to do that is to delegate trust to a central authority. Crypto will never, can never, be an alternative to traditional banking for the general population.
You're right, this is a massive weakness in the cryptocurrency space, and until it gets better solutions I doubt there will be more mainstream adoption (without centralised custodians). But I will say that this isn't only a problem in the cryptocurrency space; it's a general problem for cryptographic systems. We need better generic systems for managing cryptographic keys, with several levels of backups and recovery options, with minimum hassle. The ERC-4337 standard goes in the right direction.
I think we already have a pretty good system for managing access to wealth/value, which has several levels of safety, and is relatively low-hassle. I think it's the existing banking infrastructure.
Interesting idea. This was the sales pitch that drew me to Mondo (now Monzo) but they ended up dropping their user-facing API almost entirely and replaced it with Open Banking APIs that aren't available to regular hobbyists.
How does this compare with SoFi’s Galileo platform just in terms of feature set? The Galileo API is fairly rich, so I’m wondering what’s different and potentially better.
Not disagreeing, and it's only one aspect of Phoenix, but it might be of interest to someone reading that this LiveView-like Clojure library exists: https://github.com/tatut/ripley
Unfortunately, in the UK banks can (and do) freeze your account for no reason for up to 2 years, without any recourse or access to funds. With API access you risk your money being frozen due to automatic fraud detection. The government has given up the policing of finance to banks, and they take the least risky options.
I wouldn’t say “given up” as much as “actively delegated”.
The UK has essentially made a (political) choice that rather than cover the cost of policing finance through taxes, they’ll just make the banks sort it out for them.
This is not great for some people for obvious reasons. But for everyone else it’s lower taxes. Where one falls on the political spectrum largely determines how one feels about this.
> In 2015, Eric Holder ended the policy of "adoptive forfeiture", which occurred "when a state or local law enforcement agency seizes property pursuant to state law and requests that a federal agency take the seized asset and forfeit it under federal law" due to abuse.[21] Although states proceeded to curtail the powers of police to seize assets, actions by the Justice Department in July 2017 have sought to reinstate police seizure powers that simultaneously raise funding for federal agencies and local law enforcement.[22]
Basically Obama got rid of it, then Trump put it back, then Biden didn't do anything.
These "challenger" banks are a gimmick. Friend of mine recently tried to set up an account for his new business and all "challengers" refused a business account, they were slow to respond and disinterested in helping. Eventually he set up with a high street bank, but the whole thing took over a month, which is ridiculous.
I set up an account with HSBC about seven years ago, and they were a nightmare. It took over a month to open the account, and everything had to be done in branch. Also, HSBC in the UK is not the same as HSBC in other countries. I once had to fly back to London from Athens just to "pop in to my local branch" for some uninteresting bureaucratic process.
I closed that account and switched to Tide. It took me like 15 minutes to open the account online.
That's technically true. Most of their customers have a business account with a different bank under the hood. In my case, I have an e-money account which is safeguarded in line with some regulation, but isn't protected by FSCS.
For my [meagre] business banking intents and purposes, the difference is immaterial, and Tide is a bank.
Exact same experience. I wouldn't say refused, just hoop after hoop to jump through. "Disinterested in helping" is an apt description; they got interested 2 years later, and I would get a daily phone call about "finishing my application".
I was pretty transparent with the person(s) calling - after years of doing their best to push business customers away, someone up high had decided to get as many business customers as possible, and everyone was on a fat commission to open up new accounts (I think the lure of government-guaranteed covid loans might have had something to do with it).
High street bank was open within 3 days, fill out the form and done.
I was referring to the inability of the government to freeze your funds. But there are ways to prevent surveillance using zkproofs where you can break the onchain connection for private transactions.
Putting your eth wallet keys on a “do not transact with” list would be pretty similar. And if engaging in “money laundering”, such as the privacy measures you bring up, were itself a reason for the government to put an address on the block list, that would make it quite difficult to get around, too.
That is assuming Ethereum sees that kind of mass adoption to the point governments start to regulate its usage.
Is there actually a "block list" of this kind? It seems like it couldn't possibly work, e.g. there is a huge foreign entity that does zillions of honest transactions but isn't in a jurisdiction that subjects it to a given block list and it gets a transfer from some wallet that is. Either you now block anyone who transacts with them in turn and by Six Degrees of Kevin Bacon therefore the entire world, or transfers from "blocked" wallets go in there and come out non-blocked.
This is what all those chain analysis companies claim to solve for, and allegedly law enforcement agencies have done in the past to de-anonymize people. Perhaps you are right about it not being feasible given enough volume and so on.
But at that point, if a government level threat actor wants to stop you from doing something and you are in their jurisdiction, jail cells tend to be quite effective at limiting people from spending money. If using non-government-sanctioned cryptocurrencies is a crime in the future, then it doesn't really matter whether it is perfectly privacy-preserving or not, does it, if you need to pay your taxes in something other than the privacy-preserving crypto.
> This is what all those chain analysis companies claim to solve for, and allegedly law enforcement agencies have done in the past to de-anonymize people. Perhaps you are right about it not being feasible given enough volume and so on.
De-anonymizing people who are not attempting to prevent it is fairly trivial. You tie one transaction to their wallet and they use the same wallet for everything. But that's an entirely different question than whether you can have a "block list" for wallets.
What do you do when people in other jurisdictions pay no attention to the block list and then someone uses the money in the "blocked" wallet to buy some other fungible commodity there?
> But at that point, if a government level threat actor wants to stop you from doing something and you are in their jurisdiction, jail cells tend to be quite effective at limiting people from spending money.
But then it has nothing to do with the money at all. "You can't sell heroin for cryptocurrency because heroin is illegal" applies just as well to cash or gold or iTunes gift cards.
And then, in a country with rule of law, they have to prove your guilt beyond a reasonable doubt in a court of law. The original problem was a random bank's fraud algorithm unjustly stealing your money as an innocent person.
I will concede the concern that the bank won’t be able to just take your money in the case of using ethereum, but being unable to legally spend your ethereum in the country you physically reside in complicates things. And buying things over the internet would get the government to wonder how you can afford these things and how you will pay your taxes, thus investigating your finances and discovering the proscribed crypto wallet. Capone got busted this way and cash is 100% fungible and private.
> being unable to legally spend your ethereum in the country you physically reside in complicates things
But you can legally spend it, because nobody has found you guilty of a crime, because you haven't committed a crime.
The existing problem is that you don't have to be convicted of a crime to have some payment intermediary extrajudicially lock your account. To which the solution is to not have a payment intermediary.
Then if the government wants to stop you spending your own money, they can accuse you of a crime, but then they would presumably have to prove that you have committed the crime that makes it unlawful to spend the money instead of just letting the bank steal it from you without due process.
Yes, let's just give in completely then and give them the ability to freeze any funds they want. That's definitely the logical approach. I gave up with privacy because I can't stay 100% anonymous online against a motivated adversary so now I just install government spyware with root access so they can watch me all the time and control my user accounts!
One can be anti-extrajudicial government powers and anti-cryptocurrency. For one, I believe cryptocurrencies will be the government spyware of the future that allows for total control over our finances, since crypto evangelists push “code is law” and the blockchain is immutable and completely transparent. There will be no privacy in such a system. We could spend our efforts making the current financial system be private and secure instead of doing nothing to fix that problem and utilize a system that gives them even more insight into your finances.
> One can be anti-extrajudicial government powers and anti-cryptocurrency.
The trouble is the current system has failed to provide both of those at once, because of the incentive structure. The government wants the intermediary to try to prevent fraud or crimethink or what have you so they don't punish the intermediary for making mistakes, so the intermediary makes lots of mistakes and tramples over innocent people without enough power to push back.
In other words, it's much harder to have a non-abusive intermediary than to remove the intermediary.
> For one, I believe cryptocurrencies will be the government spyware of the future that allows for total control over our finances, since crypto evangelists push “code is law” and the blockchain is immutable and completely transparent.
This is true of Bitcoin to an extent but there are at least two ways to avoid this.
One, you can have a blockchain that doesn't work like that but still doesn't subject you to the whims of an intermediary, e.g. Monero.
Two, a blockchain like Bitcoin is completely transparent, but it also doesn't tie your name to your wallet or limit you to a single wallet, so in principle you can use a separate wallet for separate activities without any way for someone with only access to the blockchain to correlate them. Apps could make this easier to automate.
You could also just use a third party payment processor which is only using cryptocurrency for settlement between payment processors, so the individual transactions aren't being recorded on the blockchain. But then you don't have to, which is important to give the user leverage over the intermediaries, and make it easy to switch because anyone who accepts Bitcoin accepts Bitcoin regardless of whether or which intermediary is used to send or receive it.
The solution to, the government is extrajudicially spying on our transactions and without due process blocking people from their finances, is not a system that uses an immutable public ledger. It’s reforming the law to protect people’s right to privacy and due process.
Ignoring solving the problem of the government not respecting people’s right to privacy and due process is not acceptable to me in my opinion. Saying Ethereum has private transactions isn’t any different than telling someone whose bank account was frozen without due process that they should just use cash.
Imagine being fascinated by a JVM stack trace instead of frustrated that your online banking services are partially unavailable because of a software error.
In OO languages (Java, C#, C++, etc...), and functional ones (F#, Haskell, OCaml, etc...) types do not validate the correctness of logic, they evaluate the correctness of data structures and their access/mutation as they are manifested in the language.
OO languages consider data correctness as access patterns of encapsulated primitives (or other data structures) through defined behaviors on a "class".
Functional languages consider data correctness as access patterns which preserve previous versions of data that a caller, or another part of the system, may still need to reference. Functional languages (in most cases) disallow direct mutation in favor of some sort of immutability or persistence.
Types have little (or nothing) to do with program correctness, or data correctness as it relates to external storage engines and their concurrency guarantees (i.e. FoundationDB, PostgreSQL, MySQL, etc...).
Therefore, in this scenario, what matters most is the correctness of the underlying storage (including the rigor of validating expected behavior) and the event-sourcing/messaging systems, not the internal data structures and their idioms conveyed by the language.
> In OO languages (Java, C#, C++, etc...), and functional ones (F#, Haskell, OCaml, etc...) types do not validate the correctness of logic, they evaluate the correctness of data structures and their access/mutation as they are manifested in the language.
Nonsense. Types validate whatever you use them to validate, which can certainly include what we usually call "logic".
> Types have little (or nothing) to do with program correctness
On the contrary, they're still the most effective technique we've found for improving program correctness at low cost.
> Types validate whatever you use them to validate, which can certainly include what we usually call "logic".
I'd love an example of this! I concede that I could be wrong on the points of ML/Haskell families, however, it relies on the practitioner correctly using the type system to the extreme (at least, that is my impression). C++ and other similar OO's, the type system isn't as compelling as a correctness measure.
> On the contrary, they're still the most effective technique we've found for improving program correctness
In which domain are you working where this has been the case? It may be my experience, but types as I have seen them used in industry have been more as "data containers with some behaviors".
I'd appreciate some examples of where you think I may be getting types wrong or missing the point.
> I concede that I could be wrong on the points of ML/Haskell families, however, it relies on the practitioner correctly using the type system to the extreme (at least, that is my impression).
I wouldn't say it's "extreme", it's very normal and natural. You just stick everything in the types and it works.
> C++ and other similar OO's, the type system isn't as compelling as a correctness measure.
> In which domain are you working where this has been the case? It may be my experience, but types as I have seen them used in industry have been more as "data containers with some behaviors".
> I'd appreciate some examples of where you think I may be getting types wrong or missing the point.
I've worked in "regular industry" and found types to be very effective. Let me turn it around: what kind of logic errors are you seeing that you think wouldn't be eliminated by using types? Types can't help you avoid errors in the specification itself, and there are a few domains where they may not yet be practical (mainly math-heavy things like linear algebra, where there's centuries' worth of mathematics that's applicable except where it isn't, and we just don't capture all of that in practical type systems yet), but the vast majority of the time you can construct your types in ways that force your logic to be correct because you just don't offer the ability to do the wrong thing.
> I wouldn't say it's "extreme", it's very normal and natural. You just stick everything in the types and it works.
Ah, then I seem to be missing the point/intention. Thanks for illuminating that it should feel "natural". I think I need to spend more time with ML/Haskell families.
> ... the vast majority of the time you can construct your types in ways that force your logic to be correct because you just don't offer the ability to do the wrong thing
I think that is where I haven't had experience, nor much exposure, when it comes to "static typing" in the true sense/intention of making a program correct. More often (which I think I loosely alluded to), the type system was being used to "model objects in the real world".
> what kind of logic errors are you seeing that you think wouldn't be eliminated by using types?
Classic "logic" stuff. Forgetting to modify and return a map given some other information. Accidentally returning the inverse of a boolean (i.e. !isSomething(x) vs isSomething(x)), incorrect adds or bit shifts, concurrent access to shared data structures (shared memory may be a more apt term), reconciling two different pieces of data into a shared one, algorithmic implementations (though I think you touched on this being less likely when using a type system designed to encourage correct algorithms by applying type theory, so it may be moot).
To maybe give some clarity to what I mean by correctness, I mean does the program match the expected behavior of the programmer. In playing around with F#, I've written buggy F#, but the types all matched up. I had a guarantee that my program would run in a non-faulty manner (bar any system fault), but the program was not correct.
> and found types to be very effective
I'm very curious where you've seen this be very effective (if you're able to share), and with what language/technology.
---
Also, as a last quip, I am enjoying learning about your perspective on type systems. There are some points you have brought up which have caused me to think harder and in more depth about the concept and its application, and I want to voice that I appreciate you putting in the time to have this exchange. Wanted to wrap this up before I forget, since it is getting late in my neck of the woods.
> Classic "logic" stuff. Forgetting to modify and return a map given some other information. Accidentally returning the inverse of a boolean (i.e. !isSomething(x) vs isSomething(x)), incorrect adds or bit shifts
If something really is just a map or a boolean or an integer then you can't avoid this kind of thing. But usually it isn't; it's something meaningful in your domain, and then you can make and enforce the right distinctions. CachedResults is not the same as FullyPopulatedResults (and neither is just a Map); rather than x and a flag for whether x isSomething, how about an Either XThatIsSomething XThatIsNotSomething? Adds and bitshifts cover a wide range of operations, and if you're doing deep numerical work then that's one place where I've found that existing type systems often can't keep up, but a lot of the time you're in a domain where you don't need to do that - e.g. if you only need to add values then you can use a type that wouldn't even allow you to subtract them (indeed most of the time I see people using integers they're opaque IDs that it never makes sense to do anything mathematical with - it would be nonsense to multiply a user ID by a permission group ID, so why expose them as integers?).
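To make the opaque-ID point concrete, here's a minimal sketch in Scala (UserId, GroupId and grantAccess are names invented just for this illustration, not from any real codebase):

object IdExample {
  // Wrapping the raw Int means the compiler rejects nonsense like swapping
  // arguments or doing arithmetic across different kinds of IDs.
  final case class UserId(value: Int) extends AnyVal
  final case class GroupId(value: Int) extends AnyVal

  def grantAccess(user: UserId, group: GroupId): Unit =
    println(s"granting user ${user.value} access to group ${group.value}")

  def main(args: Array[String]): Unit = {
    grantAccess(UserId(7), GroupId(3))
    // grantAccess(GroupId(3), UserId(7))  // would not compile: arguments swapped
    // UserId(7) * GroupId(3)              // would not compile: no arithmetic exposed
  }
}

The wrappers are value classes, so they mostly erase to plain Ints at runtime; the distinction exists only for the compiler.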
> I'm very curious where you've seen this be very effective (if you're able to share), and with what language/technology.
Various industries (finance, adtech, messaging), largely "backend" whether direct API backends or more batchy/offline "big data" processing, but also even web frontends. Mostly Scala.
> If something really is just a map or a boolean or an integer then you can't avoid this kind of thing. But usually it isn't,
No matter how much you abstract out and dress up a boolean, at its heart it's still a boolean value, and I don't see how making a custom type based on a boolean would prevent the bug of returning the inverse of that boolean.
Most of the time a "boolean" isn't a boolean - it's a flag to indicate one of two different things, which is something you can represent more directly.
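For example, a hypothetical sketch in Scala (Role, Admin and Viewer are made-up names): instead of passing an isAdmin: Boolean around, name the two states directly.

object RoleExample {
  sealed trait Role
  case object Admin  extends Role
  case object Viewer extends Role

  // There is no `!role` to accidentally invert, and if a third role shows up
  // later the compiler flags every match that doesn't handle it.
  def render(role: Role): String = role match {
    case Admin  => "show the edit controls"
    case Viewer => "read-only view"
  }
}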
You brought up the examples of doing the wrong thing with booleans. With a good type system you can cut down on needing to use booleans in the first place for a lot of things. Your isSomething(x) example is a good one: presumably some code after this is implicitly relying on this predicate being true about x. If you forget to do the check, or you invert the check, then that's a bug.
But another way to do this is to encode the predicate into the type system, so the compiler makes you get it right. Concretely, suppose x is a string and you need to check that x is a valid username before invoking username-related code on it. Then you can have a function like:
fn as_username(x: String) -> Optional[Username]
A Username here is a thin wrapper (a newtype or opaque alias) around String rather than a transparent alias: the runtime representation is essentially the same, with little to no overhead, but the compiler treats the two types as distinct. You put the parsing/validation logic inside that function. Then code expecting a username will take a value of type Username rather than String. If as_username is the only function that can produce a Username, then having a value of type Username is proof that as_username was already called at some point previously and gave its blessing to the underlying string, so that the Optional could be unpacked.
match as_username(raw_string) {
    // compiler forces us to handle both cases, ie we can't forget
    // to check validity
    case Some(username: Username) {
        // code in here can assume we have a proper username
    }
    case None {
        // handle what to do otherwise
    }
}
Sure, you have to write the as_username function correctly, there's no getting around that. But you only have to get it right once.
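For what it's worth, here's roughly what that pseudocode looks like as real Scala; the validation rule itself is invented just for the example:

object UsernameExample {
  // The private constructor means Username.parse is the only way to obtain
  // a Username, so holding one is proof the check already happened.
  final class Username private (val value: String)

  object Username {
    def parse(raw: String): Option[Username] =
      if (raw.nonEmpty && raw.length <= 20 && raw.forall(_.isLetterOrDigit))
        Some(new Username(raw))
      else
        None
  }

  // Downstream code asks for the type instead of re-validating a String.
  def greet(u: Username): String = s"hello, ${u.value}"

  def main(args: Array[String]): Unit =
    Username.parse("m4dh4tt3r") match {
      case Some(u) => println(greet(u))                // can assume a proper username
      case None    => println("not a valid username")  // compiler makes us handle this
    }
}

The idea is to do the parsing once at the boundary; everything downstream gets the guarantee from the type.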
> Are you familiar with the idea of "making illegal states unrepresentable", and "parse, don't validate"?
I haven't heard of those concepts/ideas before. Thanks for linking the article defining them. With your example, and the article's point that "parsing should take place at the boundaries" (paraphrasing), I can see how types (a la the ML families) can be defined and composed to give internal coherence once an external input has been parsed and hence validated.
Really interesting approach which I haven't considered before!
Agree that you can use types to express and prove logical properties via the compiler; it can be a fun way to solve a problem, though too much of it tends to frustrate coworkers in JavaLand. It's also not exactly "low cost"; here's an old quip I have in my quotes file:
"With Scala you feel smart having just got something to work in a beautiful way but when you look around the room to tell your clojure colleague how clever you are, you notice he left 3 hours ago and there is a post-it saying use a Map." --Daniel Worthington-Bodart
> On the contrary, they're still the most effective technique we've found for improving program correctness at low cost.
This is not borne out by research, such of it as exists with any quality: https://danluu.com/empirical-pl/ The best intervention for improving correctness, if it's not already being done, is code review: https://twitter.com/hillelogram/status/1120495752969641986 This doesn't necessarily mean dynamic types are better, just that if static types are better, the advantage isn't large enough to show up clearly in studies, unlike the benefit of code review.
My own bias is in favor of dynamic types, though I think the way Common Lisp does it is a lot better than Python (plus Lisp is flexible enough in other ways to let static type enthusiasts have their cake and eat it too: https://github.com/coalton-lang/coalton), and Python better than PHP, and PHP better than JS. And I prefer Java to PHP and JS. Just like not all static type systems are C, not all dynamic type systems are JS. Untyped langs like assembly or Forth are interesting, but I don't have enough experience with them to say.
I don't find the argument that valuable though, since I think focusing on dynamic vs static is one of the least interesting division points when comparing languages or practices, and may be a matter of preferred style, like baseball pitching, more than anything. If we're trading experience takes, I think Clojure's immutable-by-default prevents more bugs than any statically typed language that is mutable by default; that is, default-mutable vs default-immutable (and this goes for collections too) is a much more important property than static vs dynamic types. It's not exactly a low-cost intervention though, and when you really need to optimize you'll be encouraged by the profiler to replace some things with Java native arrays and so on. I don't think changing to static types would make a quality difference (especially when things like spec exist to get many of the same or more benefits), and it would also not be a low-cost intervention.
Some domains also demand tools beyond type proofs, e.g. things like TLA+. I think adding static types on top of that isn't very valuable, just as adding static types on top of immutable-by-default isn't very valuable.
Last quip to reflect on. "What's true of every bug found in the field? ... It passed the type checker. ... It passed all the tests. Okay. So now what do you do? Right? I think we're in this world I'd like to call guardrail programming. Right? It's really sad. We're like: I can make change because I have tests. Who does that? Who drives their car around banging against the guardrail saying, "Whoa! I'm glad I've got these guardrails because I'd never make it to the show on time."" --Rich Hickey (https://www.infoq.com/presentations/Simple-Made-Easy/)
Your point about underlying data structures is spot on, in my view.
> Last quip to reflect on. "What's true of every bug found in the field? ... It passed the type checker. ... It passed all the tests. Okay. So now what do you do? Right? I think we're in this world I'd like to call guardrail programming. Right? It's really sad. We're like: I can make change because I have tests. Who does that? Who drives their car around banging against the guardrail saying, "Whoa! I'm glad I've got these guardrails because I'd never make it to the show on time."
I've been avoiding bringing up this point/example, but this is what prompted me to start thinking more deeply about types and their tradeoffs. Great talk, IMO.
Quips are not a good basis for decision-making. Immutability is valuable but there is no conflict between immutability and types; you should use both. As for "bugs found in the field", having switched to Scala I pretty much don't encounter them.
I wish people would stop bringing up Nubank when it comes to Clojure/FP as if it's some sort of mic drop that proves that it's the Right Way to develop financial software. When you have to keep bringing up the same example, it's not a very good argument.
There are always going to be niche languages and mainstream languages, and as a result, users of niche languages will always have to "justify" their use of them. To investors, to potential new users, to potential colleagues.
They're not having a bash at $your_favorite_language. Or suggesting that Nubank couldn't have been built on any other platform. They are simply responding to the oft-stated question "are there any examples of it working at scale".
What do you care whether it's "the right way" to develop any kind of software? "Clojure" was in the title. If you don't like Clojure, skip along to the next article.
Sure, I get that they have to justify themselves, but when the answer to that question is (almost) always "Because Nubank!" for the past 10 years it doesn't feel like much of a justification.
In Clojure you can express arbitrary data constraints and pre/post conditions (design by contract) with spec or malli.
Clojure is also strongly typed and has (optional) type hints, which are understood by linters during dev time and help the compiler avoid reflection and boxing.
It’s a very robust language that people use for critical stuff.
https://youtu.be/C-kF25fWTO8?si=PnjMNLdBLJ8zqSu-