leriksen's comments | Hacker News

That's not what he means when he says multi-cloud.

He means his product supports multiple clouds, through Terraform providers, for example.

Deploying the same infra across multiple clouds is a different thing; it's not a Hashi thing.


> Deploying the same infra across multiple clouds is a different thing; it's not a Hashi thing.

Not sure about this. Here is what the official site says:

"Provisioning infrastructure across multiple clouds increases fault tolerance, allowing for more graceful recovery from cloud provider outages."

https://www.terraform.io/use-cases/multi-cloud-deployment


Terraform supports different clouds but with completely different syntax. The marketing on the website just informs you that you can use any cloud you want. The message is aimed against single-cloud tools such as CloudFormation (AWS only), Deployment Manager (Google only), ARM (Azure only) and so on.

To give you an analogy, it would be like Firefox saying that they are "multi-OS", meaning that you can install Firefox if you have Windows and you can install Firefox if you have Linux. It doesn't mean that you must or should have Linux and Windows at the same time as a user.


Strictly incorrect. The syntax is the same for every cloud - either HCL2 or JSON. What you are calling syntax is actually the resource model. Every project which claims to bridge this resource model ends up implementing a lowest common denominator time sink which doesn’t stand up to basic scrutiny.

Your analogy also does not demonstrate whatever it is you seem to think it does - Firefox is indeed multi-OS.


It is ok. Parent already said that they mean something different by "multi-cloud": https://news.ycombinator.com/item?id=38649796


The definition of multi-cloud does not matter when talking about what syntax is.


> The marketing on the website just informs you that you can use any cloud you want.

Disagree. The message says “fault tolerance”, meaning you deploy the same things in different clouds. Which brings me back to my original post - TF does not help with this; as jen20 said, the resource models are very different.
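
To make the "very different resource models" point concrete, here is a minimal Python sketch. The attribute names are deliberately simplified (loosely inspired by the aws_instance and azurerm_linux_virtual_machine resources, not their real schemas) and only illustrate why a cross-cloud abstraction tends to collapse to a lowest common denominator:

    # Illustrative only: simplified argument names, not the full provider schemas.
    aws_instance = {"ami", "instance_type", "subnet_id", "user_data", "tags"}
    azure_linux_vm = {"source_image_reference", "size", "network_interface_ids",
                      "custom_data", "admin_username", "tags"}

    # A "cloud-agnostic VM" module can only expose what both models share directly.
    common = aws_instance & azure_linux_vm
    print(common)  # {'tags'} - the lowest common denominator

    # Everything else needs hand-written per-provider mapping code,
    # which is where the time sink lives.
    print(sorted((aws_instance | azure_linux_vm) - common))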


The brain is analog and chemical, AI will be digital and silicon. We have no idea how to map from one to the other.


> The brain is analog and chemical, AI will be digital and silicon.

Says who?

Sure, if you assume that “AGI is just scaling up GPT”, it will be digital and silicon. But that’s a big assumption.

For all we know, AGI will only ever, if it exists, be analog and chemical.

> We have no idea how to map from one to the other.

Plus, even if we had an easy one-to-one mapping function between them, we don’t understand the source well enough to do the mapping.


It does not need to be, but today the computers we use are overwhelmingly based on silicon. Also OP mentioned AI, not AGI.


An analog-to-digital and digital-to-analog converter costs less than a cup of coffee in some places [1].

[1] https://protosupplies.com/product/pcf8591-a-d-and-d-a-conver...


Oh god, my background is CE/ECE stuff and you managed to trigger me. I don't want to be rude... just bluntly saying you triggered me. Doing something really small for A/D and D/A with 8 bits and not worrying much about resolution and data loss is one thing. At massive scale the problem is a lot less trivial and a lot more mathematical.
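
For a rough sense of what "resolution and data loss" means here, a tiny Python sketch (illustrative numbers only, plain standard library) comparing worst-case quantisation error at 8 bits versus 16 bits:

    import math

    def quantize(x, bits, lo=-1.0, hi=1.0):
        # Map a value in [lo, hi] onto one of 2**bits discrete levels and back.
        step = (hi - lo) / (2 ** bits - 1)
        return lo + round((x - lo) / step) * step

    # A 1 kHz sine sampled at 48 kHz stands in for the "analog" signal.
    samples = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(480)]

    for bits in (8, 16):
        err = max(abs(s - quantize(s, bits)) for s in samples)
        print(f"{bits:2d}-bit worst-case error: {err:.6f}")  # ~0.004 vs ~0.000015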


Haha, sorry, it was more of a tongue-in-cheek reply to "We have no idea how to map from [analog] to [digital]".


Is the Quantum computer hypothesis dead?


I don't see how quantum computers are relevant? We can't build them, and there certainly isn't any interesting quantum computation in the brain.


What do you mean we can’t build them?

To your second point, we have little to no ability to understand yet what quantum effects may or may not be active in brain/consciousness function. We certainly can’t exclude the possibility.


I mean lots of people have tried to build quantum computers, and so far no one has succeeded in building anything describable as a "computer" rather than "half a dozen logic gates". Perhaps in the future.

We can fairly well exclude the possibility of interesting quantum effects in human consciousness, because the human brain is a hot, dense environment that might as well have been literally designed to eliminate the possibility. It's the exact opposite of how you want a quantum computer to be built.

Which doesn't mean there aren't plenty of quantum effects involved in the molecular physics, but that isn't what is normally meant by 'quantum computer'. Transistors would also meet that definition.


I do not agree with either of your assertions:

1) That 433 qubits does not make a computer and is instead “a half dozen logic gates.” I agree a half dozen logic gates is not a computer. 433 qubits is not comparable in terms of information capacity or processing capacity to a half dozen logic gates. This number is also publicly doubling annually now — I would bet the systems we don’t know about are more complex. Importantly, a computer in this context is not something you would attach a monitor to — it is just an electronic device for storing and processing data.

2) That we have any good idea of the limits of how biological systems might be influenced by quantum effects within specific temperature ranges. You certainly wouldn’t construct a human brain to interface with quantum effects given the present state of our knowledge in constructing these kinds of systems. But then we can’t even construct a self-replicating cell yet, never mind a brain. It’s hard to imagine we understand the limits at work there.


I believe this is still taught in motorsport cabling courses


I have a Samsung S34J55x and a 4-port DisplayPort KVM from Lindy. I have 3 laptops connected, all different makes (Lenovo, Dell, MS), and they generally take 1-2 seconds to switch between machines. Maybe the key is the video interface, e.g. HDMI vs DVI vs DP - that would be an interesting experiment.


That's not a problem caused by git, it's a problem caused by undisciplined developers.


Great article, but I was also struck by how much the picture looks like a 2020 COVID office - drawn 4 years ago!


I couldn't disagree more. He wants privacy; I don't, at least as far as job opportunities go.


I love videos, as long as the presenter is organised and speaks clearly


Work out what your enemy considers an uncrossable boundary - then cross it. Gets them every time.

Source - Art of War, probably...


The first useful programs ran on knitting mills and carpet machines around 1800 - there was no conditional logic, just a long loop of what were effectively print statements. See Jacquard looms on Wikipedia - https://en.wikipedia.org/wiki/Jacquard_loom#/media/File:Jacq...

The first real program is often attributed to Countess Ada Lovelace, describing operations of Babbage's Analytical Engine in 1843. She even had a bug in her "code". https://en.wikipedia.org/wiki/Ada_Lovelace#First_computer_pr...
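
A toy Python rendering of the "no conditional logic" point - entirely illustrative (real cards encoded warp-thread lift patterns, not characters): the card chain is a straight-line sequence of fixed outputs, with no branches and no state.

    # Each "card" is a fixed row of holes (1) and blanks (0), read in order.
    cards = [
        0b10011001,
        0b01100110,
        0b10011001,
        0b01100110,
    ]

    for card in cards:  # the "long loop"
        row = format(card, "08b")
        # Each card amounts to a print statement with a hard-coded pattern.
        print(row.replace("1", "#").replace("0", "."))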


"there was no conditional logic, just a long loop of effectively what were just print statements"

So what you're saying is, Jacquard beat Adobe to PDF by nearly 200 years?

