
Absolutely.

This is also a typical approach among the chefs I know: they don't care about precision in most recipes (e.g. soups, pasta, salads...), but there are dishes where precision is absolutely crucial, and baking is one area where it really matters.

With sourdough, if you don't measure, you may still get good results, but you will have to babysit the dough and figure out when it's ready by checking frequently. Some people can afford that time-wise; for others it would be prohibitively inconvenient.


The big events that shatter everything to smithereens aren't that common or even that dangerous: most of the time you lose something, revert, and move on.

The real, unmitigated danger of unchecked pushes to production is the velocity with which they generate technical debt. Shipping something implicitly promises the user that the feature will live on for some time, and that removal will be gradual and may require a substitute or compensation. So if you keep shipping a half-baked product over and over, you'll drown in features you wish you'd never shipped, your support team will be overloaded, and eventually the product will become such a mess that developing it further is too expensive or simply too difficult. Then you'll have to spend a lot of money and time doing it all over... and it's entirely possible you won't have that much money and time.


Yeah... The article doesn't even attempt to answer the question in its title. It's just a watered-down Intro to Mathematics 101.


Oh, you opened a can of worms... In terms of user experience, Android is garbage. It forces features on you that you cannot remove unless you break into the system (which is kinda illegal or, at a minimum, voids your warranty).

Stuff like "do not disturb", which turns on accidentally, makes me miss calls, and is impossible to remove. It's impossible to remove a bunch of trash from the lock screen, and with some workarounds only the picture is removed while it stays interactive or interferes with other widgets, like the audio player. The lock screen randomly tries to dial numbers, especially if I don't answer an incoming call. It also takes screenshots randomly, so after almost every use I have to spend time deleting them.

Now, when it comes to the subject of the OP, it's not really about Android, it's about Google's policies around developers and the app store. The whole idea behind Android is very similar to MS Windows: oppress the user because the system provider "knows better". Make choices on the user's behalf, prevent users from doing useful things just to blanket-"secure" them against some imaginary threat. Manipulate users into doing things that are harmful to them but beneficial to the system provider.

So the app store managed by Google is one example of such policies. Google doesn't have the user's best interest in mind. They maliciously comply with regulations meant to stop them from abusing their users. They check the applications submitted to the app store, but they check them for the wrong things. Just to say they did.

I ended up using an FTP server app and a file manager from F-Droid, because everything offering the same functionality in the app store is atrocious, predatory trash. It doesn't matter that I can afford to buy an app; whatever I tried was just garbage. Once you get used to the freedom and the approach of free software, after spending some time with e.g. Linux, using Android makes your blood boil because of how hostile both the system and the programs written for it are.


The way I understand parent is that such a type would be too broad.

The bigger problem is that the type system expressed through hints in Python is not the type system Python actually uses. It's not even an approximation. You can express things in the hint type system that are nonsense in Python, and write Python that is nonsense in the type system implied by the hints.
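
To illustrate (a toy sketch of my own; the function and names are made up): hints and the runtime live in different worlds.

    from typing import Callable

    def double(n: int) -> int:
        return n * 2

    # Nonsense per the hints, fine at runtime: annotations are never checked.
    print(double("ha "))  # prints "ha ha "

    # Meaningful per the hints, invisible at runtime: Python keeps no
    # notion of "function from int to int", only "function".
    f: Callable[[int], int] = double
    print(type(f))  # <class 'function'>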

The type system introduced through the typing package and the hints is a tribute to a stupid fashion. But, also, there is no syntax and no formal definition to describe Python's actual type system. Nor do I think it's a very good system, not to the point that it would be useful to formalize and study.

In Russian, there's an expression "like a saddle on a cow"; I'm not sure what the equivalent in English would be. It describes a situation where someone is desperately trying to add a desirable feature to an existing product that is ultimately not compatible with that feature. This, in my mind, is the best description of the relationship between Python's actual type system and the one from the typing package.


> In Russian, there's an expression "like a saddle on a cow", I'm not sure what the equivalent in English would be

“To fit a square peg into a round hole”


Close, but not the same. In Russian, the expression implies an "upgrade": a failed attempt at improving something that either doesn't need improvement or cannot be improved in this particular way. A typical example of how it's used: "I'm going to be a welder, I need this bachelor's degree like a saddle on a cow!"


"Lipstick on a pig"? Although that's quite more combative than the Russian phrase.


Yeah... this seems like it would fit the bill nicely. At least, this is the way I'd translate it if I had to. Just didn't think about it.


Same nonsense repeated over and over again... There's no such thing as a "dynamic language". The static types aren't what you think they are... You just don't know what you're saying, and your conclusion is word salad.

What happened to Python is that it used to be a "cool" language, whose community liked to make fun of Java for its obsession with red tape, which included a love of specifying unnecessary restrictions everywhere. Just like you'd expect from a poorly functioning government office.

But then everyone wanted to be cool, and Python was adopted by the programming analogue of government bureaucrats: large corporations that treat programming as a bureaucratic mill. They don't want fun, creativity, or one-off bespoke solutions. They want an industrial process that works at as large a scale as possible, employs thousands of the lowest-quality programmers, and still reliably produces slop.

And incrementally, Python was made into Java. Because, really, Java is great for producing slop on an industrial scale. But the "cool" factor was important for attracting talent when there was a shortage, so now you have a Python that was remade into a Java. People who didn't enjoy Java left Python over a decade ago, so Python today has nothing in common with what it was when it was "cool". It's still a worse Java than Java, but people don't like to admit defeat, and... well, there's also the sunk cost fallacy: so much effort has already been spent on making Python into a Java that it seems like a good idea to waste even more effort trying to make it a better Java.


Yeah, this is the lens through which I view it. It's a sort of colonization that happens when corporations realize a language is fit for plunder. They start funding it, then they want their people on the standards boards, then suddenly the direction of the language matches their product roadmap very nicely. Meanwhile, all the people who made the language what it was are bought off or pushed out, and the community becomes something else entirely.


We kind of already have groups in Gnus... I even messaged one group, like twice in my life.


Sort of. There's Org for Vim users :)


The problem is that the DOM is absolutely inadequate for describing page layout, and even less adequate for Web applications. Incremental changes to the DOM were meant to make it more suitable for this goal, but the inherently bad foundation didn't exactly help.

I believe that some sort of constraint language would've been a lot better at describing page layout. And stuff like Web applications simply shouldn't exist: these should be applications that use native UI toolkits while relying on the Internet for moving data, not presentation.
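
To give a sense of what I mean, here's a toy sketch using the kiwisolver package (a Cassowary constraint solver, the same algorithm behind Apple's Auto Layout); the variables and numbers are invented for illustration:

    from kiwisolver import Solver, Variable

    window = Variable("window_width")
    sidebar = Variable("sidebar_width")
    content = Variable("content_width")

    solver = Solver()
    solver.addConstraint(window == 800)                # viewport size
    solver.addConstraint(sidebar + content == window)  # panes fill the window
    solver.addConstraint(sidebar >= 150)               # sidebar minimum
    solver.addConstraint(content >= 2 * sidebar)       # content dominates
    solver.addConstraint((sidebar == 200) | "weak")    # preferred, not required

    solver.updateVariables()
    print(sidebar.value(), content.value())  # 200.0 600.0

You declare relationships and the solver finds the sizes: no cascade, no float hacks, no guessing how the engine will interpret your markup.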

Users are usually unhappy with Web applications because of the way browsers restrict useful functionality, and programmers struggle with workarounds and browser bloat to accomplish simple things.


So instead of writing an app once, I can write it ten times, once for each native platform.

I AM SOLD


Again and again, the most important question is "why?", not "how?". Python wasn't made to be fast. If you want a language that can go fast, speed has to be built in from the start: give developers tools to manage memory layout, give them tools to manage execution flow, let them hint the compiler about situations with optimization potential, restrict dispatch and polymorphism, restrict semantics to fewer interpretations.
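
Consider how little a compiler can assume (a toy example of mine): even a + b is an open-ended dynamic dispatch that any class can intercept.

    class Sneaky(int):
        def __add__(self, other):
            return "surprise"

    def add(a, b):
        return a + b  # int addition? string concatenation? anything at all

    print(add(1, 2))          # 3
    print(add(Sneaky(1), 2))  # surprise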

Python has none of that. It's a hyper-bloated language with extremely poor design choices all around: many ways of doing the same thing, many ways of doing stupid things, no way of communicating the programmer's intention to the compiler... So why even bother? Why not use a language designed by a sensible designer for this specific purpose?

News about performance improvements in Python just sounds to me like spending useful resources on useless goals. We aren't moving forward by making Python slightly faster and slightly more bloated; we're just making this bad language even harder to get rid of.


The frustrating thing is that the math and AI support in the Python ecosystem is arguably the best. These also happen to be areas where performance is critical and where you want things to be tight.

C++ has great support too, but often isn't usable in communities involving researchers and juniors because it's too hard for them. Startup costs are also much higher.

And so you're often stuck with Python.

We desperately need good math/AI support in languages that are faster than Python but easier than C++. C#? Java?


It is kind of ironic that this is now the Zeitgeist, while in the 1990s my university taught C++ to first-year students, and I learned it as a high school student with Turbo C++ 1.0 for MS-DOS, about a year after it became commercially available, later acquiring Turbo C++ 1.5 for Windows 3.1 with a student discount.

