flywheel's comments

Whoever the first developer was that used "deprecated" got it kind of wrong, the word should have been "depreciated".

Deprecate: "express disapproval of."

Depreciate: "diminish in value over a period of time."

I kind of cringe when other developers say "deprecated".

Edit: Versioning and not removing APIs is kind of the way to go, so you don't break client apps that possibly can't be updated easily or at all. "Depreciated" is a far better word to use with a far better outcome. AWS versions their APIs, they don't remove old ones. "I disapprove of using this API and we're taking it away at some random date" vs "this isn't the latest API, use the current one for new development" seems like a pretty stark difference in thinking to me. YMMV.


Nope.

It is deprecated -- its use is disapproved of; you should stop using it. In the future it will go away, but for now it works, so you can use it, though its use is discouraged.

Depreciated doesn't make any sense -- the value of the deprecated API does not diminish over time. It works, until it stops working. It's on or off. It doesn't work less and less every month or anything. It currently still works completely, but is deprecated -- that is, discouraged. At some point in the future, it will stop working, completely.

The rest of us don't just kind of but REALLY cringe when people say "depreciate" when they mean "deprecate". They are different words; "deprecated" is the right one. It is intentional. It is the word.

Sorry, you are the one using the wrong word.


Yeah, nope yourself. It seems like a lot of people aren't really thinking this through very much.

And that is absolutely the wrong way to approach API development. An API that is being sun-setted should never be removed, because older clients could still use it but sometimes can't be upgraded to newer clients. Removing a v1 API breaks those clients and it's a shitty thing to do to users. Yeah, people should be building NEW things with it, but there's no reason to look at the v1 API with "disgust" as "deprecated" implies - It's simply an older version that should remain functional, if your system is worth half a shit. AWS doesn't terminate older API versions, they just create new versions. Or you can be like Facebook and "deprecate" stuff and just shut it down before your official shutdown date, or not give any notice at all - that's REALLY a fun culture to work in, I guess, for them. "deprecated" is a really negative word, and doesn't even really translate to anything good in terms of software development. It's my opinion that "depreciated" is a far better word and far better outcome when used in software development instead of "deprecated". YMMV.


OK, I understand you have an opinion that API design should be done in a certain way (by the way, by "API" I meant like method signatures, not network API, but it could be either).

And I understand you disapprove of the word "deprecated" being used to refer to API that is discouraged, usually because it will be no longer supported/going away in the future.

But that doesn't change the history of the word. The word "deprecated" is what engineers have been using, intentionally, for several decades.

"Depreciated" is a mistaken variation. Even if you think "deprecated" has unfortunate connotations, it still doesn't make "depreciated" right. "Depreciated", as you said, means losing value over time. That is, 10% a year or something. Deprecated API does not "lose value over time".

The word "deprecated" has historically been used to mean that a certain API (again, likely a method or function, I don't mean network API specifically) is now discouraged, its use disapproved of, usually because it will be going away in the future. Arguments about whether this is the right way to do API change are entirely separate from this historical and current usage, where API change often IS done this way, and it's what the word is used for.

You can have opinions about how you'd like people to handle API change over time, but that doesn't change the fact that "deprecated" is the word engineers have meant to use for decades. If you'd like to advocate for a different word and/or different practice you can -- but all "depreciated" has going for it is that it sounds confusingly similar to "deprecated". It is not the word you are looking for.

> Not to be confused with Depreciation.

> In several fields, deprecation is the discouragement of use of some terminology, feature, design, or practice, typically because it has been superseded or is no longer considered efficient or safe, without completely removing it or prohibiting its use.

> It can also imply that a feature, design, or practice will be removed or discontinued entirely in the future

https://en.wikipedia.org/wiki/Deprecation

> In accountancy, depreciation refers to two aspects of the same concept: first, the actual decrease of fair value of an asset, such as the decrease in value of factory equipment each year as it is used and wears, and second, the allocation in accounting statements of the original cost of the assets to periods in which the assets are used (depreciation with the matching principle)

https://en.wikipedia.org/wiki/Depreciation

> In economics, depreciation is the gradual decrease in the economic value of the capital stock of a firm, nation or other entity, either through physical depreciation, obsolescence or changes in the demand for the services of the capital in question.

https://en.wikipedia.org/wiki/Depreciation_(economics)

Depreciation has nothing to do with what we're talking about; it's not the right word. Deprecation is the word that has been used for decades for an API whose use is discouraged, often because it will not be supported in the future. You can argue that a new term is needed, but that's your argument, not historical usage, and there's no reason you need to limit yourself to words that sound confusingly similar to "deprecation".


You got a little repetitive there, but yes, I agree: "deprecated" means stop using this because it's going away.


I sure did!

Technically, it doesn't have to be because it's going away, although that is common. I think it always means there's a better recommended way to do the thing, but sometimes the deprecated way doesn't go away.


Deprecated is a word. No negativity implied, but equating "deprecated" to "depreciated" is insular. It is okay to be wrong; I have been wrong (George Foreman Grill dissenter..."it drips fat in the front? Gross."). I am not sure what your native language is, and I wish I did so we could communicate, but it is like calling a "warning" a "decay" in English (US).

Typo corrected.


Isn't deprecated actually correct here?

It means the feature still works, but will be removed in the future or is no longer supported. There may also be a new implementation of it that the developer would like you to use, hence the warning that it's deprecated.

Depreciation implies a rate of change over time, which isn't the case. Today we deprecate feature X, and in two years we plan to remove it. It never depreciates.


But "express disapproval of" is exactly the meaning intended when we say that a feature is deprecated. It signifies that it is best practice not to use it.


If it’s given as a warning then yes, e.g. the dplyr package in R sometimes outputs “feature xyz is deprecated and will be removed in version x.x”.

Often though it’s used when the feature is already removed, i.e., it’s not only best practice not to use it, but also impossible with that version.


In this case, depreciated is incorrect. Removal has already happened, the "period of time" is already over.


Removing APIs is not a great practice though. Look at AWS: they version their APIs, they don't just remove them, and removing them should be unnecessary if your underlying tech isn't brittle and badly written. "Depreciated" is a far better term to use, with a far better outcome, in my opinion. Companies that remove old versions of APIs and break existing client apps (that possibly can't be updated) really suck.


1) Whether you agree with the practice doesn't affect the terminology used. People remove APIs. Before doing that, they deprecate them for a period to advise people to move off of them.

2) If you were to always maintain backward compatibility, how is "depreciated" in any way an accurate term? If the old API continues to work indefinitely, its value stays the same.


I don't think these two are incompatible?

If APIv3 has a `/foo` endpoint that is deprecated, usually I take that to mean that the developers discourage its use, and likely plan to remove it in a future version (say, APIv4 or APIv5). `/foo` will never be removed from APIv3, because that would be a breaking change, and so if I'm willing to stay on v3 forever, that's fine, but in the (likely) event I will want to take advantage of new features at some point in the future, I'm doing myself a disservice by using /foo because it will make the migration harder.

There is at least one case where I think "deprecated" is clearly, inarguably, the right word: when the developer wants to remove a part of an API (say, because it is a large maintenance burden) but is also committed to stability, so they won't remove that API until some acceptably small number of users are using it.
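That pattern -- keep the old function working, warn on use, and measure how many callers remain -- can be sketched in a few lines. This is a minimal illustration, not any particular library's API; the function names (`deprecate`, `oldFoo`, `bar`) are hypothetical:

```javascript
// A deprecated function keeps working exactly as before, but logs a
// one-time warning and counts calls so the maintainer knows when usage
// has dropped low enough to remove it.
function deprecate(fn, message) {
  let warned = false;
  let calls = 0;
  const wrapped = (...args) => {
    calls += 1;
    if (!warned) {
      console.warn(`DeprecationWarning: ${message}`);
      warned = true; // warn only once, not on every call
    }
    return fn(...args); // still fully functional
  };
  wrapped.callCount = () => calls; // instrumentation for the maintainer
  return wrapped;
}

// Usage: callers see no behavior change, just a warning in the logs.
const oldFoo = deprecate((x) => x * 2, 'foo() is deprecated; use bar()');
```

Node.js ships a similar helper as `util.deprecate`, minus the call counting.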


Right, you version APIs and have a policy of deprecated APIs being removed in the next version. Or you can just copy and paste a comment calling people shitheads for politely disagreeing with you. Your call.


The “public APIs form an immutable, irrevocable contract” argument means that an API layer with these tenets is always going to be a source of technical debt. Get it right the first time, or fight an ever-growing compatibility maintenance war. Even when your instrumentation says that old APIs aren’t being used, just published, it seems like a footgun.


This is a jaw-droppingly arrogant attitude. You're trying to justify your own incorrect usage by asserting that the person who coined the term decades ago "got it kind of wrong"? And you cringe when others get it right?

"Depreciated" is absolutely the wrong term, because it implies that the value is less, when the intent is to communicate "this is still fully functional, but you are warned away from it because it is targeted for future removal." Deprecated.


Feels like I often see it used to retire APIs that are now understood to be unsafe, insecure, or otherwise a bad practice for some reason. It gets replaced with an API that does not inherently have that problem, and the old one is deprecated. It feels like "expressing disapproval of" is the right definition in that case. It's only there so a migration period can happen more gracefully, but its continued use is frowned upon, and not just because it will eventually be removed.


If you deconstruct the original Latin that forms the word, its literal translation is something like "ask to go away":

de == away

prek == ask


In French, we have an equivalent for "depreciate": déprécier. But we don't have a close relative for "deprecate" (which translates to "désapprouver, dénigrer" and would never be used for an API).

We tend to use the terms "déprécié" (~depreciated) or "déprécaté" (~deprecated but not valid French).

On the other hand, "deprecate" seems to also translate to "mark as obsolete" according to https://www.wordreference.com/enfr/deprecate

I guess both terms make sense but I would keep using "deprecated".


The first definition is intended and more fitting for the usages of "deprecated" I've encountered.


I used to always use "depreciated" until I was embarrassingly corrected one day :P


To be frank, grandparent sounds like someone who was corrected one day, and rather than learn something and move on, dug in and developed a detailed justification for why the rest of the world was mistaken so he can cringe about their ignorance.


You could not be more wrong if you practiced every day https://www.etymonline.com/word/deprecate#etymonline_v_29603


I appreciate the etymonline reference but I'm afraid you've been breaking the site guidelines quite badly in repeated comments such as this one and https://news.ycombinator.com/item?id=24101885. We ban accounts that do that because we're trying for a bit better than internet-default outcomes on HN. Would you mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the spirit of this site more to heart? We'd be grateful.


Prediction: Using the methods of "dorking", this is the only page on the internet among 10 million+ results that is calling this "dorking".


I hope it doesn’t catch on since it makes me die a little inside. It’s a very Reddit-type word though. I can easily imagine it being used by non-technical folk and tech journalists.



That's web scraping 101


Thanks for posting this - I've been down that road. I regularly have to parse about 20GB of JSON split up into 8MB JSON files - tried this library but was sad that it didn't help. I'm currently using threading in nodejs and that has helped quite a bit though, parsing up to 8 of those files at a time has given me quite a performance boost - but I always want to do it faster. Switching to using C just isn't really a viable option though.


You can write a small program in C just for the parsing part. Then call it from Node.


Amazing how nerds can be so focused on if it was 1 kiloton, 2 kilotons, or 3 kilotons - and they are arguing about it like it really matters.


A few hours before you made this comment, you made a nit-picky objection on a different HN post about Intel i5 versus Intel i7. Someone responded to you saying that i5 and i7 are marketing terms that most consumers interpret as an indicator of performance. You replied, "I guess this isn't really 'hacker news' then, because I would expect just about any 'hacker' would know the difference."

Then, you came to this HN post and described a bunch of people as "nerds" for trying to interpret the magnitude of the blast by observing what was captured on video.

The contrast between these two simultaneously held attitudes is so bizarre that I am chuckling to myself as I try to grasp your underlying thought process.

(You could easily have switched your approach to the two threads, criticizing the Intel i5-vs-i7 nerds while doing some napkin math to come up with your own estimate of the Beirut explosion in TNT terms.)


This whole thread is full of debate about diverse details about this event and things related to it, so yeah, it's perfectly normal to debate claims like a headline's subject. Nobody thinks it literally matters too much; they do it out of intellectual curiosity. It's not just a "nerd" thing.


Your comment seems a bit disingenuous - how much time passed between when you got the i7 and when you got the i5? Today's i5s can be faster than long-ago i7s, but today's i7s are still far faster than today's i5s.


Their comment isn't disingenuous - that's just how a regular, normal, non-hardware-nerd would most likely interpret Intel's branding. Bigger number = faster, as far as consumers are concerned. If you understand the i3/i5/i7/i9 tiers, you're in a small minority.


I guess this isn't really "hacker news" then, because I would expect just about any "hacker" would know the difference.


Maybe you missed the part where the program tells the user it's running in "CPU Mode" and may be slower, "Try running on a GPU" is another message that would make it clear.

So many comments here make it seem like we're dealing with people with a negative IQ.

Just show an appropriate message and let the user run the software. It's GREAT that there is a simple CPU fallback and it's enabled by default.

It's really an edge-case where a user would want to run the code and have it fail if there isn't a GPU - that is mostly an edge case of 1 user, the developer, except in very specialized situations where nobody would want to run it without a GPU, again an edge-case.


>Prediction: someone deploys this and doesn't notice it's incredibly slow for mysterious reasons, and then spends several hours to figure out that the right version of CUDA wasn't installed.

Why not just print out a message "Running in CPU mode" or "Running in GPU mode". There's no reason it has to be a mystery.
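A sketch of that suggestion: pick the backend once at startup and tell the user which mode is active, rather than failing silently or loudly. `hasGpuBackend` stands in for whatever probe the library actually performs; this is illustrative, not the library's real API:

```javascript
// Report the execution mode up front, then run either way.
function selectBackend(hasGpuBackend) {
  if (hasGpuBackend) {
    console.log('Running in GPU mode');
    return 'gpu';
  }
  // The fallback still works; the message removes the mystery.
  console.warn('Running in CPU mode (may be slower). Try running on a GPU.');
  return 'cpu';
}
```

One line of logging at startup is enough to save the "why is this mysteriously slow" debugging session described above.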

The comments here tell me that programmers have a narrow idea of what other programmers should/would/could do with their software.


>This is a bad thing, not a happy thing to be advertised. Software fallback was of the most frustrating parts of working with OpenGL circa 2010.

>If you're making a hardware accelerated library, stop trying to make it "gracefully" fail. Just fail! That way people can address the root problem.

No. I want my code to run locally sometimes (possibly with a GPU), and sometimes I run the same code across hundreds of EC2 instances or just 1 instance, and I have good reasons for that. Sometimes a machine doesn't have a GPU, but I still want that code to run on it.

The "bad thing" is imagining you know every possible use of javascript, and then telling other people about how they should be running javascript (or any language).


My wife is on TikTok for at least an hour a night if not more - I've never once seen or heard anything pro-china or anything that would otherwise influence her opinion about China. Not once. It's all content from US social media whores.


Content doesn’t need to be overtly “pro China” to influence individuals toward goals ultimately desirable to the PRC. Information operations can have a variety of goals and are designed to be unnoticeable, unless otherwise intended to be visible (e.g., making it apparent to GRU cyber actors that we had access to their systems during the mid-term elections).

Edit: Here’s a link to the mid-terms operation by US Cyber Com: https://sofrep.com/news/cyber-warfare-us-cyber-command-strik...


You are focused on outputs; what about inputs?


What inputs? You mean the fact that she follows cute dog videos? I'm sure China is going to use that against her somehow /s


Yes, her data, metadata about your network, devices, etc. Just because you aren't a target based on that data doesn't mean someone isn't.


Her "data" is mostly liking cute dog videos. China can have that data for all I care. Metadata about my network, devices? If China really wants to know that I'm on Spectrum internet and have a few other devices connected, sure, go ahead. I monitor my network and haven't seen any suspicious traffic that would worry me.


They don't want your data, but your data being worthless isn't an argument. You aren't the target.

