I used to live in an apartment built in 1910 with curved window panes. While not common, they must not have been too difficult to fabricate if needed, even a century ago.
Curved glass was easier with the historical method, because flat glass was made by blowing a large bottle and flattening a piece of it against something before it set (which is why you could only get small flat panes). So to get a curve, you'd just shape it against something with the right curve.
I'm not sure how you get curved glass today. Possibly you have to start with a flat sheet and heat it until it can be bent.
That doesn’t follow. Code can contain extremely sensitive and/or valuable IP independent of the value of the code as an asset. Reduction to practice frequently fails to produce usable software.
That's why I didn't just include code; if you produced valuable design docs as part of your work, that was part of your research too. I'm generally skeptical of the societal utility of offering any protections or special treatment for trade secrets, though (the entire point of patents/copyright is to incentivize people to share these things; it's insane to also protect their secrecy), so that no doubt affects my thinking. If you want the deduction for having spent money on R&D that you didn't think was valuable, prove it by giving it up. If it's entangled in other secrets you don't want to share, you get no deduction. Seems fair to me.
That is making assumptions that aren’t based in reality. Serious software R&D stopped relying on patents and copyrights years ago because they are effectively non-enforceable in many cases.
A significant percentage of algorithm and foundational computer science R&D in software is now protected exclusively via trade secrets. There are no other practical options. This wasn’t always the case but all other forms of protection have steadily eroded over the last couple decades.
Weaponizing the tax code because you have an ideological aversion to trade secrets doesn’t seem fair to me.
It's not really "weaponizing the tax code because of an ideological aversion"; it's more:
* It makes sense to tax capital assets as such.
* If companies do R&D and think the results are valuable enough to be kept secret, then obviously they're an asset.
* Depreciation exists because real-world assets actually require ongoing maintenance or become worthless over time, but information does not.
* Finite-term IP grants (e.g. copyrights/patents) do become worthless over time, so a depreciation schedule makes sense.
* Trade secrets never expire, so it doesn't make sense to depreciate them. If they never get out, they remain an asset forever, so their development shouldn't be deductible. If they do get out, the company could release all of their (now presumably useless) information on it at that point and take the deduction for its development.
The point about finding trade secrets dubious is that it seems natural to tax them as an everlasting capital asset (since that's what they are), and I don't see why we wouldn't: society never gets the eventual benefit of that knowledge, so incentivizing it runs counter to the purpose of IP law. Why would a knowledge economy provide a tax deduction for developing knowledge we don't eventually get?
People greatly overestimate the amount of material cheating that happens, especially among large companies and the wealthy. I used to work for a Federal audit organization and almost all of the recoveries had a root cause in sloppy compliance and record-keeping practices rather than intentional malfeasance. It is broadly recognized as optimal that the recovered money should be several-fold the direct costs spent to recover it because this activity incurs a lot of non-obvious indirect costs. It is a variation on the principle that the optimum amount of fraud is non-zero.
Most of the blatant tax fraud is much lower down the economic ladder because below a certain threshold recovery doesn’t justify the cost and people know this. The amount you can get away with is far below the threshold where it would be worth the risk for wealthy parties. The best ROI for auditors in many of these cases is to make regular object lessons at random to discourage it rather than systematically prosecute it.
AFAIK, the increased spending at the IRS did not lead to concomitant offsetting recoveries. This is a predictable outcome, the amount of enforcement activity has been pretty finely tuned for decades to optimize ROI. Most of the recoveries come from changing focuses on compliance to areas that haven’t seen much enforcement activity in many years. Fighting entropy basically.
If you assume that most large recoveries are from sloppiness rather than systematic tax fraud, it changes what is going to be an effective strategy.
>AFAIK, the increased spending at the IRS did not lead to concomitant offsetting recoveries. This is a predictable outcome, the amount of enforcement activity has been pretty finely tuned for decades to optimize ROI. Most of the recoveries come from changing focuses on compliance to areas that haven’t seen much enforcement activity in many years. Fighting entropy basically.
These are studies designed to show positive results, and are susceptible to the criticism the parent identified.
IRS enforcement has diminishing returns because the IRS starts with the small minority of people who are very obviously cheating on their taxes. Those people get audited and the IRS very easily recovers money from them. If you want to audit more people than that, you have to audit people who are less likely to be cheating. The more people you want to audit, the lower the collections rate gets.
But if you're averaging in the recovery rate from the people who are so obviously cheating, you can get quite far down the road past a marginal benefit before the average becomes a negative number.
Meanwhile, even that isn't considering the indirect costs. The IRS spends $1 and recovers $2, but audits are much cheaper for the IRS than they are for taxpayers. So the IRS spends $1 and the taxpayers (many of whom did nothing wrong, because we're talking about averages here) have to pay $5, in order for the IRS to recover $2. That's quite bad -- $6 is being spent in order to recover $2, but it's being reported as a $1 net gain.
And it's worse than that, because those $6 aren't just money, it's actual spending -- human labor hours that couldn't be allocated to something else -- so what you're losing isn't the cost of that labor, it's the value of that labor. Someone was being paid $1 to create $2 in value but now instead of doing that they have to spend that time on an audit, so the $6 in cost is actually $12 in lost value.
Not accounting for things like this makes it seem like we should be spending a lot more resources on something with diminishing returns and large hidden costs.
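The accounting in the last few paragraphs can be sketched concretely (the struct and all dollar figures here are just the comment's illustrative numbers, not real data):

```cpp
#include <cassert>

// Toy model of the audit cost argument above. All inputs are the
// comment's hypothetical figures, not actual IRS statistics.
struct AuditEconomics {
    double irs_cost;        // what the IRS spends on the audit ($1)
    double taxpayer_cost;   // compliance burden on audited taxpayers ($5)
    double recovered;       // what the IRS actually collects ($2)
    double value_multiple;  // value created per $1 of labor diverted (2x)

    // What gets reported: recovery minus the IRS's own spending.
    double reported_net() const { return recovered - irs_cost; }
    // What is actually spent across everyone involved.
    double total_spent() const { return irs_cost + taxpayer_cost; }
    // Value the diverted labor would otherwise have produced.
    double lost_value() const { return total_spent() * value_multiple; }
};
```

With the comment's numbers, `AuditEconomics{1, 5, 2, 2}` reports a $1 net gain while $6 is spent and $12 of value is forgone, which is the gap the comment is pointing at.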
The criticism the parent identified has almost nothing to do with these studies. It was that there is an equilibrium point where enforcement is counterproductive, but it did not identify anything about where that is or how that point relates to where we are.
At some point it gets to that level, but all of these studies show it is extremely far from that at present. This is also not at all what the IRS has been advocating going after.
The extremely cheap (for the IRS) audits you are talking about are the ones they have been doing for years because they can afford to. The tax situations are simple, so they don't require significant resources to audit. These are also the situations the original comment was talking about. The IRS and others have been advocating for years for the resources to go after actual tax cheats among wealthy individuals and corporations, whose tax situations are (intentionally) so complex that it is a serious investment to audit. Once you do audit them, however, their tax dodging decreases for years into the future. This costs money for the employees and financial advisors dedicated to dodging taxes.
The "hidden costs" you are so concerned about here cannot, in many cases, be argued to exist. The people that would spend time defending violators are otherwise fully employed doing the opposite... coming up with ways to get around the taxes their employers or customers are supposed to be paying. Instead of costing $2, that comes out as getting yet another $2 out of that audit by distracting a societal parasite.
> The criticism the parent identified has almost nothing to do with these studies.
The criticism the parent identified is that the cost to the IRS is not the total cost to the public (i.e. innocent taxpayers being audited despite making only honest mistakes or having done nothing wrong at all), which is exactly a problem with these studies. To know where the equilibrium point is, you have to take into account these other costs, and the studies fail to do that.
> The IRS and others have been advocating for years for the resource to go after actual tax cheats of wealthy individuals and corporations, whose tax situations are (intentionally) so complex that it is a serious investment to audit.
What's really going on here is that those are the taxpayers it isn't as cost effective to audit because they have sophisticated lawyers, so they're much less likely to be violating the law. They're doing something which is complicated and then paying very little in taxes, but the complicated thing they were doing is legal so you can't get anything from auditing them. Meanwhile auditing them costs a lot because it's so complicated, so the ROI of doing it is pretty bad.
In particular it's worse than the ROI of auditing other taxpayers who can't afford such expensive lawyers and therefore are more likely to have made a mistake that allows the IRS to collect. But auditing those people makes the IRS much less sympathetic, because those people aren't the billionaires and the money the IRS collects is mostly a result of honest mistakes.
> Once you do audit them however, their tax dodging decreases for years into the future.
The assumption is that they were doing something unlawful to begin with, and then you're talking about the non-billionaires again.
Moreover, what really happens is that the people who made mistakes learn to hire tax lawyers. And then if you audit them again it comes up clean, but that doesn't mean they're paying more in taxes, because tax lawyers are pros at finding legal ways to avoid taxes, so what you've really done is encourage them to hire the people whose primary job it is to minimize tax revenue.
> The people that would spend time defending violators are otherwise fully employed doing the opposite... coming up with ways to get around the taxes their employers or customers are supposed to be paying. Instead of costing $2, that comes out as getting yet another $2 out of that audit by distracting a societal parasite.
It is definitely not the case that the number of tax lawyers and accountants employed is unrelated to the number of audits the IRS does. The more they do, the more business there is for those professions and the more people enter them. These are people who could have been doing something else and, moreover, people who consumed the resources that someone else could have used to do something better.
> Who pays taxes when it's well known that the IRS doesn't audit and follow up on tax cheats?
But they do. They always have. The question is, once they've done that, should they then proceed to audit an even larger number of mostly innocent people, because a small percentage of them did something wrong and finding that small percentage would cover the costs of the IRS, but not the costs of any of those other innocent people?
> Especially, if all it takes to further dissuade them is engineering complex wealth structures and keeping tax lawyers on retainer.
This is an entirely different problem. The ones with sophisticated lawyers aren't actually violating the tax code. The problem there is that the tax code is so complicated and poorly considered that fancy lawyers can find ways to avoid taxes without violating the law.
They already have that. The problem is, in order to find someone who is actually cheating, they have to audit a lot of innocent people, and who is covering the cost of those audits when they don't find anything?
> the amount of enforcement activity has been pretty finely tuned for decades to optimize ROI
And then cut by 20% by the current administration [0].
I'd phrase the question of auditing lower income filers vs higher income filers differently -- do you think people with higher incomes should feel safer about cheating on their taxes?
Because average recoveries do scale with income [1]; unsurprisingly, it seems wealthy people commit tax fraud too [2].
While catching the low-hanging fruit (and therefore better ROI) is one goal, it needs to be balanced with ensuring there are similar levels of compliance (or penalties where it's lacking) in higher income payers.
This comment (currently downvoted to hard-to-read grey-on-grey) is an excellent example why you should always enable "showdead" in your profile, use user CSS to make downvoted comments readable again (e.g., https://news.ycombinator.com/item?id=41514726), and browse https://news.ycombinator.com/active instead of the homepage once in a while.
The vast majority of software barely qualifies as an asset, since it has no intrinsic value. It isn’t like a tractor or a factory, which has a non-zero market-clearing price.
A one-off shell script has an asset value of zero after its single use but still counts as a long-term capital asset for tax purposes.
It affects any software developer worldwide who works for US companies. The specific tax law is even worse for foreign developers, since it requires amortization of non-American software developer expenses over 15 years instead of 5 years. How much code is written that retains its value for 15 years?
"In the United States, to help spawn innovation as part of the Economic Recovery and Tax Act of 1981, the Research & Experimentation Tax Credit was introduced. Although it was initially supposed to last three years as a specific incentive to encourage companies to invest in R&D, Congress recognized its value in helping businesses create more products and services.
However, it was quickly realized that this tax code made calculations for R&D complicated, especially for small businesses, which led the government to create other iterations of tax codes in order to help clarify the situation. However, not until 2017 and the enactment of Section 174 of the TCJA has there been such a comprehensive change to R&D accounting.
Indeed, before the TCJA’s enactment, businesses deducted the total amount of R&D expenditures as an expense in the taxable year. Beginning in 2022, all costs related to R&D must now be amortized over five years for US-based companies or 15 years for non-US companies."
I'm struggling to understand why we think R&D expenditure - including software development - should not be amortised?
People think R&D expenditure shouldn't be amortized because it hurts startups.
For example, if you're a first-year startup and you make a software product with $1 million in revenue but pay $900,000 in software dev salaries, and the tax rate is 25%, without amortization you pay (1,000,000 - 900,000) * 0.25 = $25,000 in tax and make a profit, but with 5-year amortization you pay (1,000,000 - 900,000/5) * 0.25 = $205,000 in tax and take a loss.
But since established companies aren't affected as much, they are advantaged by the amortization rule.
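To make the arithmetic concrete, here's a tiny sketch of the two cases (the function and its figures are just the hypothetical startup above, not real tax logic):

```cpp
#include <cassert>

// Toy model: only 1/amort_years of the year's R&D spend is deductible
// in year one. Not real tax law, just the example's arithmetic.
double tax_owed(double revenue, double dev_costs, int amort_years,
                double rate = 0.25) {
    double year_one_deduction = dev_costs / amort_years;
    return (revenue - year_one_deduction) * rate;
}
```

Here `tax_owed(1000000, 900000, 1)` gives the $25,000 under full expensing, while `tax_owed(1000000, 900000, 5)` gives the $205,000 under 5-year amortization: more than twice the startup's actual $100,000 cash margin.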
The irony is that the US routinely uses much more capable software for almost identical purposes in domains like battle space management. It isn’t like the US doesn’t have this software, more that the FAA doesn’t consider anything derivative of that tech as an option.
My original response was sharper than I intended; I am aware of both the air-defense and ATC systems, in both civilian and military use (so I have worked in each cell of that mini 2x2 table). The military is much more tolerant of risk, and even if implementing them would lead to an increase in safety and convenience, using military systems for civilian ATC will likely cause all sorts of problems due to differences in training, planning, etc.
The upgrades have been funded for decades. It is an execution issue, not a money issue. Many other parts of the Federal government are in the same condition: software upgrades that are infinite money sinks that never produce much after decades of effort.
I've worked around some of these programs. I've had visibility into some of them for 15 years over which there has been zero forward progress despite unreasonably large amounts of money being spent. It is no secret why those programs are permanently broken but no one wants to have that conversation.
I think most takes on this are overly reductive. The whole situation is sad really.
The root cause, to the extent that one exists, is that no one is accountable for successful execution in a very literal and systemic way. Some parts of the government I've worked in are worse than others, but it is endemic. This leads to a textbook case of Pournelle's Iron Law. There are no negative consequences for a handful of people aggressively maximizing their personal benefit and acquisition of power as their primary objective. This is how you get the fiefdom-building, feather-bedding, and the usual revolving-door corruption that these programs are notorious for.
Most people involved in these programs aren't like that but enough people are that it is impossible for people trying to do their jobs competently to get anything done. The people that defect are the people that end up controlling the programs because that is how the incentives work.
Inefficiency and corruption are a symptom, not the disease. The incentives virtually guarantee that these programs become playgrounds for sociopaths. Average workers on these programs are put in the demoralizing position of either having their good effort constantly undermined by leaders that don't care about the mission and are openly making decisions for personal benefit or to defect to the side of the sociopaths so they at least get some personal benefit out of it. Most of the best and most competent people I know eventually leave Federal service entirely.
A second-order consequence of this is that over time, no one competent wants to work on the programs that are run this way. Through churn these programs slowly fill up with mostly useless seat warmers who don't mind a job where no one expects productive outcomes. It is a kind of stealth UBI for government employees. Some people request assignment to these programs.
You never hear about the programs where the leadership is actually competent and cares about the objective because these actually function pretty well. But the incentives are such that this is the exception rather than the rule.
I'm not even sure how you would fix it, I suspect it is politically impossible. When companies become overtly like this they tend to slowly self-immolate into irrelevancy. Governments lack these negative feedback loops in any meaningful sense.
> The root cause, to the extent that one exists, is that no one is accountable for successful execution in a very literal and systemic way.
Not even the secretary of transportation? Wouldn't this have been a really great way for the previous one to show he can get things done? Or does the position lack the requisite authority?
Somewhat; the position has authority, but you need Congress to sign off on the money, and they will want to steer it to their preferred vendors. Also, when the Secretary of Transportation wants to run for higher office, he does not want some boondoggle project that looks terrible hanging over his next run.
I think the bigger issue with "public" and "private" is that it is insufficiently granular, being essentially all or nothing. The use of those APIs in various parts of the code base is not self-documenting. Hyrum's Law is undefeated.
C++ has the PassKey idiom, which allows you to whitelist which objects are allowed to access each part of the public API at compile time. This is a significant improvement but a pain to manage for complex whitelists, because the language wasn't designed with this in mind. C++26 has added language features specifically to make this idiom scale more naturally.
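For readers unfamiliar with the idiom, here's a minimal sketch (class names are made up; note the key's constructor is user-provided, `Key() {}`, so pre-C++20 aggregate initialization can't bypass it):

```cpp
#include <cassert>

class Engine;  // the one class we want on the whitelist

class Transmission {
public:
    // PassKey: shift() is public, but only code that can construct a
    // Key may call it. The Key constructor is private, and Engine is
    // the sole friend, so only Engine can mint keys.
    class Key {
        friend class Engine;
        Key() {}  // user-provided, so Key{} can't sidestep it via aggregate init
    };
    int shift(Key, int gear) { current_ = gear; return current_; }
    int gear() const { return current_; }
private:
    int current_ = 1;
};

class Engine {
public:
    int upshift(Transmission& t) {
        // Compiles only here: Engine is a friend of Transmission::Key.
        return t.shift(Transmission::Key{}, t.gear() + 1);
    }
};
```

Any other caller writing `Transmission::Key{}` fails to compile, which gives you a per-caller compile-time ACL without making `shift()` itself private.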
I'd love to see more explicit ACLs on APIs as a general programming language feature.
> I'd love to see more explicit ACLs on APIs as a general programming language feature.
On that I agree, but per-member public/private/protected is a dead end.
I'd like a high level language which explores organizing all application data in a single, globally accessible nested struct and filesystem-like access rights into 'paths' of this global struct (read-only, read-write or opaque) for specific parts of the code.
Probably a bit too radical to ever become mainstream (there's still this "global state == bad" meme; it doesn't have to be evil with proper access control), but it would radically simplify a lot of programs because you don't need to control access by passing 'secret pointers' around.
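As a very rough approximation of the idea in today's C++ (all names here are hypothetical, and const-ness only covers the read-only/read-write distinction; the "opaque" mode and true path-based grants are exactly what would need language support):

```cpp
#include <cassert>

// One global state tree; each module is handed only the sub-path it
// may touch, with const standing in for a read-only grant.
struct AudioState { int volume = 50; };
struct VideoState { int brightness = 70; };
struct AppState {
    AudioState audio;
    VideoState video;
};

AppState g_state;  // the single, globally accessible nested struct

// This module gets read-write access to the /audio sub-tree only.
void mixer(AudioState& rw) { rw.volume += 10; }

// This module gets read-only access to the /audio sub-tree only.
int status_bar(const AudioState& ro) { return ro.volume; }
```

Passing `g_state.audio` into each function scopes what it can see and mutate, but nothing stops a module from reaching into `g_state` directly; enforcing which module may take which path is the part no mainstream language expresses today.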