There is so much "Stallman was right" in all of these examples... non-free software always manages to eventually sneak in malicious anti-user features, and the user has no recourse. At least with free software there's always the fundamental freedom to fork. If you think systemd is a Red Hat plot to destroy Linux, go use Devuan. If you don't trust what Google could be doing with Chrome, take your pick of alternatives.
> Whenever someone refuses to show source code I always think, "what are you hiding in there?" There's usually something.
There's a dilemma that developers face when deciding to release source code that's bigger than fear of software theft or the desire to hide something evil in the code. It's a fear of being scrutinized, ridiculed, or humiliated over the quality of their code.
Imagine two programs that do something useful and are functionally equivalent. Program A is closed source. Program B has source code available for inspection. Suppose on inspection, program B's code turns out to be bloated, ugly, poorly organized, and full of potential bugs or defects. B's reputation is screwed. However, for all you know, A's code is just as bad or worse. You just don't know for sure.
Bloggers and reviewers will write that no source is available for A.
Bloggers and reviewers will write that program B's code sucks.
The consumer reads that "program A doesn't give you source code" and that "program B's code is garbage", and that the two are otherwise functionally equivalent. Which do you think will have greater influence on most consumers and their purchasing decision?
That's one major reason why more developers don't release source code. I wish I knew a way out of this dilemma.
I'm skeptical that this is actually as big a reason people don't want to open-source their code as you assume.
In any case, there are good solutions for it.
Firstly, if you know from the start that you're going to open source the code, then you'll make more of an effort than (perhaps) usual to ensure that the code is well-organised, well-tested and elegant. At least within your current level of competence, but that's all anyone can hope to do in any case.
Some humility is also required in my opinion. I know many people subscribe to a fake-it-until-you-make-it philosophy, and there's some value in that, but when it comes to open sourcing your code, it's good to check your ego and be open to suggestions and criticism. There will always be people who are more knowledgeable and better than you at certain things. Best, IMO, is to accept this (and their criticism, if you're lucky enough to receive it) and see how you can learn from them and improve.
BTW, I speak from some experience. I have had some nominally embarrassing experiences with OSS where other people highlighted relatively obvious security issues with code that I wrote and which I thought was of high quality. However, in turn I got free QA from knowledgeable people and in the process the code improved further as I fixed the problems.
Also, as some people have mentioned already, popular OSS gets improved all the time. I.e., if you're lucky and your code doesn't languish in obscurity, you'll get patches and pull requests to improve it.
So the way out of the dilemma, if you're really just being held back by fear of criticism and ridicule, is to ignore the fear, be humble, and open-source anyway (perhaps after doing some cleanup, but not to the extent that cleanup becomes a crutch to avoid open-sourcing).
You'll probably realize that the fear was totally unfounded, or at the least exaggerated.
> Firstly, if you know from the start that you're going to open source the code, then you'll make more of an effort than (perhaps) usual to ensure that the code is well-organised, well-tested and elegant.
This is assuming you even realize that your code is disorganized and inelegant.
There is no doubt tons of software out there written by journeyman developers who don't know best practices or good coding techniques, but just know how to "ship it".
I'm not sure if it is a major reason, but it definitely is a consideration. N=1 - I feel those thoughts about my work myself, occasionally.
However, my experience with released software is as follows:
- Open sourced code I've seen is pretty shitty. So is closed source code I've seen.
- Programmers working in open source don't care that much if otherwise good projects have shitty source code.
- Bloggers and reviewers writing for general audience don't mention source quality at all. Hell, half of them probably don't understand what source code is in the first place. And it's OK, because general audience doesn't care about internals either.
- Reviews of application source code are very rare.
The way out is to assume the worst: if no source is available, then assume they're hiding it due to it being "bloated, ugly, poorly organized, and with many potential bugs or defects".
This is a non-issue and you are projecting your own fears on others.
I work on very large projects and frankly, some of the code is utter crap. Still, if someone came along and criticized it without offering patches, they'd be the one looking stupid.
I have not been here long, but my current company has absolutely no culture of shaming code. People do some pretty egregious things, and not just in terms of adherence to style, and their work is still lauded if it runs fast.
The project with open source code can be improved by anyone.
The project with closed source code can remain terrible forever even if there are programmers willing to improve it.
Here's an example:
Looking for a WebExtension alternative to greasemonkey on Firefox. (I know a port is being worked on, but let's ignore that for now.)
Your options are tampermonkey (closed source) and violentmonkey (open source).
Tampermonkey has been around longer, and probably has more features. Which do I choose?
I choose violentmonkey, because for all I know tampermonkey's code is garbage, and full of spyware. If violentmonkey doesn't meet my requirements, I can make it meet my requirements by coding the feature myself, or paying someone to add the feature.
This "dilemma" is great. I'd rather find out my code is garbage and be able to learn and fix it than not know and continue writing garbage code.
Fear of that criticism being public is just an ego thing. If you write something people use, you will be criticized - if not for the code, then for everything else. Closed source doesn't help with that.
Some say the only way out is to kill your ego, but that's pretty difficult to do. I think a smaller step that's easier is to simply stop attaching your self-worth to your code. You wouldn't hate a learning programmer because they're bad at coding. Extend that same feeling to yourself--unless you think you know everything, you're still learning. People might not like what you make, but that's an opportunity to figure out why, so you can do things differently next time.
With lower-end software (like firmware) the reasons could also very well be legal. You may think of copyright infringement there, but I'd think more along the lines of licensing other software [either by source, or proprietary] as part of your software. Examples could be a library, or the firmware of specific hardware chips that is part of an end product's hardware. Lack of open drivers is a serious problem in embedded land; specifically, it keeps phones on old Android versions where security fixes aren't backported.
> There's a dilemma that developers face when deciding to release source code that's bigger than fear of software theft or the desire to hide something evil in the code. It's a fear of being scrutinized, ridiculed, or humiliated over the quality of their code.
But really few programs in the world are made by a single developer. Where's the fear of being scrutinized when you have a team of developers who are "supposedly" doing "code review" all the time?
> think systemd is a Red Hat plot to destroy Linux, then go use Devuan
This was the first thing that came to my mind as well, after reading the post!
> people tend to act more morally when they think they might be watched,
> Whenever someone refuses to show source code I always think, "what are you hiding in there?" There's usually something.
This goes further with all the bias-perpetuation engines that the players (size immaterial!) of our software industry are peddling around as a silver bullet. No one knows how the black boxes are built, what biases were built in (unknowingly, or worse, knowingly!), what tests are done, what data is used, etc.
This thread on Twitter, https://twitter.com/random_walker/status/901851127624458240 , when read in the context of Cory Doctorow's post, highlights the dangers looming just ahead which might go totally unnoticed due to the noise in the system: shrouded by short-term gains, with bad effects that will be visible only in the long term.
Free software wouldn't protect you in a lot of these cases though. Fuel pumps could run on free software, but since you didn't install it, it could've been easily altered by the people who did.
Hell these days, you'd have to build the hardware yourself to make sure someone didn't put something malicious in it.
But all of those systems get a lot easier to access and check by inspectors if their base is open and not just some binary blob. Of course, that applies to the hardware as well.
> Whenever someone refuses to show source code I always think, "what are you hiding in there?"
I think it's mainly just laziness: most devs don't want to be always under review, and can live easier lives if they allow themselves a bit of a mess in their own projects without constantly dealing with complaints. And many open source users are super obnoxious, bombarding devs with insane questions/requests all the time, then acting super hostile when they don't get what they want right away.
> And many open source users are super obnoxious, bombarding devs with insane questions/requests all the time, then acting super hostile when they don't get what they want right away.
As if this were any different for users of closed-source apps.
This isn't a new demon-haunted world; this is the old demon-haunted world from before nineteenth-century progressive politics, back when "milk" that wasn't half chalk still might have a fish in it (see Thoreau's famous quote on circumstantial evidence).
We aren't enforcing the laws we have and our grandfathers and grandmothers had. (Three guesses why.) Not on monopolies, contracts, patent misuse... nothing.
Just this week Hearthstone and I came to a stop - Blizzard's new policy insists on a credit card, and says that I owe them for purchases made if they leak the card number! I can't sign in to play "my" cards till I agree this is totally cool. Sure, the old policy said they could revise it as they liked, but the law says otherwise and always has. They don't care - it'll be years before the law is enforced against them, as it was with Steam and refunds.
No cops on the beat, so to speak, and Trump vowing to fire more regulators - that's what's changed. The number of potential demons is more of a constant.
“Because bread was so important, the laws governing its purity were strict and the punishment severe. A baker who cheated his customers could be fined £10 per loaf sold, or made to do a month's hard labor in prison. For a time, transportation to Australia was seriously considered for malfeasant bakers. This was a matter of real concern for bakers because every loaf of bread loses weight in baking through evaporation, so it is easy to blunder accidentally. For that reason, bakers sometimes provided a little extra - the famous baker's dozen.”
― Bill Bryson, At Home: A Short History of Private Life
"on the beat, and Trump vowing to fire more regulators, that's what's changed. "
I don't really like to defend Trump's sayings, but in this case I don't think government regulators are needed. The company acted shitty, and you took the consequences: you quit and now avoid the company. That's how the market works. No need for regulation here.
The "only" problem with this in general today is that most people do not understand technology at all. That's why big corporations can get away with the shit they are doing: there are still people using their consumer-unfriendly but shiny new products.
I doubt more government regulation would help with that.
Smarter people are required. And I do believe this is happening; it just takes some time.
> The company acted shitty and you took the consequences
Historically, "the consequences" have often included "you die horribly".
Sure, we can all avoid Hooker Chemical Company. (Well no, we can't, because it doesn't sell to consumers and we can't find out who consumer companies are supplied by.) But if we could, it wouldn't have saved anyone living in Love Canal. They got leukemia no matter what the market did afterwards.
I'm not being hyperbolic here. Markets work because they're iterated, voluntary transactions which gradually reveal asymmetric information. If the interaction is not iterated (e.g. fly-by-night companies), markets don't protect you. If the interaction is not voluntary (they poison your air), markets don't protect you. If the first asymmetric exchange is a disaster (you buy tainted produce and die), you never benefit from the gradual reveal of hidden information.
I'm generally a pretty staunch libertarian. But it's simply not true that this problem can be solved with a simple "you chose to buy this"; non-government solutions are vastly more complicated than that.
Erm, I think there is a major difference between those two examples.
If a digital company and I don't get along, I quit the contract, and it is not really anybody else's business or harm - only mine, if I feel treated unfairly. And the company's reputation, as I will share my bad experience.
But if a company is poisoning the real world, then that is clearly a crime. And I did not say that no police is needed.
Government regulations exist to fix market failures, usually due to information asymmetry. In the old days it was chalk in milk; in the modern day it's closed-source code riddled with spyware, and engine firmware that cheats emissions tests. Market failure in my book.
We need more, smarter regulations, not less.
EDIT: I don't think it's reasonable to expect everyone to educate themselves on every technical advance. That's not possible with the complexity of technology people encounter in their everyday lives. And that's without even talking about the invisible technology we never see directly, like the software controlling our voting machines, hospital equipment, power plants, etc.
" in the modern day it's closed source code riddled with spyware and engine firmware that cheats emissions tests."
Yeah, well, so the problem to me is not really missing regulations, but the missing will to use open source.
And that is what I meant: most people have no idea about technology.
To them it does not matter whether something is open source or not, as they do not know the difference - they understand neither; it is all dark magic to them.
And of course, not everybody needs to have studied IT like we did. But I also do not understand the Linux kernel - yet I trust it. Because it is open-source and I can get in touch with the people developing it and see how it is done.
So I trust them.
And ordinary people could at least understand the same: if something is developed in the open, then other people have the chance to check it. If it is closed - much harder.
Very simple. And I have no doubt that this knowledge will get into people's heads. It just takes some time (and action, of course) - computers for everyone are quite a new thing...
We've become Mexico: we cheerfully pass bushel-baskets of regulations and laws we have no intention of enforcing at all, just for show. So we end up penalizing honest businessmen, should any survive.
Koch market-based ethics (see their books) really means: ignore the law, and if the market allows you to cut any corner, cut it hard. That's where you end up - with third-world economic rules, and in time third-world results. (Starting with vicious inequality.)
In the past this was how I thought, and to a large degree still do - I think smarter people are the core. But I do see ways in which regulation can create real positive change. Specifically on the issue of, say, how companies are allowed to market. People might be smarter if they were less brainwashed.
Hm, I see it more the other way: people are not so smart because of regulation. Like, if the state allows this to be sold, it can't be fraud - so they don't bother to check further.
And regulations about brainwashing... I am not sure how that could really work.
As an exercise, list everything you have bought in the last month or so; food, gas, electronics, whatever. Now list all the ways those products could be adulterated, fraudulently modified, or faked. Now list the specialized knowledge and skills you would need to detect the fraud.
Now try to get some work done while you are caveat emptoring.
> Blizzard's new policy insists on a credit card, and says that I owe them for purchases made if they leak the card number!
This can’t hurt you if you use a credit and not a debit card. Any false charge on a credit card is not your fault, and it’s not your money. The CC company will pay for it.
It depends on how you define "hurt"; while the CC company will take care of it, it's not instant - and it's no fun dealing with someone going over your credit limit on a cross-state shopping spree.
By all means, please - don't use debit cards online - but it's still not a panacea.
Also, sometimes charges are reversed via chargeback requests, and I have beaten chargeback requests because the customer claimed the CC was stolen, but I had the correct billing info.
Can't you turn to your bank in the case where unapproved payments are made with your leaked card? I mean, that's an unpleasant clause for sure, but I don't see its implications as gigantic as you seem to.
"... farmers have worked on their own equipment "for decades, generations even." Brasch also pointed to the emerging DIY sources of information in the world as a way that farmers and others who want to make repairs can learn about their equipment: "You can go to a YouTube for something as simple as baking a cake to repairing or operating an item. I think that's the way the market is moving. We'd like this market to move with the rest of the world."
This is one of the IP/copyright issues being negotiated in the new version of NAFTA (US, Canada, Mexico), as many farmers are affected.
The sad part of the "right to repair" laws is that they're going to force Apple to stop doing the "Activation Lock" feature that has been a real boon to everyone who doesn't steal consumer electronic devices.
Why? Being able to repair a device doesn't mean being able to circumvent anti-theft features; mechanics would be morally obligated to report a device brought in for repairs with such a lock activated.
Doctorow mentions the cases where a printer company has made their software lie about how much ink was left in a cartridge to make consumers replace them more often. I always wondered why a manufacturer would want to do that. I mean, I understand the motive of making consumers buy ink more often, but from the manufacturer's point of view, why would they want to throw perfectly good ink away? Colour ink isn't as expensive to manufacture as they like to claim but it's still worth something. Why didn't they just put less ink in the cartridge to begin with (and maybe lie about how much was in it), instead of lying about how much was left toward the end of its life and throwing ink away?
I know of a large printer company that approached a large media company with a deal that would have had the media company change the color of the text on their printer-friendly pages to something near black, but requiring the color cartridge instead of true black. As I recall they were willing to pay seven figures annually. The media company did not take them up on their offer.
The underhandedness/cleverness of the printer companies is not to be underestimated.
I might be missing something here... why would the media company even consider it? Was it because the printer company charged the same price for either ink but had better margins on the color version they wanted them to switch to?
From an engineering point of view, it's much cheaper to use print history to calculate the point by which only 0.01% of cartridges will have run out of ink than to accurately measure the ink inside the cartridge.
Of course that means that 99.99% of cartridges will be discarded with remaining ink, some of them with a rather significant quantity of ink. And it gives the opportunity to tune this number so cartridges are bought earlier. Although it's up to the manufacturer just how much ink is included, so the usefulness of this tactic is somewhat limited.
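Roughly, in code - a minimal sketch of the counter approach, with all numbers invented for illustration (real firmware would tune them from print-history data across the cartridge population):

```python
CARTRIDGE_CAPACITY_ML = 20.0
ML_PER_PAGE_P9999 = 0.011  # assumed 99.99th-percentile ink use per page

class InkCounter:
    """Estimates remaining ink from a page counter instead of a sensor."""
    def __init__(self):
        self.pages_printed = 0

    def record_page(self):
        self.pages_printed += 1

    def estimated_remaining_ml(self):
        # Worst-case accounting: bill every page at the 99.99th-percentile
        # ink use, so only ~0.01% of cartridges ever run truly dry.
        return CARTRIDGE_CAPACITY_ML - self.pages_printed * ML_PER_PAGE_P9999

    def is_empty(self):
        return self.estimated_remaining_ml() <= 0

counter = InkCounter()
while not counter.is_empty():
    counter.record_page()
print(f"Declared empty after {counter.pages_printed} pages; "
      f"a median user would still have ink left.")
```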
I remember what it was like when you'd print a 50-page document, only to discover around page 24 you'd start seeing white streaks... and more streaks, and by page 50 you'd have nearly blank pages. Having the printer stop and let you put in a new cartridge when ink is low is a helpful feature.
However, the lockout of third party or even refilled cartridges is just egregious.
If I were designing my own printer and wanted to both fully utilize ink cartridges (maximize efficiency) and also be able to print absolutely perfect pages when necessary, it'd be a simple solution: two cartridges. One that is potentially "low" and is used for printing low quality stuff, and one that is guaranteed to be full and is used for printing high quality stuff. Presumably, the "low" cartridge would run out first, and then you just move the existing "high" slot one down and put a new cartridge in the "high" slot. If instead the "high" one ran low first, then replace it with a new one and queue the old one to be the next "low" one when the existing "low" one ran out.
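A toy sketch of that rotation policy in Python (the class, names, and queue are purely illustrative, not anyone's actual design):

```python
class TwoSlotPrinter:
    """Two-slot rotation: 'low' may be depleted and prints drafts;
    'high' is guaranteed fresh and prints quality jobs."""
    def __init__(self, first, second):
        self.low = first
        self.high = second
        self.next_low = None  # partially-used cartridge waiting its turn

    def low_ran_out(self, fresh=None):
        # Expected case: the queued partial cartridge (if any) takes over;
        # otherwise the current high moves down and `fresh` fills high.
        if self.next_low is not None:
            self.low, self.next_low = self.next_low, None
        else:
            self.low, self.high = self.high, fresh

    def high_ran_low(self, fresh):
        # High must stay guaranteed-full: swap in a fresh cartridge and
        # queue the partially-used one to become the next low.
        self.next_low, self.high = self.high, fresh
```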
> One that is potentially "low" and is used for printing low quality stuff, and one that is guaranteed to be full and is used for printing high quality stuff.
It's a good idea, but you do realize that realistically, people would just swap the two out, and in a few weeks or months would just have two near-empty cartridges?
I used to have a truck with two gas tanks, and a switch that would let me toggle which one I used. I would usually be pretty diligent at refilling every time I had to switch over, but there were several occasions when I found myself with both tanks dry.
Close, just have an internal tank that contains ink for printing that is refilled by attaching cartridges. When the cartridge that refills the tank is empty, it's time to swap it.
Delete the cartridge, fill in ink directly, and voilà, we are talking about a high-throughput ink printer. Those don't have cartridges (or at least didn't use to); they draw ink from tanks and wet an ink sponge with it.
It's probably to avoid/mitigate lawsuits. It's possible this would still be covered under false advertising, but there's something to be said about feasibility: you can test a lot of unopened packages to show the ink level is low, versus having to run all those packages down within the parameters of the device (so the testers aren't seen as the cheaters, over-using the cartridges to make the problem seem worse than it would be for "regular folk"). They'd be dead to rights on under-filling, but they have a better shot (and a better potential settlement) with the under-reporting.
There's a lot a wiggle room in saying that your ink is almost out. The simple defense is that they don't want people to have wasted prints, which might happen if the ink for one color actually runs out, so they notify a little earlier. That "little earlier" is vague and relative, giving a lot of defensible ground for a lawyer to work with if it comes to that. Stating you are selling a certain amount of something and not actually delivering that amount is illegal, and fairly cut and dry at that. It's not hard to see why one is more likely to be attempted than the other for a company.
Right. Putting a different amount of ink in the package than is indicated on the label would be out-and-out fraud, and easily detected. This route operates firmly within a legal grey area.
Plus, increasing the actual volume of product you sell without relying on a commensurate increase in demand is basically printing (heh) money as far as corporate returns are concerned.
If 95% of cartridges with 20ml of ink will only allow 18ml to be consumed, is the 20ml an honest figure?
Related to the above, and possibly specifically because of the above, I've noticed that HP doesn't document ink quantity in cartridges, only approximate cartridge yield in pages.
Oh, and the other thing that drives me batty. Their XL high yield cartridges are sold with "Get up to 2x the pages", but from page count it's clear that it contains 1.5x the amount of ink. So ya, you could get 2x the pages, but only if your pages use less ink. There's a disclaimer, but the link is broken.
This is one of those ideas that seems obvious at first glance ... and then problematic with more thought. Have you ever gotten 100% of the peanut butter out of a jar? I haven't. Is that fraud? There's a distinction to be made between jars of peanut butter and ink cartridges, but it wouldn't be easy to draw that line. (For an even better example, see Merchant of Venice, Act 4, Scene 1 ...)
> Have you ever gotten 100% of the peanut butter out of a jar?
Peanut butter? No. I've gotten really close to 100% out of most containers of liquid though.
The exact location of the line between them might be hard to pin down, but when comparing things as dissimilar as that, it's a great big fat line so the overlap of where you put it and where it should be is probably pretty large, making it easy to place with some confidence.
Having the device lie based on the age of the cartridge makes for a more predictable sales volume in the future. See also, inkjets that "clean" the heads by running ink through them on every power cycle.
Sensors are expensive. Counters are cheap. Put in ink by weight. Assume pages at 60th percentile use of each color. Set counter. At zero, whine for more ink.
Whining would be fine. The evil part is completely disabling a cartridge because a color you aren't using is low, or because the cartridge is arbitrarily deemed too old.
Counters are cheap; they /could/ count the actual requested ink coverage (in actual volume of ink) across the lifetime of a cartridge.
That would at least give a better estimate, and you'd only need to update the batched counter a couple of minutes after a print job (just in case there's another one).
That would allow for a great reduction in programmable-memory write cycles. Careful programming would also allow the use of a ring buffer that 'loops' through a page, erasing it only when it fills, to vastly reduce the write count per cell.
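Sketched out (simulated in Python; the slot count is invented, and real firmware would do this in C against the flash controller):

```python
SLOTS_PER_PAGE = 64  # assumed: page size / counter-record size

class RingCounter:
    """Append each counter update to the next free slot in a page; erase
    the page only when it wraps. Per-cell wear drops ~SLOTS_PER_PAGE-fold
    versus rewriting a single cell every time."""
    def __init__(self):
        self.page = [None] * SLOTS_PER_PAGE  # simulated EEPROM page
        self.next_slot = 0
        self.erase_count = 0                 # erases are what wear cells out

    def write(self, value):
        if self.next_slot == SLOTS_PER_PAGE:
            self.page = [None] * SLOTS_PER_PAGE  # erase the whole page
            self.erase_count += 1
            self.next_slot = 0
        self.page[self.next_slot] = value
        self.next_slot += 1

    def read(self):
        written = [v for v in self.page if v is not None]
        return written[-1] if written else None  # last write wins

c = RingCounter()
for n in range(10_000):
    c.write(n)
print(c.read(), c.erase_count)  # 9999, 156: 10k updates, only 156 erases
```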
Sorry to hijack the thread, but what is considered a good printer brand nowadays? My requirements:
- Someone who doesn't treat their customers as criminals (e.g. no DRM-in-cartridges games, please)
- Works on Linux/MacOS/Windows.
In particular, I'm looking for a printer for my home. I don't print very much, so an inkjet is probably out if the cartridges still have a habit of drying up in a few months.
It's going to be a compromise. This stuff changes all the time, so I can't name any specific brands, but some general advice:
- If the printer is cheap, chances are they make money off cartridges. Expect to pay at least a thousand, but try to get something that will last for that.
- "Pro" lines are less likely to have issues like this. Separately, ditto laser printers. Get a laser printer, color if you must, but greyscale means simpler mechanics, lower price and better reliability.
- Laser printers use toner (a powder), and won't dry out. Uh, just be careful when refilling.
- Laser printers suck for photo printing, and quality in general is meh. This is something you will have to live with, as an inkjet printer that goes mostly unused will break rather quickly.
- Some brands (at least used to) have refilling the cartridges as an explicit feature. I think that was Brother? No clue if that's still the case.
- Pretty much everything works on all OSs these days, but your experience will generally be better on Linux. Yes, this is quite a change. That said, you might like a network printer better, especially one that can connect to Google's cloud print system. This will be listed on the feature set.
As a consumer I don't care how much ink is left or wasted. All that matters is the price per page. You can find those prices in any good printer review.
This will not work if the thing lies based on the age of the cartridge, as that makes the price per page much more expensive for people who use the printer less.
In fact, your attitude will probably systematically get you ripped off if you are an ordinary consumer, since the price-per-page calculation in the review probably assumes the cartridge gets fully used, and the test was probably under a highish workload.
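Toy numbers (entirely made up) showing how an age-based cutoff skews the real price per page against light users:

```python
CARTRIDGE_PRICE = 30.0   # dollars (assumed)
PAGES_OF_INK = 500       # pages' worth of ink actually in the cartridge
CUTOFF_MONTHS = 12       # printer refuses the cartridge after this age

for pages_per_month in (10, 40, 100):
    months_to_empty = PAGES_OF_INK / pages_per_month
    usable = pages_per_month * min(months_to_empty, CUTOFF_MONTHS)
    print(f"{pages_per_month:>3} pages/month -> "
          f"${CARTRIDGE_PRICE / usable:.3f}/page")
# The 10-pages/month user pays roughly 4x per page what a heavy user pays.
```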
Right, and it has been. It's hard enough to find that most buyers of printers won't realize it until after they buy the printer. The normal workaround was to switch to 3rd-party ink, which is now disabled.
HP is clearly abusing their position, and printer ink is one of the most expensive liquids you can buy - by volume, more than exotic $10k-a-bottle wine or perfume.
If the company simply puts less ink in the cartridge, there will be a difference in durability versus third-party ones. By simply cutting off the cartridge at a certain point, they guarantee that their cartridges last as long as third parties' - and neither lasts long.
Lying about the amount of ink is pretty easy to prove, so they would be in trouble quickly. Misrepresenting the ink level is much more diffuse. The strategy is to lie in a way that won't get you in court.
More and more I see gas pumps ask if you want a receipt BEFORE the gas is dispensed. This seems risky.
If you decline the receipt and then dispense gas, the pump could cheat on the amount of gas dispensed with less risk, as a paper record of the purchase amount and price is not produced.
If on the other hand, the pump waits to ask if you desire a receipt until after the gasoline is dispensed, the dispenser will not know if a written record will be requested, and cheating the customer is riskier.
Therefore, I always request a receipt if asked prior to dispensing my gasoline.
Maybe related, we had an event in my small town where a customer pressed diesel, but got filled with regular unleaded. His receipt helped him when fighting for some damages.
Without a receipt there are two things that can be a problem: 1) Less gas is pumped, 2) Charged more than the pump said.
True, the receipt protects against 2) more than 1), because the car's tank does not have an accurate gauge.
I remember listening to a story once about fraud at the gas pump, in which the correct amount of gas was only dispensed when the request was an even increment of five gallons - which was how much the regulators requested when they came and tested a gas station's pumps. They were caught after a suspicious regulator requested an odd amount of gasoline and got the wrong amount.
A combination of receipts and camera/ML classification of when a customer is filling a gas can versus filling their car would allow evasion of weights-and-measures regulators. This is a problem I foresee.
Gas is common enough and such a hot button issue that mass fraud like that would be easily noticed on credit card statements and in the tank. It's easy to test, confirm and verify, and can be done by just about anyone. Pull up to the station, fill up your gas canister, and check the price, volume etc.
The places where this fraud is most likely to exist are niche markets and places where it's hard to verify. Walled gardens of software or hardware with few outputs to measure.
The hard part with that is, see the sticker on the outside of the pump? That certifies that this pump meets your state's regulations and that someone has inspected it recently. In order to have a coverup like that, you'd need the government to be in on it too.
Wasn't the danger of encountering devices (somehow) programmed to act differently in a regulator's test regime than in daily use one of the points made in the article? (Actually achieving this in the gas pump situation is left as an exercise to the reader.)
I don't know about that - all that proves is that the pump hasn't been opened since inspected. While I suspect even that isn't foolproof, there's nothing stopping someone from modifying the pump through external means, whether through a remote firmware update or mechanical means, like a magnet placed on the outside of the pump that engages some hidden mechanism.
What sort of weird gas station requires you to specify the amount of gas before actually refueling your car? Is that a USA-only thing? Never seen that in Europe.
I don't think OP was saying that. The pump asks Y/N if you want a receipt before you pump, but prints it after the pumping.
That said, you do specify the amount of gas if you're paying cash here in the US. You go inside, say "here's $20", and the pump will dispense exactly that.
If you're paying with cash you usually have to prepay. If you're paying with card the station does a preauth on your card (and usually a temporary hold) then does the final authorization after you finish pumping and it knows the total. Sometimes you have to pump twice if you want to go over the preauth hold amount, which is usually $50.
It's worth noting that it wasn't that way the entire time I was growing up. You would pump first then pay the cashier. It wasn't until around 2005ish that I noticed signs instructing cash customers to prepay.
It's not required, but, in most self-serve stations I'm familiar with (Costco being the exception), you can prepay with cash (e.g. ask the attendant for $20 on pump 5, or whatever), and the pump can be set to stop after $20 worth is dispensed. If less than $20 in gas fills your tank, you can get change from the attendant. This is commonly done late at night or for pumps not visible from the attendant's vantage point, to prevent drive-offs.
I've seen it in Norway, Italy, and somewhere else before. It's not terribly common—or it's usually optional and most people don't realise you can do it.
Also in Italy, if you pay cash at the pump you have to specify the exact amount. And woe to you if you overestimated how much you'll need - there's no change.
This is common in Iceland as most pumps let you pay with card at the pump so you pre-auth from a number of options and get charged the actual amount you pump.
I think they are using the receipt prompt to hide how long credit card authentication is taking.
They can't easily cheat on prices as usually there are large signs displaying the current price. They can't easily be cheating on amount dispensed, as the pumps are regularly checked by the local Weights and Measures Department (doesn't rule out a VW like cheat though).
So, your gas pump has to be accurate for the first five gallons, but then you can cheat as much as you want.
The only saving grace is that the people who make the pumps and software wouldn't be the main beneficiaries of cheating at the pump. So we're probably actually okay for gas pumps. The danger is situations where the software creator also benefits from the sales made using that software...
But maybe I'm wrong. Maybe independent gas station operators are buying bootleg software ROMs to flash their pumps... "Boost revenue by 10% with this one easy trick"
As much as I am confident many pump operators out there would do anything to make a quick buck at the customer's expense, all it takes is one time metering more gas into your tank than what your tank is able to hold, after which you know they're lying and regulators will get involved. Even if not, word of that kind of thing spreads and it won't be long before that filling station's competition starts stealing its business through word-of-mouth.
I see the misunderstanding. You're thinking that the pumps should be designed to always give the receipt afterward. Yes, that would be the ideal way to reduce the possibility of cheating.
However, in the pumps he's talking about, the machine prompts you beforehand: "Click here for a receipt". The receipt decision (yes or no) is always before. There is no way to ask for a receipt afterwards.
He always asks for a receipt (even if he doesn't need the receipt) because he figures that the machine is less likely to cheat him in that case.
Form a company that explores new markets in legal liabilities. It could bring lawsuits with little risk where the payoff could be billions of dollars. Off the top of my head:
* Research whether channels were engineered into smartphones to allow water to leak in (since they have no moving parts and should self-evidently be watertight).
* Find the planned-obsolescence parts in things like car doors that were engineered too thin or out of plastic so that door and window handles fail after a certain number of uses.
* Find evidence that companies opted to use proprietary battery and charger form factors which drove up prices and prevented interoperability.
...the list is nearly endless. Most of these seem like they depend on research or whistleblowers. If the free market and regulations won't prevent this kind of widespread hacking then maybe lucrative opportunities could be found working within the courts!
That's not remotely how waterproofing works. Getting any meaningful degree of ingress protection is hard. It requires entirely different assembly techniques and is generally contrary to user serviceability.
Proprietary batteries are the reason thin electronics exist. You can't use 18650s to make a MacBook.
I think the problem here is that you vastly underestimate how hard manufacturing is. The things you're proposing are like me trying to sue Facebook because it crashes all the time. Is it annoying? Yes. Is it because they're actively trying to subvert me for nefarious reasons? No, it's just because they don't know how to do it better in a reasonable price range.
No one thinks it's free of any effort. But $10 wristwatches claim 100m waterproofing (whatever 100m means in watch land). All I want is for my phone's screen not to mess up if my buddy pushes me into the pool. 2 ft of water for 10 seconds. That's it.
Please, these things are not mechanical watches, which have been able to accompany divers for literally centuries.
Speakers are trivially sealed, as are buttons. They are just a fancy electromagnet, or piezoelectric crystal.
Also, both can work through induction, potentially keeping them outside a "pressure chamber" and forgoing sealing altogether. Though I just want the steam of my shower not to mess up my phone, not to dive 100 m with it.
Yes. Connectors like USB and headphones consist of metal "fingers" in a plastic insulating shell, perhaps with an outer metal shell. The contacts are sprung and need to move. So the simplest, most natural implementation has gaps around the contacts. There's also usually a thin gap between the connector on the PCB and the outer casing of the phone. If you want it sealed you have to add gaskets everywhere during assembly.
Not to mention sockets for SIM and SD cards. If you can take the back off the whole phone to change the battery, that makes it a lot harder.
Many ports, and notably headphone jacks, have springy pieces of metal inside to clamp onto the plug. This is what causes scratchy audio when you wiggle the plug in old devices, the clamping force has weakened.
> Find the planned-obsolescence parts in things like car doors that were engineered too thin or out of plastic so that door and window handles fail after a certain number of uses.
There’s no way in hell car companies are purposely under-engineering things to cause them to break. That damages the brand reputation, and will drive the next sale to a competitor.
Yes, they make engineering tradeoffs. No, they don’t design a part with the specific intention that it will fail earlier than a comparable (cost-wise) design would.
There are a lot of examples like this. That timing gear in your linked article is one example. The rear emblem (and trunk key cover) on a 90’s Nissan Altima is another one. They have all fallen off by now. Crappy pot metal?
Either way, it wasn’t intentionally designed to do that. There was no engineer that said, “Hey, let’s juice our parts revenue in 5 years by making this hinge really weak.”
EDIT: Thanks for that article, though. I could see myself getting suckered into a good deal on a Benz. Now I know why the price is so low.
Watertight is a hard engineering problem to solve. If the device isn't a vacuum or loaded with some other gas, then you will get water coming in, because temperature cycles cause the device to act like a pump - and in a phone that's a problem, since you will have temperature fluctuations caused by charging, usage, and environment.
Reminds me of an idea from the novel "Makers", by the very same Cory Doctorow, in which there's a mass VC-funded lawsuit against a particular known media megacorp.
Devil's advocate here... I would have hoped the opportunities were so lucrative that whistleblowers would have already, eh, blown the whistle. That's exactly the point of the whistleblower's protection laws.
So maybe these things aren't as widespread as you'd think... at least not in a legal sense. I'd also like to think that maybe people aren't "designing" flaws into things rather than finding the maximum economic benefit, or just are "naive" to what they design.
> Form a company that explores new markets in legal liabilities. It could bring lawsuits with little risk where the payoff could be billions of dollars.
Sort of like a patent troll, but for good? I do like it.
I think there's always a danger of having this type of litigation be abused (in this case I'm also thinking of drive-by litigation around the Disabilities Act etc.)
edit aww spitfire beat me to the patent troll punch.
Why do our phones, which certainly felt damn snappy the day we bought them, inevitably seem to slow down to the point of unusability after a couple dozen months? Even after a factory reset and installing no apps at all... I know it didn't take that long to open/close the built-in apps when I bought that iPhone 4, 5, 5s...
The only thing I can think of is the flash drive is slowing down as it wears. Or, the CPU clock rate is programmed to progressively lower itself the longer it runs.
Has anyone done the performance analysis on used phones to prove this isn't just my brain moving the goalposts as hardware improves, or apps just slowing down as they bloat, but that the old devices really and truly are running the same software significantly slower than when they were new?
Next time I get a new phone, I'm going to put together a performance testing plan and record a video of it on the new phone. Then any time I wonder about this, I can run through the plan again and compare with the recording. So, ask me again in a few years.
Please publish/open source the testing criteria, so others can do the test and submit their own results. That would be awesome. I would probably run around the house and test all the phones in my family.
Even something as simple as GitHub, using pull requests for those that know how (or manually adding results for those that email and don't), would give a lot of introspection, and would allow you to share commit bits so people could help (as well as making it easy for people to clone and run analysis on, if they desire).
Thanks for the idea. I just created a list of built-in apps, set them to point to some defaults that do not depend on network performance (for instance, Calendar only includes Holidays, and displays year view), rebooted, and measured the time to start them in succession. Apps that depend on the internet are excluded.
While this is not super accurate because OS and app updates could affect their startup time, it is still something measurable. It will be interesting to see if a year from now a set of same apps with basically same functionality will take more time to start on the same (or newer) hardware.
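For what it's worth, here's a minimal sketch of the logging/comparison step (measurements hand-timed from the video; the file name and JSON layout are my own invention):

```python
import json, statistics, sys
from pathlib import Path

LOG = Path("startup_times.json")  # {"Calendar": [1.8, 1.9, ...], ...}

def load():
    return json.loads(LOG.read_text()) if LOG.exists() else {}

def record(app, seconds):
    data = load()
    data.setdefault(app, []).append(seconds)
    LOG.write_text(json.dumps(data, indent=2))

def report():
    for app, times in load().items():
        baseline = times[0]                     # first-ever measurement
        recent = statistics.median(times[-3:])  # median of the last few runs
        print(f"{app}: {baseline:.2f}s -> {recent:.2f}s "
              f"({(recent / baseline - 1) * 100:+.0f}%)")

if __name__ == "__main__":
    if len(sys.argv) == 4 and sys.argv[1] == "record":
        record(sys.argv[2], float(sys.argv[3]))  # bench.py record Calendar 1.8
    else:
        report()                                 # bench.py report
```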
I think it's OK for OS updates to affect it. Ideally you'd measure both, but measuring it with updates is the more useful of the two. The fact that security fixes don't get backported very much essentially makes updates mandatory, and as a user it doesn't really matter whether slowdowns come from hardware or software. The main thing is to confirm whether the devices really do get slower, or whether it's all a matter of rose-colored nostalgia glasses.
I don't get this; everyone says it, but I've never seen it in smartphones. Laptops, sure, but Windows really DOES accumulate a lot of cruft, and OSX's updates never seem to improve performance.
Smartphones, though? The only slowdown I've noticed is when the battery starts to go. And then you can usually tell; the device gets hot as the battery pushes up against all the ESR it's accumulated, the CPU steps back a bit...that's what I figure, anyways. For me, it's always coincided with the same inflection point when I start really noticing things like charge cycles taking much longer and a sharp drop-off in the life of a 100% charge.
I've experienced slow-downs with both iOS and Android devices. They almost always corresponded with newer versions of operating system and all the extra functionality that came with the upgrade.
The apps are also evolving. Compare Facebook today to what it was two or three years ago. The app does a ton more and all that extra functionality doesn't come for free. I suspect one of the primary reasons they broke messenger out into a separate app is that the Facebook app was simply too damn big.
There have been instances where an upgrade has made my experience better, but the trend has definitely been in the other direction (for me anyway).
I also have another concrete example: the original Nexus 7 had an SSD that didn't support trim. Over time it became nearly unusable as the SSD became fragmented. It was also severely bandwidth constrained. The device was decent until the first major Android revision came out. At that point, the new version of Android so significantly increased the bandwidth requirements on the SSD that it effectively strangled the device.
What was an awesome device the day I got it became absolutely unusable. Eventually I installed a version of CyanogenMod and formatted it with an SSD friendly file system which made it passable again but, still sucked compared to the day I bought it.
I still have the original Nexus 7 and it's unbelievable how unusable it has become. I use a slim ROM of KitKat (the Android version it came with) and it's still so freaking sluggish. I only use it for watching videos now, it's unbearable for anything interactive.
It used to be quite nice with the slim ROMs, but at this point nothing has helped to get it back to running like it used to. 4 years of light use (99% of the time sitting idle).
I've had great success with ParrotMod[1] on one of my devices. But the dev doesn't have the device any longer, and the mod doesn't seem to work on Nougat.
This Android 7 Nougat build[2] should have many of the ParrotMod optimizations out of the box, however. It works great on my other Nexus 7.
Just a note on Messenger - they actually separated the two purposely. They know Facebook is 'dying' with the new generation, and are pushing Messenger more. To that end, they purposely separated the two, so as to seem like two separate entities. They don't want Facebook's old legacy to bring down the new flashy Messenger.
I've felt it on my Samsung Galaxy S4, and on my SO's Samsung Galaxy S3. Both of which are high-quality devices, not some random el cheapo smartphones. Over the course of two to three years they both went from smooth and crisp, to annoyingly laggy, to (in the case of the S3) pretty much unusable even for making phone calls.
Now in this story I'm just an alchemist with no experimental controls, but then again, the experience is not isolated. I'm not saying it's purposeful software slowdowns - maybe it's general bloat + flash memory wearing out? I don't know. But seeing as others report similar cases too, something is going on.
I've had a succession of Android phones made by Samsung. Each of them has become nontrivially slower over time, seemingly as a product of OS updates. I've even timed startup before and after installing updates as a crude check; startup at least gets measurably slower every time I update the damn thing.
Obviously regular updating is important for security, but I sort of suspect users would be more open to updates if they weren't consistently harmful and occasionally catastrophic.
I've experienced and noticed a different problem. I've had the same phone for almost 5 years (I have a hard time replacing something that still works well and is in good condition), and I've found it's not the speed that has worsened - I can live with not having certain features needed to run some new apps - it's the storage.
My phone's storage is not large by today's standards, I have 16 GB. I run fewer apps than I did when I got the phone, and have run the same general collection from day one. I have less music on my phone than I had in the past. I have only had to continually remove applications and music as just about every week it has warned me that my storage is almost full. This only started within the last year. I won't have added anything new, keep my cache flushed, etc and I still have to continually remove items.
Is there something practical I'm missing here, or... ?
Apps are getting larger due to more and more code accumulating as time goes on, and Apple's toolkit/compilers causing binaries to become larger.
Swift itself creates incredibly bloated binaries compared to Objective-C. More people are using more 3rd-party libraries than 5 years ago, when you mostly just used Apple's built-in libraries. If you just used the shared system libraries, that didn't count against your binary size.
I figured that aging of the tech vs package size would be an issue, including the growing size of the OS... but do you think this is a problem even in long-life applications? I mean, surely AAC and MP3 files haven't bloated so significantly in the past 5 - 10 years.
I haven't worked much yet in Swift and wasn't aware of those differences vs Objective-C, though. That's interesting.
MP3 hasn't changed as a standard or probably as a codebase for decades. Your apps are changing every week, and apps tend to grow in size as more and more code & features are added to them. It's a natural consequence of most software projects.
The binary size of swift projects doesn't catch most people, only large projects with > 400k lines of code as they go over the cellular download limit. It's the new hotness, so many projects are adding swift even if the app is old.
Adding swift to a project adds ~20MB alone for the standard libraries that get included with each app, and each line of swift creates that much more binary bloat. The autogenerated interface between swift and objective C creates even more bloat.
That's an interesting problem. Do you find it's that bad for a new project as well? I've otherwise heard nothing but good things about working with Swift, and it's the only macOS/iOS-specific language I've actually worked with a little bit for its rather quick learning curve.
So I guess the problem is pretty simple, even if the details of it aren't immediately obvious.
I sigh and guess that means I just have to upgrade soon...
The swift scalability problems start coming up around 70k lines of code, so for the typical case of 1 or 2 developers it doesn't come up. But if you start getting a large codebase:
* indexing will take forever
* xcode will beachball continuously
* the debugger will crash because the codebase is too much to handle
* build times will be 4-5x more than the equivalent obj-c codebase
* startup times will balloon significantly if you have many separate libraries, since using swift forces you to use dylibs. 60-100 libraries will mean 3-5 seconds being added to your app startup time
They are all solvable problems, but it will take several years before they are fixed. But I don't think swift will be better than C++ development, since they are both similar languages as far as compiler tradeoffs go.
I notice some lag even in small swift projects compared to objective-c, but it isn't major enough to worry about.
This happened to me too on an Android phone. According to the web, it turned out to be some logfile that keeps growing, and which I don't know how to delete - it doesn't show up in the system-provided views of what's on there. (I'm sorry this is so vague, I'm not interested in Android and it's been quite a while since I gave up.)
It's still running because I deleted all the apps except a few core ones, but I'd appreciate some tips about this too.
How about this simple hypothesis: Android is suffering from the same problem all Windows versions had prior to Windows 7 - drive fragmentation. Cheap flash will not give you nice access times when data is fragmented, and over time fragmentation accumulates.
Why wouldn't flash memory give good access times with fragmented data? It's not like it needs to move a head around like a hard disk. Shouldn't one block be as accessible as any other block?
Depends on what you mean by "block". Your OS is probably interacting with the SSD in terms of 512-byte logical blocks, but most likely grouping most I/O into 4kB or larger chunks due to the 4kB virtual memory page size. But the underlying NAND flash has a native page size of more like 16kB, and an erase block size in the MB range, and an internal RAID-like striping of data across multiple channels on the SSD controller. Sequential access is still much simpler and faster than random access.
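You can feel the difference with a rough sketch like this, run against an ordinary file on flash storage (sizes arbitrary; the passes will partly hit the OS page cache, so use a file larger than RAM for honest numbers):

```python
import os, random, time

PATH, CHUNK, COUNT = "testfile.bin", 16 * 1024, 2048  # 16 kB reads, 32 MB file

with open(PATH, "wb") as f:
    f.write(os.urandom(CHUNK * COUNT))  # create the test file

def bench(offsets):
    with open(PATH, "rb", buffering=0) as f:  # unbuffered raw reads
        start = time.perf_counter()
        for off in offsets:
            f.seek(off)
            f.read(CHUNK)
        return time.perf_counter() - start

seq = [i * CHUNK for i in range(COUNT)]
rnd = random.sample(seq, COUNT)  # the same reads, in shuffled order

print(f"sequential: {bench(seq):.3f}s   random: {bench(rnd):.3f}s")
os.remove(PATH)
```

On a fragmented filesystem, the shuffled pattern is what even "sequential" file reads degrade into.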
Could be the drive slowing, but definitely a big chunk of the performance loss comes from OS updates (i.e., updates that were not tested on the now-old device for performance and battery discharge rate as aggressively as the initial release was).
Are you comparing identical OS versions and an identical set of installed apps? Newer versions often make everything slower. I remember when the iOS 6 -> iOS 7 update happened: the exact same code run on the same device with different OS versions ran 20% slower.
My iPhone 6 used to be snappy, but now takes almost 10 seconds to just open an app. Perhaps new OS updates are not as well-optimized for older iPhones as they are for the latest version.
Ars Technica's Android 8.0 deep-dive has some interesting charts showing Android device performance deteriorating over time:
I'm not sure that the new updates are doing much to harm older phones. My iPhone 5 is still running as quickly as it was when I got it. In fact, the latest updates seem to have made things somewhat faster.
Unfortunately I'm now at end of life as far as iOS updates go. I don't think I will bother with updating the hardware for some time yet.
If you're on a Mac, try Coconut Battery to check the battery health. Mine was showing 30% of original capacity, which explained why it kept draining quickly. Resetting (recalibrating?) it by letting it drain completely and letting it charge got it back up to 80%. I don't know why it had gotten into a bad state :(
> I don't know why it had gotten into a bad state :(
Just ordinary use makes them that way. Calibrating is necessary for all lithium batteries, but normally you charge when you think it's time, and don't wait for the battery to get empty first.
My own iPhone 7 is plenty fast, but I have noticed that the older models seem to be getting slower.
Presumably this is because Apple keeps adding features and animations and so on, and so iOS grows bigger and more resource-hungry, as are apps, and developers aren't optimizing by running their stuff on old iPhone 5 models anymore.
Perhaps not over months, but over years OS bloat becomes nontrivial. Subjectively, my phone appears to slow down after most updates; objectively it definitely loses usable space to the operating system. App bloat displays a similar problem, albeit less blatant.
This seems to mirror Wirth's Law more generally; every piece of software on your phone gets more demanding every time it updates.
I've got a Galaxy S3, an S4 couple first generation Moto Gs, a 2012 and a 2013 Nexus 7, a Note 2, I think? 2 Note 3s, and a One Plus One in various states of: loaned to cousins, used as house phones, backups in a drawer, backups in cars, or lying on my desk.
They were all either broken, bought at yard sales, given to me by clients/contacts who didn't want them, or <$20 on eBay.
In general, four things kill these devices:
Touchscreen breakage. It is almost never worth trying to replace a cracked screen.
Flash burnout. Shitty flash chips don't last forever. I've binned almost every phone older than this crop because the flash memory dies.
Charger port wear. Micro-USB sucks, and replacement parts vary wildly by model - I can usually get an S-series charger port for <$5, but replacing a Droid phone's charger port once proved impossible because the charging harness was soldered to the PCB.
Software. I generally outright ignore devices without a ROM scene and an unlocked bootloader, but even then it is entirely up to volunteers how long Cyanogen/Lineage/Paranoid/etc. keep supporting these fossil kernels. The S3, Note 2-3, and original Nexus 7 are all on their deathbeds because of lagging community support. It is worth mentioning, however, that the Samsung devices have now been community-supported for longer than their official support periods lasted. Great job, Samsung.
Batteries are usually a non-issue. You can buy shitty Chinese knockoff batteries (or if you are lucky Anker) that don't hold a charge and don't last long, but you can keep these devices running on bootleg parts for a while.
The software is the ultimate killer. What should be the easiest thing to maintain is the hardest, because corporate greed and hunger for control trump customer respect. All my mobile devices are cheap, used, or broken when I get them, because none of these exploitative abusers are worth giving a direct cent to.
Sadly, mobile flash doesn't support SMART monitoring. There are three indicators, but your flash can randomly fail without any of them being observable:
Sector reallocations. As flash cells stop accepting writes or reads, the package reallocates their data. This process is intensive and usually lags out the phone. If the whole phone freezes when you move large amounts of data onto or off the flash, this can be why.
Stunted read/write speeds. As the flash degrades and more sectors go bad, read and write performance suffers. Fragmentation gets worse as working sectors dry up. If your phone benched ~80 MB/s reads or writes the day you got it and is down to ~20 MB/s five years later, it is likely nearing a failure point. This is usually gradual aging, but you often see a steep, sudden performance crash just before the whole chip becomes unusable.
Crippled access times. The former was data rate; this is data latency. Latency should stay near-constant over the lifetime of the chip, so if it starts climbing for very small reads, the chip's controller can be dying. Which happens, because a lot of corners are cut in phones, and flash MMUs are often really, really cheap.
There is also the rare chance you find a corrupted file that used to open fine, which can in extremely rare circumstances mean your phone has run out of unallocated sectors and is now losing capacity, including written data. But that is highly unlikely - flash almost always becomes unwritable well before it becomes unreadable, and your phone will fail before unreadability manifests en masse.
It would be useful if we could get A. lifetime write averages for the flash chips in popular phones and B. a way to trace that number throughout the lifetime of the device, but we have neither, so you are almost always flying in the dark on when your phone's memory will die.
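Lacking SMART data, the best you can do is build your own trend line. A crude periodic write benchmark like this (a sketch; the path and size are arbitrary, and on Android you'd run it under something like Termux) at least lets you watch for the decline described above:

    import os, time

    PATH = "bench.tmp"        # arbitrary scratch file on the flash under test
    MB = 1024 * 1024
    SIZE = 64 * MB

    data = os.urandom(SIZE)
    start = time.perf_counter()
    with open(PATH, "wb") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())  # make sure the bytes actually hit the flash
    elapsed = time.perf_counter() - start
    os.remove(PATH)

    # Log this figure over months: a slide from ~80 MB/s toward ~20 MB/s
    # is the gradual-aging pattern described above.
    print(f"write throughput: {SIZE / MB / elapsed:.1f} MB/s")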
Yup. My wife is very fond of her Nokia 1020, which is now four years old. Gorilla Glass means the screen is still in mint condition, but the whole thing is being let down by software and flash issues. The camera app now crashes frequently, and there are long pauses when saving photos or doing much of anything else.
> There must be anti-trust enforcement with the death penalty – corporate dissolution – for companies that are caught cheating.
This was the norm in the US until the late 1800s. Indeed, corporations had to act in the public interest. And if they didn't, they were dissolved.
But then, the railroad corporations got wealthy enough that they were able to buy favorable Supreme Court rulings. Basically, they got human rights. After former male slaves, but before women.
This Harvard Business Review article [1] interviews a reputable professor on the topic. America adopted English law corporations, replacing the Crown with the states. Early corporations were limited compared to today's because early enterprise was fairly limited--the American Revolution occurred on the cusp of the Industrial Revolution.
You are correct in stating that it took political capital to form a corporation in Revolutionary times. But that didn't mean they existed at the pleasure of the state. Furthermore, the close nexus between corporate chartering and political proximity showed its seams in the 19th century. That's why states moved the chartering process to independent bureaucracies.
Hmm, I suspect the article is not really news to the people who frequent sites like this, and perhaps not even to the readership of a science-fiction mag like Locus.
But I would like a nice readable article like that to appear in more mainstream publications. It should make a good story, being both true and sensationalist and important at the same time.
> being both true and sensationalist and important at the same time
Doctorow has been sensationalist for as long as I've been reading him (15 years?), I think even moreso as time goes on. Too often, it detracts from his point, but sometimes he hits the nail on the head.
I really enjoy Doctorow, but he's been getting into territory where the hyperbole becomes factually inaccurate. His coverage of the Wells Fargo debacle featured some headlines that simply weren't true, which really put me on guard when reading his other stuff.
I'm surprised no one has mentioned Tesla in this context. Not only do they make it almost impossible to get a Tesla car repaired anywhere but their service centers, but they also collect a ton of data [1].
>instead, it tries to trick the reviewers, attempting to determine if it’s landed on a Car and Driver test-lot, and then switching into a high-pollution, high-fuel-efficiency mode.
This has actually been the case for some time. The car magazine wouldn't just go borrow a car, they would get one directly from the manufacturer. And the manufacturer would send a ringer, a vehicle with an EPA test-exemption that doesn't have to comply with any emissions regulations.
I suppose the era of YouTube car-review channels is bringing that method to a close, though.
I'm curious about the WannaCry situation. If the killswitch was truly there to detect being in a VM, couldn't they have bought the domain themselves and just left it unresponsive? Or, even better, generated a random new domain every single time?
I guess they just didn't foresee someone buying the domain.
And how effective is that defense, anyway? I mean, couldn't you just clone the VM, and start responding or not responding to different domains until you hit on the right one? Or hell, have your VM check a domain and respond correctly if the domain isn't registered?
Yeah, this part was confusing to me as well. How would a VM "simulate" the internet? Does it return a dummy page? I don't understand what mental model of VM behavior the virus creators had in mind when they designed this killswitch.
The part about how to simulate a site isn't confusing; the part about why you'd want to do that is. If I were researching malware in a VM, I'd want to take a very passive approach, allow it to talk to the real internet, and inspect the traffic.
By intercepting every DNS request and routing it to your honeypot server, which logs the packets the malware sends to the mothership. What the malware author should have done is randomly generate the DNS name instead of hardcoding it, at the cost of the unlikely chance of hitting a valid domain.
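Concretely, the malware-side check boils down to something like this (a minimal sketch of the idea, not WannaCry's actual code; the domain below is a placeholder):

    import urllib.request

    # Placeholder for a long, gibberish, deliberately unregistered domain.
    KILL_SWITCH_DOMAIN = "xj4mq9zpltw2rkv8-placeholder.com"

    def looks_like_sandbox() -> bool:
        """If a domain that should not exist answers, assume the 'internet'
        is being simulated (e.g., a sandbox faking all DNS responses)."""
        try:
            urllib.request.urlopen(f"http://{KILL_SWITCH_DOMAIN}/", timeout=5)
            return True   # got an answer: the network is probably fake
        except OSError:
            return False  # NXDOMAIN or timeout: looks like the real internet

    if looks_like_sandbox():
        raise SystemExit  # halt - this is the 'killswitch' firing

Which is exactly why registering the real domain flipped the check for every infected machine at once.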
It's called a DGA, or domain generation algorithm. I believe Conficker was one of the first botnets to use this approach, about 10 years ago; it's pretty standard tradecraft nowadays.
Well, this is slightly different. A DGA actually uses the generated names as rendezvous hosts, whereas in this case it is literally a dummy domain they didn't even bother registering, whose only purpose is checking whether they were in a VM. In which case generating any long random-hash domain would most likely be unregistered and would do the job.
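For contrast, a rendezvous-style DGA looks roughly like this (a toy sketch; the seed and scheme are invented, not Conficker's or anyone else's actual algorithm):

    import hashlib
    from datetime import date

    def daily_domains(seed: str, day: date, count: int = 10) -> list[str]:
        """Derive pseudo-random .com names from a shared seed and the date,
        so bot and botmaster can compute the same list independently."""
        domains = []
        for i in range(count):
            digest = hashlib.sha256(f"{seed}|{day.isoformat()}|{i}".encode()).hexdigest()
            name = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:12])
            domains.append(name + ".com")
        return domains

    print(daily_domains("toy-botnet-seed", date.today()))

The botmaster registers just one of the day's names; defenders have to sinkhole or block all of them. A killswitch domain, by contrast, only has to be improbable, so a freshly generated random name per run would have sufficed.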
Don't know about iTunes, but much along these lines Sonos removed support for Windows DRM and ticked off customers who had their music stored in that format. Amusingly, the link I found about it is also a post by Cory Doctorow.
I suspect the Kindle feature he's talking about is the text-to-speech feature, which the publishers hated because it threatened audiobook sales. Or maybe he's talking about Amazon deleting books from customers' Kindles?
The reason hackers go after Sony is that they installed rootkits on every one of their Windows customers' machines. Then they apologized and released a "fix" that only hid the rootkit.
Besides, it must be fun to keep hacking the same company and seeing that they haven't changed anything in their IT.
I'm pretty happy to have not bought a single Sony product since 2005. I think that people should boycott brands that spit in their soup, but Doctorow makes a better point: these companies should be killed.
I remember that one, although IIRC they did it because Linux-on-PS3 was being used as a way to get around the copyright restrictions on the PS3 and pirate games. Basically, the pirates ruined it for everyone.
Sony didn't really have a choice. They have to protect the integrity of their system as best they can, which means closing any holes that are used to circumvent the copyright restrictions. Besides just being important to protect sales on their platform, it wouldn't surprise me if there were contractual obligations here as well.
That doesn't really have any bearing on Sony's contractual obligations as a console vendor. And you'll note PC games are often DRM-laden, and the DRM vendors surely have their own contractual obligations to do as much as they can to prevent their DRM platform from being compromised. Similarly, I know for sure that back when Apple sold DRM-laden music, they had contractual obligations to the music labels to fix any FairPlay holes they could.
They should be given kudos for even trying, not blame for failing at what appears to be an impossible task. Sony thought they could offer Linux-on-PS3 and still keep the PS3 secure. For a while it seemed to work, but eventually it was demonstrated that Linux-on-PS3 broke the security model of the PS3. Sony really had no choice, because it's a gaming console first and foremost and they had to protect that. Yeah it sucks for everyone who was interested in using it as a Linux machine, but if you can't recognize that Sony had an obligation to protect the gaming console over the Linux support, I don't really know what to say to you.
May have been the one that "helpfully" deleted your music and replaced it with a compressed version, even if you didn't buy it via iTunes. Or deleted your files, period, if it detected a not-so-"equivalent" version you could download.
Supposedly to save local disk space.
I recall an account of a composer losing his work files that way - they got compressed behind his back. Thank goodness he had a backup.
Certainly some iTunes features have been removed over the years, but I don't know if it's anything that anyone would care about. The "Ping" music social network is gone, for example.
I wonder if we can require software to be open source for the same reason that food has to list its nutritional information and ingredients. Information asymmetry makes a deal unfair.
HP is an egregious cheater, and this kind of cheating is in the DNA of any company that makes its living selling consumables or service at extremely high markups – they do their business at war with their customers.
This is a very strong statement. Asking for high margins puts you at war with your customers?
Yes. Normally, in a functioning economy, competition drives out high margins unless innovation is occurring. To protect against competition, HP uses things like DRM in printer cartridges. It's anti-consumer behavior when your ink cartridges stop working because you didn't use them fast enough and you're forced to buy a new set.
I was focused on the middle. "this kind of cheating is in the DNA of any company that makes its living selling consumables or service at extremely high markups"
In a world populated by IOT devices full of software (as discussed previously https://news.ycombinator.com/item?id=15034955), we'll end up in a post-scientific world where the underlying rules that govern a device's behavior are so complex and arcane that we'll have little chance of reverse engineering how basic devices work anymore.
I think in practice it will mean that devices become bricks relatively quickly, and when people realize they have been cheated, there will be a strong backlash: imagine "paleo diet" but for devices.
History repeats itself. Car makers tried to lock out third party parts decades ago. Claimed you only had a license to operate the vehicle, no ownership. Music and movie companies argued against First Sale doctrine similarly.
Courts wouldn't have it. They will stop this too. Digital property will be declared property, not licenses. No limits on resale transfer or rental and the like. Companies will howl like stuck pigs. And it will benefit them, as well as consumers, tremendously.
What I worry about is the regulations and certifications many other industries have to curtail cheating may someday be needed in our domain. I do not look forward to the day that happens.
Can anyone elaborate on his comment about this sort of technology "proliferating to smart thermostats (no apps that let you turn your AC cooler when the power company dials it up a couple degrees)"?
I read about a program where owners of some Honeywell smart thermostats will be able to authorize their utilities to turn their AC a few degrees warmer during times of unusually high demand, in exchange for some sort of compensation. But this would be voluntary.
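The opt-in logic is simple enough to sketch (hypothetical code; the names and the 3-degree offset are invented, not Honeywell's actual program):

    def adjusted_setpoint(base_f: float, grid_event: bool, opted_in: bool,
                          max_offset_f: float = 3.0) -> float:
        """Demand-response rule: during a grid event, raise the cooling
        setpoint a few degrees, but only for owners who opted in."""
        if grid_event and opted_in:
            return base_f + max_offset_f
        return base_f

    print(adjusted_setpoint(72.0, grid_event=True, opted_in=False))  # 72.0
    print(adjusted_setpoint(72.0, grid_event=True, opted_in=True))   # 75.0

The worry in the article is the version where opted_in quietly stops being a parameter the owner controls.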
To me, the bigger issue here, beyond being cheated in the casino, is the increasing amount of technology around us that we depend on. In a few years, household robots will be common.
I really want to have my robot servant - but only if it really is MY servant, controlled by ME and not by someone else I do not trust...
Every year a shocking number of people die from illness caused by air pollution [1]. While it might be difficult or impossible to put a head count on "Dieselgate," the increased emissions certainly killed some.
"While it might be difficult or impossible to put a head count on "Dieselgate,"..."
Well, that was kind of my point. It's no less of a scandal, and BMW still deserves to be punished, but just casually linking their actions to deaths seemed hyperbolic. Assuming the MIT study is solid, it's still possible that BMW's malfeasance has resulted in zero deaths.
I don't mean to distract from Doctorow's article -- I don't disagree with the central premise -- but "Dieselgate killed people" just struck me as casually lazy. It's the kind of probably-technically-right-but-unproven meme that, repeated enough, is accepted without question. We have enough of that in society today.
If you go down that route, your farting contributed methane to the atmosphere, increasing global warming and thus contributing to killing someone somewhere.
So are we going to start banning the passing of gas at the dinner table?
The problem is that it cheated the air-quality standards society has set. Our government (however representative) decided on an acceptable amount of illness (whether you agree with it or not) due to industry in general and transport in particular, and the company cheated that standard. The implicit social contract assigns the government responsibility for deaths due to societal trade-offs like this. Once a company violates the law, it has broken that contract, and hence it is no longer protected by the government's shouldering of that burden. They killed people.
Common sense? If some quantity of emissions n causes a number of illnesses resulting in death, D(n), then emissions of n + x will result in D(n + x) illnesses/deaths; since D is increasing, D(n + x) > D(n) for any x > 0. So if 'dieselgate' resulted in a sizeable increase in pollution, it also resulted in increased illness and death.
The author seems to be asserting that, indeed, x was positive, and therefore people died as a result of dieselgate.
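To make the arithmetic concrete (every number below is invented purely for illustration, under a linear dose-response assumption D(n) = k*n):

    # Illustrative only: the numbers are made up; only the structure matters.
    k = 1e-4      # assumed deaths per excess ton of NOx
    n = 100_000   # baseline tons emitted under the standard
    x = 40_000    # extra tons emitted because of the defeat devices

    def deaths(tons: float) -> float:
        return k * tons          # linear dose-response: D(n) = k * n

    excess = deaths(n + x) - deaths(n)   # = k * x, positive whenever x > 0
    print(f"excess deaths attributable to cheating: {excess:.0f}")

Whatever the true shape of D, as long as it's increasing, a positive x means a positive number of additional deaths.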
Counterpoint: software such as thermostats that control our energy consumption, or phones that lock themselves while you're driving, can be a good thing precisely because some people can't consume responsibly on their own.
Is it nanny-state? Yes, but maybe some people need a nanny.
The problem is that in the nanny state regime, the state gets to define what counts as "responsible". Which really means certain people (those with wealth and power) get to impose their definition of "responsible" on everyone else.
Well, if some people need a nanny and most don't, how do you get those who need a nanny to buy the version of the product that has the nanny? Either you force everyone to get the nanny version (heavy-handed, especially for those who don't need it), or you rely on the people who need it choosing it themselves (which only helps those who can admit they have a problem).
The only other alternative is something like having some objective event trigger a legal requirement to have the nanny version. For example, a driving-while-texting ticket could trigger a legal requirement to have the nanny version of a phone that won't text while you're in a car.
> having some objective event trigger a legal requirement to have the nanny version
The problem with this is, who gets to decide which objective event is the trigger? There is no real solution to this problem; it always ends up with a hodgepodge of arbitrary rules that benefit certain people (those with wealth and power), but are a net loss for society as a whole.
Well, either the trigger is legally enforceable, or it is not. If it's legally enforceable, then a legislature gets to decide. If it's not, then either the manufacturer decides, or it doesn't happen.
> If it's legally enforceable, then a legislature gets to decide.
Which in practice, at least in the US, means that unelected bureaucrats get to write the detailed regulations that actually decide, based on extremely vague and broad laws written by the legislature. Which in turn means that people with wealth and power can manipulate the system to get regulations passed that benefit them but are a net loss to society as a whole.
> If it's not, then either the manufacturer decides
You make it sound as though this is not desirable. To me it seems likely that career professionals have a better chance of being effective and unbiased than elected officials, whose campaigns need to be funded and who possess no qualification or experience beyond being adult citizens.
> it seems likely that career professionals have higher chances of being effective and unbiased than elected officials
This sounds nice in theory. In practice the unelected bureaucrats will not be "career professionals" with the proper professional ethics and willingness to be unbiased. Look up "regulatory capture".
Guess I'll keep my Honeywell mercury tilt switch thermostats a little while longer.
As for the phone - if things start going this route, they can keep the damn thing. I'll figure out some other way to communicate, I don't have to have their phone.
Furthermore - in regards to the thermostat setting: If I have the money to pay my bill for what I use, what damn business is it of anyone what I do as long as I can pay for it, and it isn't directly harming them?
Again - they'll have a hard time, though, bypassing tech that's 100+ years old...
But if the power company says "well - you have to have this to get power from us" - then that just moves me one step closer to going off-grid.
As much as I'm loath to make the slippery-slope argument: who decides where the line is? What is "okay" nannying, and what pushes the envelope? I think there are other ways of encouraging behavior (e.g., non-linear energy pricing for electricity) that don't totally remove free will from the equation.
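For instance, an increasing-block tariff makes heavy consumption progressively more expensive without forbidding it outright (the tiers and rates below are invented):

    def tiered_cost(kwh: float) -> float:
        """Each successive block of consumption costs more per kWh."""
        tiers = [(500, 0.10), (500, 0.15), (float("inf"), 0.30)]  # made-up tariff
        cost, remaining = 0.0, kwh
        for block_size, rate in tiers:
            used = min(remaining, block_size)
            cost += used * rate
            remaining -= used
            if remaining <= 0:
                break
        return cost

    print(tiered_cost(400))   # 40.0  - everything in the cheap block
    print(tiered_cost(1500))  # 275.0 - heavy use reaches the expensive block

You still decide how cool to keep the house; you just pay steeply for the privilege.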
But of course the problem here is that the nanny is the corporation, which is just as inclined towards -- and far more capable of -- advancing itself to the detriment of the consumer.
All snark aside, there are often situations where it makes sense to want to be controlled. Coordination problems, in short. There are situations where everyone is better off if everyone follows a particular rule, but individuals are personally rewarded for breaking it. Say, "don't catch all the fish in the lake", or "don't have loud parties late at night that wake up your neighbors".
Following the rule yourself won't magically make other people follow the rule as well. As far as the mobile phone example goes - yes, anti-texting-while-driving software is something I would not install on my phone if given the choice. I'd consider making it mandatory to install it on all phones, though. I'm responsible enough to not text while driving, but whether I make that decision correctly doesn't keep someone else from plowing into me.
https://en.wikipedia.org/wiki/Chromium_(web_browser)#Communi...
Plus, people tend to act more morally when they think they might be watched, whether they actually are watched or not.