The Internet of Poorly Working Things (mondaynote.com)
321 points by kawera on Aug 22, 2016 | 306 comments



Thank you for that article! I have been feeling like that for many months, and whenever I point it out to co-workers or friends, they mention the Apple universe as the ultimate achievement of interconnectivity. And in a one-vendor universe, the world may indeed be okay right now. Apple did a fine job there.

However, interconnectivity between devices (and software) of different vendors seems to get worse and worse; standards seem to have become irrelevant. Time to market is the only thing that matters, and long-term customer satisfaction and durability seem to be of no importance any more. Vendors just don't care about integration with each other any more.

When I saw Minority Report (2002) a few years ago, I thought dragging windows and applications across devices with a gesture would be a possibility in the not-too-distant future. Now, in 2016, this not-too-hard-to-develop feature seems almost impossible to imagine. Sharing content between devices is utterly painful or even impossible: copying large files between computers on the same Wi-Fi without going through the Internet; playing a video from your Android phone on a Samsung TV; moving application state from your laptop to your desktop PC when you leave work; playing music from your Android phone in a brand new Audi via Bluetooth ... All of these things would be absolutely achievable if vendors worked together or standards were developed and followed. Right now, though, it just looks like technology fragmentation is getting worse every day.


I was going to bring this up too, and you got close, but you're just saying it's a lack of standards, and I'd say it's something far more nefarious and intentional: brand lock-in.

Brands are what's missing from all the sci-fi films where this stuff works. In Minority Report's D.C. crime lab, the displays aren't Samsung monitors hooked up to iPads, and THAT is what makes them work so seamlessly. No vendor wants to allow you to buy from other vendors, so naturally the only time you get a seamless experience is when you sell your soul to the devil of your choice (in my case, Apple), and then your iPhone, your MacBook, your iPad and your Apple TV all work miracles right before your eyes.

If I ever get an Android phone, it will never sync up properly the way the iPhone does. And I just have the garden-variety stuff; if you buy any of this IoT crap, you damn well better hope that your given manufacturer will be around long enough to make all the things you want, and that they offer good support.

I have a lot of plans to make our home smart once we buy one, but when I do it will be built on open source modules that I can modify and control, and more importantly SERVICE myself when they inevitably break down after a time.


This is capitalism at its finest. Everyone just redoes the same things, occasionally with a slight spin on it followed by heavy marketing.

No one cares about interoperability; no one cares whether all the data and content we're generating today will be accessible in a few decades or centuries.

Standards bodies are being taken over by company interests and we're still piling stuff on top of a technology stack designed with a ~50 year old technology landscape in mind.

No one wants to think further than next quarter's profits.

Our industry has gone to shit.


> This is capitalism at its finest. [...]

> No one cares about interoperability; no one cares whether all the data and content we're generating today will be accessible in a few decades or centuries.

It's not a consequence of capitalism alone. Capitalism wouldn't result in this if its actors were rewarded for interoperability and future-proof access to data. Sadly, customers don't care about these two aspects (or long-term durability of a product, or serviceability), so almost no company makes any effort to provide them.

> Our industry has gone to shit.

It's hardly specific to IT/consumer electronics. You have the same with cars, tools, clothing, everything really.


> "customers don't care about these two aspects (or long-term durability of a product, or servicability)"

Customers have no realistic way of assessing those things, especially in a market where all the products are obsoleted every year or so. It's Akerlof's "Market for Lemons" all over again.


A vast number of consumers are still woefully ignorant of technology, but that is changing: younger customers are more demanding of tech companies and have less loyalty than any group before, meaning these companies will have to start changing or be replaced by ones using more open frameworks.


It's not that kind of ignorance, it's that you literally can't assess the quality of a product before buying it and testing it. That information is not available. As others have pointed out, UL and CE impose certain categories of pre-sales testing to guarantee certain safety properties. But there's no guarantee that the product will actually work in particular ways and continue to do so.

You can't expect that everyone should, before buying a product, perform their own accelerated-life testing and code security audit.

(I think we've yet to have the statutory rights lawsuits where people's IoT devices stop working after a couple of years and they claim this is a "manufacturing defect", which under UK law must be warrantied for six years.)


I'm not even talking about security; that's a whole other dog and pony show, and you're right about it - the fact that there's no kind of oversight there is insane. We make sure car manufacturers install air bags, yet identity theft, thanks to the garbage mechanisms in place to handle it, is almost certainly as damaging as going through a windshield, if not more so - especially when you consider that IoT startups can give away your life savings and then just ¯\_(ツ)_/¯ when you ask for their help.

But I agree, there is woefully little information even about what I was talking about which was interoperability, or even just functionality. Pretty much your only recourse is to look into reviews of existing products from a given manufacturer and hope that said review wasn't bought and paid for.


While I'd love this development to be true, in my experience it's the opposite: Younger customers are at least as ignorant as the older generation, if not worse.

And long-term loyalty is even less of a concern: it's all about the new gadget/social platform/etc. of the day.


You hit the nail on the head. It's our job as consumers to care about SD card slots in smartphones, replaceable batteries, etc., if we really want these things.


You all are not seeing the forest for the trees. Using a sci-fi film as the yardstick for what interoperability should look like is silly.

(This is in response to the general angst over interoperability, not the IoT space, which I agree is underwhelming at the moment. However, in what I assume to be opposition to most people in this thread, I am hugely optimistic (if not yet invested).)

Sure, every once in a while something that would appear to be easy turns out not to be, and that's really annoying. But most things work well most of the time. Transferring state between devices? That's the cloud. My email is in sync 100% of the time between 4-5 different computers and devices of different brands and OSs. Dropbox takes care of my personal files, box.com of those for work. I frequently collaborate on documents in both Google Docs and Quip. Not too long ago, transferring files between Mac and Windows was hard (I think Macs had a proprietary compression program that wasn't available on Windows?) - luckily that's entirely in the past. USB is ubiquitous; FireWire, PS/2, ADB, serial, all gone. A Mac keyboard works on a Windows box, and vice versa. Bluetooth definitely has kinks, but it also has a lot of "just works" along very long stretches. With a few annoying exceptions, people using Linux, Macs and Windows can work together seamlessly. Websites (with a few annoying exceptions) generally work well in all major browsers on all major platforms, on desktop, tablets or mobile (iOS, Android and Microsoft). I got a new wifi printer a few days ago - after joining the network, it was automatically available on our computers. Took maybe five minutes, no messing about with IP addresses and drivers.

I can cast to a Chromecast on my TV from my Mac, my Android and random guests' iPhones (I can also "cast" YouTube directly to the TV's YouTube app, but it's pretty flaky - and that one is some fancy standard, as opposed to the Chromecast; it's just that Sony is shit at implementing the standard). I can play music from Spotify from my laptop or smartphone on a Denon speaker in the kitchen, a "GramoFon" wifi sound device attached to my "dumb" stereo, and my dad's "smart" Marantz stereo.

Interoperability is doing fine, but yes, it's messier than it might have been if some magic omniscient body had come up with clean standards for all this. That doesn't mean that it's not there.


A few counterpoints:

> Transferring state between devices? That's my butt.

No, it isn't. It is for transferring state between instances of the same application (or group of applications by the same vendor) across devices.

> Dropbox takes care of my personal files, box.com of those for work. I frequently collaborate on documents in both Google Docs and Quip.

Did you try to make them work together? Oh, you can't really, because Google Docs decided to be cloud-first and you don't have files with actual data on your hard drive anymore. You can't open them in a third-party application anymore.

Here's the thing - seamless operation is getting worse than it was a few years ago. A lot of that came from the push to cloud and mobile - even with proprietary formats on the desktop you could work with the files using third-party apps, because the data was actually on your hard drive. Now the cloud services have locked your data in, and they're giving you access only through a channel of their choosing (which is usually a lowest-common-denominator webapp).

What we see now is a lot of companies trying to commoditize each other. Third-party software developers try to commoditize the platform makers by routing the data through the Internet. So sure, Spotify works and syncs up nicely between your iPhone, Android tablet, MacBook and Windows PC. But why on Earth do I have to use the Spotify app to listen to music, instead of - I don't know - Foobar2000? That's right - because the files are not there.

> Using a sci-fi film as the yardstick for what interoperability should look like is silly.

Actually, I think it's very good and sane. It's a perfect yardstick - because we get to ignore all the market forces and imagine how things could work if they were designed to be actually useful. And then we can ask ourselves why things are not like this, and how to make them more like this.


Your comment quotes mseebach as saying "Transferring state between devices? That's my butt." I think you left Cloud-to-Butt turned on in your browser. That's not always a good move when reading HN.


> I think you left Butt-to-Butt turned on in your browser. That's not always a good move when reading HN.

Indeed I did. Tech news is less bad for your sanity with it enabled.


If you're twelve years old, I guess.


I'm 31 and find it immensely satisfying to read about the latest news and advancements in the butt field.


You're speaking entirely in terms of actions that you can't perform, rather than outcomes. I use Google Docs and Quip because what I want to achieve is collaboration on a text document. None of that requires me to make the two work together. That said, I just checked: Quip offers export to Word, PDF, Markdown, HTML and LaTeX - Google Docs to Word, ODF, RTF, PDF, TXT, HTML and ePub. I think I might just find a way.

I used Spotify to illustrate interoperability between brands, but all of the mentioned devices support (many) other sources as well, including DLNA.

> we get to ignore all the market forces and imagine how things could work if they were designed to be actually useful.

Movies show things that are pretty, not useful. It's a flat-out cliché that practically anything that happens in any movie (not just in tech, but in pretty much any field) looks ridiculous to people who actually know a little about what's going on.


Spielberg gathered technology experts and futurologists to do the imagining part in preparation for shooting the film: http://www.wired.com/2012/06/minority-report-idea-summit/


dude. in all of these examples, you're the interoperability stack. everything requires interventions from you. it's supposed to be the internet of things, not the internet of my things that i make work somehow. go to someone else's house and make all this work without significant amounts of effort. that is what i want. i should just be able to walk into someone's house and say "hey check this out, ok google - play the last youtube video i saw on the biggest screen in the house" and it should work, dammit. that's the future i want


> But most things work well most of the time. Transferring state between devices? That's the cloud.

And all we had to give up was security, privacy, reliability, longevity, speed and more money. :-(

Unfortunately, as with so many adverse consequences when IT goes wrong, most non-technical people don't really understand the risks until something bad happens to them, and by then it's too late. In fact, these days with the trend for trying to outsource IT instead of maintaining in-house expertise, even a lot of technical staff don't seem to understand or properly control the risks. Just look at how many businesses grind to a halt every time one of the major cloud services has a significant outage.

The move to Internet-hosted services and subscription-based products is entirely understandable from the industry's point of view: it gives them lots of new ways to exploit their customers and make more money.

However, from the customer's point of view, I think we would be much better off if we invested more effort in decentralisation, standardisation and interoperability, and "private clouds" and VPNs. There are few advantages for customers to having important functionality reliant on a very small number of huge service providers, as opposed to having many smaller providers able to offer compatible variations and having options for self-hosting with decent remote access and backup provisions.

Unfortunately, we seem to have reached a kind of equilibrium now where the huge players are so utterly dominant in their industries that disruption is all but impossible. Their worst case is that they buy out any potential serious threats before they're big enough to become actual threats, but much of the time, the lock-in effects create sufficient barriers to entry to protect the incumbent anyway. There is no longer effective competition or disruption in many IT-related markets, just a lot of walled gardens where you pick your poison and then drink as much of it as they tell you.

I'm sorry to say I don't see any easy way to break the stranglehold the tech giants now have and get some competition and interoperability back into the industry. It's going to take someone (or possibly a lot of someones) offering products and services that are both competitive in their own right and built with a more open culture in mind to disrupt the status quo now, and it's hard to see either startup businesses or community-led efforts achieving escape velocity any time soon.


"The cloud" only works when there is unlimited high speed broadband available everywhere. That's not even remotely close to being reality, and so long as the ISPs in the US are allowed to continue down their current path, it will never be reality.


> when you sell your soul to the devil of your choice (in my case, Apple), and then your iPhone, your MacBook, your iPad and your Apple TV all work miracles right before your eyes.

Even knowing I'm going to hell, one button press to chuck my Twitch stream from my phone onto mum's Apple TV makes it feel worth it.


Oh geez, right? Or just tapping a show or movie and having the Apple TV show it straight away, or using the Apple TV as a sound output for a MacBook...

This might just be me, but I don't have that good of an experience with Android, which is why I switched.


Presumably your Apple TV doesn't say "Apple TV (100)" like mine does when I connect to it. I've got some sort of virulent strain of discoveryd and it's awful.


This happens if your Apple TV is on the LAN and the WLAN at the same time and the two are connected, either because they are one network or because you use avahi-bridge between them. It thinks the other interface is someone else using the same name, so it adds these numbers to make a unique name.
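
(The renaming is just standard mDNS conflict resolution: when a responder sees its own name already claimed on the network, it appends an incrementing suffix until the name is unique. A toy sketch of that logic in Python - my own illustration, obviously not Apple's actual code:)

    def resolve_name_conflict(desired, names_on_network):
        # Mimic mDNS/Bonjour behaviour: append " (n)" until unique.
        candidate, n = desired, 1
        while candidate in names_on_network:
            n += 1
            candidate = "%s (%d)" % (desired, n)
        return candidate

    # Each time the Apple TV "sees itself" on the other interface,
    # the set of taken names grows and the counter creeps upward:
    print(resolve_name_conflict("Apple TV", {"Apple TV", "Apple TV (2)"}))
    # -> Apple TV (3)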


This is a good explanation, thanks.


Out of curiosity, have you tried a factory reset? If not, I'd get in contact with Apple's tech support; they've always been remarkably helpful to me and they don't make you jump through hoops of fire to get help.


I did a reset on the way up to 100, but now that it's there, I'll guiltily admit that I'm liking the number.


Exactly what I thought when he brought up playing videos from Android phones on Samsung TVs. The whole thing works, but only when your TV and phone are from the same vendor.


I dunno, even in the ideal universe of Apple devices talking to each other I have headaches. AirDrop is my main bugbear: sending files (pictures) from my iPhone to my MacBook Air should be as easy as it gets, but AirDrop has failed to discover the MBA for the last month or so, there's no direct Bluetooth file transfer as far as I can see, and Mail kinda works as a bit of a kludge but warns if the attachment is over a certain size. And iCloud is something I'd rather not touch (Apple's cloud services can go take a hike, after Apple Music decided that a huge chunk of my music should be deleted forever).

I had phone->computer file transfer working via Bluetooth back on a sketchy old pre-Android Motorola; if such a simple workflow fails, what hope is there for anything more complicated? </rant>

edit: sorry, this is slightly OT - it's irritated me for a while and this felt like a good time to share/vent!


I think Apple's problem with this stuff is that they want it to be so seamless that they refuse to acknowledge the possibility of failure. If you're lucky they'll tell you that something went wrong, but they almost never say what. In many cases they just swallow the error silently. I run into this with AirPlay a lot. When it works, it's easy, but when it fails it's a total mystery as to why. Fixing it involves either retrying until it works, or rebooting whatever is handy in the hope that you'll stumble across whatever's responsible.


This pattern seems to be ingrained in Apple at a deep, genetic level.

Their product integration is the most obvious example, but it's far from the only one. Everyone knows that Apple computers "just work", and that when they don't you throw them in the bin because everything is too connected for easy repair.

Even bureaucratically, I once spent two months trying to convince Apple that my MBP existed. They swore no machine with that serial number had been made, and flatly refused to service it until I gave them the 'real' number.

None of these things get resolved without forcing a real human to acknowledge the issue you're facing. The one thing I'd say in Apple's defense is that it's mere hubris, which I find far more pleasant than Google's support outlook of "yeah, it's broke, now go to hell."


That is deeply true, and has been for decades. In the old days (90s) the Mac world often smirked at the ubiquitous "Abort Retry Fail?" refrain that Windows users often encountered.

But even then, I noticed that the Apple equivalent tended to be, "...". Not much different today.


I remember there at least being error codes you could potentially look up, back in the old days. Might just be nostalgia coloring my memories.


Turn off all extensions and reboot.


I hadn't thought about it until you said it, but that is exactly the issue - I've run into problems on OS X that have been silent/invisible but which I've been fortunate enough to root-cause by digging through logs. Sadly that's not an option on iOS :(


When I switched to Mac from Windows about 5 years ago (because I decided to become an iOS developer), this was the biggest thing I noticed. There was zero feedback, or just a quick beep out of the speakers. The anthropomorphic equivalent would be malevolent staring.

It was a tough adjustment.


That's weird to me as a lifelong Mac user because that's what I experience using Windows. My father needed help with something on his Windows machine, so I sat down and double clicked on an app. I got the hourglass for about 2 seconds and then ... nothing. The computer looked like it wasn't doing anything. No feedback. No icon in the dock-like window at the bottom, just nothing. I double clicked a few more times, and eventually the first instance showed a window. I think other instances may have started up in the meantime, slowing things down even more.


AirDrop was always hit or miss for me. I think it's even worse than file shares on Windows, which worked for me 80% of the time at best. It seems that the main issue here is peer discovery, but I'm still somewhat astounded that nobody has managed to produce an easy-to-use service discovery that works.
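
(For what it's worth, the discovery layer here is mDNS/DNS-SD, i.e. Bonjour. A rough sketch of browsing for AirPlay devices with the third-party python-zeroconf library - hedged, since I'm going from memory of its API; `_airplay._tcp` is the service type Apple devices advertise:)

    from zeroconf import ServiceBrowser, Zeroconf

    class Listener:
        def add_service(self, zc, type_, name):
            # Called whenever a matching service appears on the LAN.
            info = zc.get_service_info(type_, name)
            print("found:", name, info.parsed_addresses() if info else "?")

        def remove_service(self, zc, type_, name):
            print("gone:", name)

        def update_service(self, zc, type_, name):
            pass

    zc = Zeroconf()
    browser = ServiceBrowser(zc, "_airplay._tcp.local.", Listener())
    input("browsing; press enter to stop\n")
    zc.close()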


Unfortunately, AirDrop is two things: a file-sharing service over Bluetooth and a file-sharing service over wifi. It only recently got somewhat unified, which allowed you to use it to transfer files between a Mac and an iPhone, but it doesn't seem to be quite ready yet.

Bluetooth is not fast and reliable enough for sharing modern file sizes but there is also no guarantee two phones that are near each other will share a wifi network.

And then there are the security requirements.


It's not enough to share the same wifi network either; at least in the recent past the devices had to be on the same access point for AirPlay to work.


That's the funny thing: when this specific issue started, discovery worked in exactly _one_ direction - the MBA would see the iPhone, but the iPhone would refuse to see the MBA. Puzzling at best!


I'm a relatively avid software user, and I'm having a hard time agreeing that lacking such features is a big problem. I have almost never thought that I would need such a feature, and I haven't heard anyone else miss it.

My point is that it's very understandable that such a possibility would be useful for some people, and that they would very much like that workflow. It's just that if they are a minority, the industry will not prioritize such development - and that's a good thing, because it means more-needed features get developed instead.

In a way, everything you describe already exists in the form of web apps. Log into Gmail on your PC, do some things there, then switch to an iPad and open the same address there - voilà, you have the same app on multiple devices. It will even sync flawlessly...

Regarding file sharing - I can only speak for the technologies that I've been using personally, but Samba, Dropbox, FTP, SFTP, Bluetooth file sharing and Bluetooth audio have all been working amazingly well for me... It seems like a perfect example of a developed and implemented standard which most devices agree on. What am I doing wrong?


> My point is that it's very understandable that such a possibility would be useful for some people, and that they would very much like that workflow. It's just that if they are a minority, the industry will not prioritize such development - and that's a good thing, because it means more-needed features get developed instead.

I disagree that this is a good thing. IT is an extreme example of the fact that people don't know what they want until you show it to them. For all you can tell, seamless transfer of files between multiple devices could be a feature people couldn't imagine living without, if only they had it. But you won't see mass complaints about the lack of such features, because people have stuff to do, and they adapt their workflows to the capabilities of the tools they know - not the other way around.


You're right and wrong. You're right in that people don't know what they want. But you're wrong that the industry doesn't give it to them - in this case, Dropbox did, and we (techies) said "why would you want that, you can just [X, Y, Z]?" while regular people flocked to it because it was just so much better.

Sure, it didn't include direct LAN sync to begin with (it does now), but that's the kind of "perfect is the enemy of good" implementation detail that the vast majority of people couldn't care less about.


Yeah, I mostly meant to convey the part you say I'm right about :). I.e., I believe that a lot of useful tools appear because some people want to solve a problem for themselves, and only then do others discover the value in them.


Doesn't Dropbox achieve exactly that, and in almost the best way possible? But yes, I get your point: there are surely some areas for which no good solutions exist yet, and users don't know how useful one would be.


Dropbox is great (I've been a happy user for many, many years, and for the last year I've also been a happy paying user). But it solves a different problem - the problem of keeping your files accessible between many machines. Machines you own. The problem of direct file transfer, a.k.a. "I want this file to get from this device to that device (any device - whether mine or my friend's) as fast as possible", remains unsolved.


That's not true; it used to exist in the form of FolderSync, a productized version of Microsoft's SyncToy, which later became part of the Windows Live brand (IIRC as Windows Live Sync).

Then it went away and never came back. Why? Because Microsoft released OneDrive, a DropBox-like cloud storage system, and the 100% free FolderSync was a competitor to it. Microsoft can make money selling OneDrive, they can't make money selling FolderSync, so it's gone.

Basically, the product you're lamenting doesn't exist actually used to exist, but no longer does, because nobody can make money from it.

I used to use FolderSync to exchange multi-gigabyte video files with my friends while we were doing video editing, and it was an amazing product. It was dead easy to set up, traversed NAT and firewalls without any trouble, maxed out whatever internet connection it had access to, and used the LAN connection when possible. Now we'll never see anything like it again.

... anyway, TL;DR: the problem doesn't "remain unsolved" - it was solved, but is now no longer solved.


Dropbox solves an entirely different problem - my files stored on someone else's machine.

Seamless direct transfer among just my own machines is a different problem.


BitTorrent Sync?


Or SparkleShare, which is essentially a Dropbox clone backed by git. IIRC it doesn't use anybody else's servers


I agree with most of your post, but:

> I thought dragging windows and applications across devices with a gesture would be a possibility in the not-too-distant future. Now, in 2016, this not-too-hard-to-develop feature seems almost impossible to imagine.

Moving a running process from one machine to another, seamlessly and instantly, is not at all easy on current architectures. Not that it's impossible (well, it might be impossible to do it instantly in all cases), but it would be a hell of a lot of work, even across a single platform, and have a lot of unpleasant snags to deal with. (What happens when important app/system settings are different on your desktop and laptop?)


You assume a specific implementation. In the movie, we only see the display moving from one screen to the other. That is a problem solved since the 80s with X-Window.

More recently, all enterprise application UI are actually web UI. So that's just moving a browser window from one screen to another - just a refined version of Apple's "Handoff".

In a more hardcore fashion, we can move a whole VM almost seamlessly. At some point that could be the case with containers too, and then it may not be that far-fetched to move a single process after all.
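
(X11's network transparency really does make the screen-hopping part trivial, at least for the display: the client just connects to whichever X server DISPLAY points at. A minimal sketch, with "bigscreen" as a hypothetical host whose X server accepts the connection - via xhost, or in practice an ssh -X tunnel where ssh sets DISPLAY for you:)

    import os, subprocess

    # Run a local X11 client, but have it draw on another machine's
    # display. The app's process stays here; only the UI "moves".
    env = dict(os.environ, DISPLAY="bigscreen:0")
    subprocess.run(["xclock"], env=env)

Of course that only moves the pixels, not the process - which is exactly the distinction being made above.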


X-Window from the 80s… Are you trying to give us PTSD flashbacks?

Let’s see how many of these words sound familiar: Xinerama, Zaphod, XRandR.

Not to mention network transparency. So slow, so fragile, so never ever going to support audio or USB devices or video acceleration. I see Microsoft RDP and I weep. I wonder how the Sun Ray protocol compares.

Moving processes is possible, though impractical when devices have different processors. See the enduring appeal of things like Continuum for smartphones. Moving just parts of application state, like Apple’s Continuity, tends to have vendor lock-in and third-party adoption issues. So, similar problems as what a practical IoT ecosystem faces.


> all enterprise application UI are actually web UI

Yes, and so are most consumer apps, and "moving state" to another device is simply a question of IM'ing a link. Sure, the UI in the movie looks way cooler, but it's a movie.


That works for the barely double-digit percentage of apps which store all necessary state in the URL. In most cases, it's more like: share a link, reauthenticate, get an unhelpful error page, use the navigation to get back to where you were, learn that the work you did first wasn't saved at all or that their eventual consistency means "same day" (e.g. iCloud), and hope they don't have some halfhearted attempt at locking which will prevent you from continuing. I suspect that if this feature ever arrives, it'll be streaming rendered video like CarPlay etc., because that's the only thing the device vendor can count on.

The point isn't that this is uncharted waters technically, but that too many companies have decided a good user experience isn't compatible with their desired profit margins. In some cases, like security and bug fixes, that might change due to regulation, but that's far from certain, and it's really hard to imagine it extending to broad interoperability.


Yes, for an app to successfully transfer state, it needs to be able to transfer state - that's a tautology.

But for webapps that want to be able to transfer state, the mechanism is the URL, and for those apps, this works perfectly and unceremoniously well today.
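
(Concretely, "the mechanism is the URL" just means serialising whatever state matters into the query string, so the link itself is the transfer format. A toy sketch with made-up parameter names:)

    from urllib.parse import urlencode, urlsplit, parse_qs

    # Device A: serialise the interesting state into the link itself.
    state = {"doc": "q3-report", "page": "7", "zoom": "1.5"}
    url = "https://example.app/view?" + urlencode(state)

    # Device B: anything that can receive a URL can restore the state.
    restored = {k: v[0] for k, v in parse_qs(urlsplit(url).query).items()}
    assert restored == state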


The point I was responding to was your assertion that this was already true of most enterprise and consumer webapps:

> > all enterprise application UI are actually web UI

> Yes, and so are most consumer apps, and "moving state" to another device is simply a question of IM'ing a link.

That's a great aspirational goal, but it's simply not something most people can assume will work - I still routinely find apps from major companies where you can't even use the back button within the same session!


Exactly - it's all already there, working seamlessly, most of the time even for free. For the users who really want/need this functionality, it seems very cheap to just research those options and configure them (even if they don't come preconfigured by default on a new OS installation). So maybe it's just not such a sought-after feature?


Xen demonstrated this (for hypervisor-managed running processes) prior to 2005. It was one of the selling points of their virtualisation approach, though it turned out that live-system migration had a few additional hiccups in it. It mostly works now.

If you're looking at back-end-mediated stuff, this is pretty much what happens when you synchronise browser sessions across devices, modulo the rate of transfer. The key is managing state intelligently.

This is also done, mostly, on server-side infrastructure, where front-end systems have little to no state on them -- individual client requests come through, state is managed usually in the datastore itself.


You're right. Maybe it's not as easy as I made it seem there. It's definitely possible, though, if you think of VMware's vMotion (live migration). That implies a shared hypervisor _by VMware_, of course ... but it shows that, technology-wise, it's doable. When I saw vMotion for the first time, it blew my mind. Now all I can think of is how cool it would be to have that among different devices (laptop->desktop->smartphone->...).


That's an idea. Wrap applications in a virtual machine and all gadgets are hypervisors. Swiiiipe.


vMotion is cool, but it works on isolated VMs, which is an easier job than migrating an application running on a desktop and integrated with the rest of the system (unless you completely isolate all applications like on the mobile OSs, but then you lose other things).



I recall OpenMosix doing that years ago, but the tech never made it to 2.6 kernels.

https://en.wikipedia.org/wiki/OpenMosix


This is why I am skeptical about the job automation thing. We still can't get relatively easy stuff like this to work.


I don't get it; how can one be sceptical of something which has been happening for centuries? How many scribes copying books by hand do you know? The interoperability issues - which are mostly a matter of politics (in a broad sense), not technology - will certainly affect how fast certain jobs can be automated, but that automation does and will happen is undeniable.


> will certainly affect how fast certain jobs can be automated, but that automation does and will happen is undeniable.

But that's a relevant point. The main question in the automation debate is not whether it happens, the question is whether job destruction due to automation happens fast enough to outpace the usual job creation mechanisms (appearance of new market segments etc.).


You're forgetting the main point (at least as I read it) of the article: automation for buildings already exists; our office has a ton of the stuff. The thing is, the automation hardware we're using easily goes for about $30-40k per room, depending on what you're doing. Our media and presentation rooms are both in about that range. They work fantastically and in six years have never needed a huge amount of service aside from an occasional tire kick. The problem is that Joe Schmo wants that same experience for his house, and it's just not going to happen at the price point that a lot of IoT products are hitting.

The sad thing is I would totally be up for IoT products that cost more, because then I would at least have some assurance that whoever built it built it to last, instead of with an accountant standing over their shoulder.


I am well aware of how much automation exists. The amount of work we are doing is going up - 40 years ago it was mostly men in the workforce; now both men and women work.


General thoughts, not in the context of IoT:

Interconnectivity is a double-edged sword: it sometimes precludes innovation. To interconnect, you need an agreed-upon spec, which constrains what you can do. If you can think of a better way to do something, you may not be able to implement it.

As an example, IMAP lets you use any client app with any server. But IMAP only lets a message belong to a single folder. Gmail, on the other hand, lets you apply multiple labels to an email. That doesn't map well to IMAP. Gmail also lets you star a particular mail in a thread while applying a label to the entire thread - these don't map well to IMAP either. Neither does priority inbox, for example. And so on. Which is why you get a second-rate Gmail experience if you use IMAP.
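
(You can see the mismatch at the protocol level: standard IMAP only has mailboxes, so Gmail exposes labels through a nonstandard FETCH item, X-GM-LABELS, which generic clients don't know to ask for. A quick sketch with Python's imaplib - the extension is real, the credentials are placeholders:)

    import imaplib

    M = imaplib.IMAP4_SSL("imap.gmail.com")
    M.login("user@gmail.com", "app-password")  # placeholder credentials
    M.select('"[Gmail]/All Mail"', readonly=True)

    # Standard IMAP: a message lives in exactly one selected mailbox.
    # Gmail's extension: ask for all the labels attached to message 1.
    typ, data = M.fetch("1", "(X-GM-LABELS)")
    print(typ, data)  # e.g. OK [b'1 (X-GM-LABELS ("\\Inbox" work receipts))']
    M.logout()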

Standards and protocols sometimes preclude innovation.

I don't want to live in a world where everything is interoperable, because that's a world where everyone is forced to conform to a straitjacket. That doesn't mean, of course, that interoperability is completely useless. It's a matter of balance. I don't want too much interoperability or too little.

The point I'm making is that interoperability has a cost. It's not all good.


There is no economy without consumption, and there is no consumption without waste. Incompatibility makes for great waste.

Edit: The technology industry needs things not to work for it to be profitable. Imagine if things just worked: Dropbox, Box, Google Drive and a hundred other solutions would never be paid for and would not be needed. Waste creates jobs.

Android had Beam, which just worked so well for transferring practically anything. Recently it fails to transfer photos about 50% of the time on the latest and greatest Android devices.

BTW, Apple is probably just as bad as everyone else; the only reason things seem to work in the Apple universe is that the life expectancy of an Apple device is two years max, so they can just focus forward.

The best way to transfer files to another computer, if we are on the same wifi, is still `python -m SimpleHTTPServer`.
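
(On Python 3 the same trick is `python3 -m http.server` - it serves the current directory on port 8000 by default, and the other machine just browses to your LAN IP.)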

Old things always seem to work better than new ones. If we deprecate the 3.5mm jack, we are doomed.


Edit: For the downvoters - I am not saying I am happy about this, but denying it doesn't help anyone.


What's the incentive? Everyone wants a walled garden to maximize lock-in.


IMO there is a very annoying trend in household appliances towards worse and worse UI.

Basically, it started with light switches decades ago, at least in Europe. In the past, you had switches that themselves indicated what state they were in, so if you clustered them on a board it was incredibly easy to find the one currently switched on [1].

Then, some moron came up with switches that don't show anything anymore [2].

Nowadays you're lucky if you get switches at all. This [3] is what a standard stove looks like in new Swiss apartments. Good luck explaining this to your grandma. You idiots, it has one job: getting more or less hot!

Recently I took a residential elevator that just had an empty touch field when you came in. No indication of what you could do whatsoever. This immediately gave me anxiety, and I'm just 31, goddamnit. Anyway, what happened was that once the elevator door closed, it gave me a selection of floors to go to. [4] facepalm

Please, for the love of what's holy, stop improving what doesn't need improvement! In German we have a word for this: "Verschlimmbessern" (a combination of verschlimmern = 'making it worse' and bessern = 'improving').

[1] https://adventurelightingblog.files.wordpress.com/2010/02/li...

[2] http://www.schulungshandbuch.de/WebRoot/Store22/Shops/627847...

[3] http://media3.siemens-home.com/Product_Shots/915x515/MCSA006...

[4] https://www.google.co.jp/search?tbm=isch&q=schindler+touch&t...


> IMO there is a very annoying trend in household appliances towards worse and worse UI.

I feel like you could drop "in household appliances" from that statement and still be telling the truth. I'd be willing to bet that anyone reading this could think of examples in software where those making product decisions seem to be operating off of these principles:

* Refine relentlessly (vs stop improving what doesn't need improvement)

* Be heavily state-dependent as a way to minimize presentation of options. But don't call attention to state.

* Minimize any affordances. Rely on implicit interaction patterns you assume the user has already learned.

* Minimize everything. Remove options. Hide what you can't remove. The less the software does, the less the user has to think, right?


> Be heavily state-dependent as a way to minimize presentation of options. But don't call attention to state.

This is one of my least favorite design patterns, and I keep seeing more of it. It feels like half the products I use have bizarre state rules with no documentation. Key features disappear as I scroll, or are only available from certain (unrelated) screens, or are under one of six distinct "options" menus in different locations.

Since I spend a large part of my life dealing with this, I memorize the tricks and find it merely annoying. But I still regularly discover features in products I don't use often, buried behind some utterly incoherent state dependency.


My oven has a top element, a bottom one and a fan. Somehow it has seven settings and a temperature knob. It's awful. Thanks Smeg.


>Somehow it has seven settings

With three things that can each be switched on or off, there are 7 usable combinations (plus all off).

> It's awful.

The alternative being three separate toggle switches.

While this would be easier to scan and decide on if you wanted granular control every time you cooked something, 90% of the time the oven will be set to the same setting: ALL ON. I think that having a single control, rather than three, is a more efficient UI for this.
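
(The counting, for anyone who wants to check it: three independent on/off elements give 2^3 = 8 combinations, 7 once you exclude "everything off":)

    from itertools import product

    elements = ("top", "bottom", "fan")
    # Every on/off combination except "everything off".
    settings = [c for c in product((0, 1), repeat=len(elements)) if any(c)]
    print(len(settings))  # 7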


Unless you know how to cook. Then the analog controls are a godsend, because they let you control the rate at which heat is applied to your food to a great degree. The setting you might use to boil water is not the same one you'd use to cook an egg. Ditto for sautéing versus cooking a steak.

The problem is that the people designing UIs for products don't actually know how to use the products. They don't have any concept of how a user uses the UI. So they just assume that less is better without stopping to consider why someone would want to control the temperature of their stove.


I'd be very happy with "all on" and a temperature knob. I have no idea what half the settings are. "Eco"? Really? I should probably read the manual.


My oven has a goddamn menu and requires several presses on a touch screen just to start regular heating. The manual is almost 50 pages.


My oven has a clock that resets to 00:00 whenever there is even the briefest power fluctuation and the oven will not function unless the clock is adjusted with tiny buttons. The stove will work, though.


i have a new blue star range. the oven and burners each have a single knob, one switch for the convection fan, one switch for the light. there are no electronics visible on the unit. no clock, timer, buttons, nothing. the entire thing is made out of iron and steel.

you have to go out of your way to buy this kind of stuff, and it's more expensive, but it's still out there. the low end of the appliance market competes on features and price ("race to the bottom") while the mid and high end compete on component and build quality and reliability.


There is a nice book about things like this, though it's a bit dated: The Design of Everyday Things [1].

Devices are produced not to be useful, but to be sold. People want more value for their money, so they pick the thing with more buttons (seeing a button per function) or more flashing lights. Usually the cheapest devices in an established lineup are simple, but may not be energy efficient. The flashiest are probably mid-tier, and the most expensive sometimes look like the cheapest option but are more energy efficient.

I am thinking of getting a big-screen display. What I would prefer is a computer monitor around 40-50 inches connected to my computer and a Chromecast. For audio I would like to use a separate appliance. The cheapest, most effective solution is probably to buy a TV (and it seems all of them are marketed as Smart to some extent), even if I don't intend to use most of it. The other option I'm considering is a 24-30" monitor on a wheeled stand, so I can easily pull it closer to the couch.

[1] https://en.wikipedia.org/wiki/The_Design_of_Everyday_Things


You can blame Apple... they convinced everyone that design was important, so now every shitty company in the world is ineffectually trying to push the boundaries to create something "iconic."

(I only semi-jest... I've seen this process IRL.)


I know where you're coming from, but Apple is IMO much better at modernising UI. They usually don't just change for the sake of change; there tends to be an advantage over the old way. And the new way is often even more tactile than the old. Only discoverability of advanced features is a problem, but that's IMO less of an issue with a device you use daily.


> They usually don't just change for the sake of change; there tends to be an advantage over the old way

Did you carefully couch your statement because of iOS 7 specifically? Gossamer typefaces and weights and "abstract art" choices for icons and palette really felt like "change for the sake of change" to me - an overcompensating reaction to the "skeuomorphism" backlash.


Yep, that's exactly it. Let's say I fall somewhere in between - I found the old Time Machine way overdone, and I also agree that the old floppy-disk symbol needed to go at some point. But giving Jony free rein was a mistake as well. Apple really is missing a curator, an uncomfortable UX questioner with decision power. But what comes from them today is still way better than the competition - with some (very big) exceptions like iTunes / Apple Music.


Fortunately they are back-pedalling quite a bit in iOS 10: bolder everything, buttons that actually have outlines, and such.


I believe you are agreeing with me... Apple does a damn good job of design - applied correctly. Their stodgy contemporaries' poor imitations are the problem. (E.g. Honeywell vs. Nest.)


Honeywell may be stodgy, but I'll take my ugly Honeywell programmable thermostat over the slick Nest that can get bricked remotely.


I think by "poor imitation" he's referring to this. Would you trust this over a smart device created by a software-first company? http://yourhome.honeywell.com/en/products/thermostat/lyric-t...



Yeah... because Honeywell produced a product that everyone was talking about with their friends and family? Nope. They've been sitting in a dominant market position for decades without any real innovation. An innovator came along and produced a great product; Honeywell went into full patent-troll mode.


I don't think we can safely say that about Apple anymore. Here's a nice analysis of UI decisions Apple has made in recent years that are right in line with what the above states:

http://www.nicholaswindsorhoward.com/blog-directory/2016/7/2...


> [3] http://media3.siemens-home.com/Product_Shots/915x515/MCSA006...

I will say this much: that seems totally intuitive to me. You tap the button for the burner you want, then tap the +/- buttons to set your temperature, right? I also like having each burner's setting read out together in one place.

Everything else about it is terrible. Oven controls need to be tactile, they need to be immediate, and they need to not be three inches from a hot pan. That'll cause enough burns just from fumble-fingering in normal use--now imagine that your pan has caught fire and you're trying to select the left front burner and press - 8 times while your hand is showered with flaming grease. Who could possibly have thought this was a good idea?


This is a long-solved problem. Stoves have dials to turn the burners on and select the temperature. Done, solved. This is one of those things that a simple, mechanical control is best.


You need to visit a kitchen showroom. The knobs have vanished, and you have stupid bits of glass that don't do anything and then jump up 3 increments when they finally recognise input.


Well, that's not exactly true. You just need to either buy a very cheap hob (so cheap that touch controls are above its price range) or go for the high-high end, where manufacturers understand that knobs are a better experience. All Rangemaster hobs have knobs:

http://www.rangemaster.co.uk/products/range-cookers/nexus/ne...


Touch buttons are so much better on stoves than rotating knobs. It's not just a design gimmick. I have used stoves with both, and with touch buttons, cleaning the stove after cooking is a matter of one wipe. Knobs, on the other hand, are really hard to clean; you might need to remove them, clean them, and put them back.


From your post I assume that the stoves you tried had the rotating knobs on top of the stove (where the touch controls in the above picture are?)

That in fact sounds like a really bad idea, but in the European countries where I have lived, the typical "good ol' knob" stove had the knobs placed vertically right above the oven, not on the top of the stove. Like this:

http://estaticos1.milanuncios.com/fg/1979/73/hornos-en-playa...

http://www.kalea.es/imagenes/fotosAnuncios/15127_12091009085...

So the problem you mention doesn't exist, because the knobs aren't something you have to clean after cooking; you're OK cleaning them once a week.

I personally prefer the knobs because stoves with touch buttons tend to beep and turn themselves off when water or oil gets on the controls, which is a quite common circumstance when they're 2 or 3 cm away from a boiling pot. I don't know who had the great idea of designing controls that don't work when wet and putting them in a place that will very often get wet - OK, if you are careful you can avoid it, but I shouldn't need to be careful with that when cooking. Sometimes (e.g. for cooking a big crab) it's very convenient to fill a pot almost to the brim with water for boiling and just dry the water that comes out afterwards, with the touch buttons you just can't do that.

The real problem with these new UIs is lack of choice. I can live with the touch buttons, but my grandma can't; she's 86 and she just doesn't learn new interfaces at that age. Her stove recently broke and we had a really hard time finding another vitroceramic one with knobs, because apparently they don't make them anymore. We finally found an old model somewhere and probably paid quite a premium for it.


I'm pretty sure the only reason those touch buttons are being put on appliances is that they are much cheaper to manufacture than knobs or physical switches, while looking modern at the same time.


Until you spill water on the stove and the buttons go crazy / stop working until completely dry (which takes some effort)


I do not know if this applies to all stoves with touch buttons, but the one I have turns itself off if anything hot is put on the buttons. This happens if you overcook something and boiling water/food pours all over the stove, or if you happen to put a frying pan on the buttons.

But yes, overall I do agree with you. My one major annoyance is that changing temperatures is slower than with a rotating knob.


A rotating knob is a lot more intuitive.


You'd think it's hard to screw up a rotating knob, but they've found ways to do that as well! When I went shopping for a stove years ago I found out that some have knobs that only go one way, so to switch a burner to max, you have to twist it all the way around. Extremely annoying.

I ended up going to a showroom and twisting all the knobs of all the stoves to find out which ones didn't have that limitation. I must have looked like a crazy person, but I got one I liked in the end.

The stove I ended up buying had one variable-size burner, and that knob had a bounce-back switch at the end, so you have to turn that knob the wrong way around to max it, but I could live with that. Why a regular burner would have the limit beats me. The only thing I could think of would be some sort of extremely crude and ineffective child-protection, but all stoves have a child-lock anyway, so what's the use?


The stove where I live right now has knobs whose full range of motion is a quarter-turn, and the flame goes out if you turn it more than halfway down, so effectively you only have one-eighth of the knob to use. If you try to set it to low heat, you'll usually go too far and put the flame out unless you pick up the pan and watch the burner while turning the knob a millimeter at a time.

It's absolutely amazing what people can manage to screw up.


One integration test should be to take the engineer in charge, bind one arm behind his back, and tell him to make spaghetti aglio, olio e peperoncino for Gordon Ramsay, who is waiting with a santoku knife ready for some action.

Edit: To clarify - aglio e olio is what they cook in prison in Goodfellas. It's both the simplest pasta dish and the easiest to screw up - overcook the garlic a bit and you get garlic chips instead of melting its flavour into the sauce.


I used an 80s microwave for the first time in years a few months back. I'd forgotten how nice they are.

Power knob (the kind that clicks to exact positions) and timer knob. That's the entire interface. Set the power (or don't, if it's already where you want it), turn the timer to where you want it. Done. Pull the door to open - not even a button for that. 100x better than the interface on modern microwaves.


Move the pan? Turning off the electric burner isn't going to extinguish the fire.


At one of my old workplaces, the two elevator call buttons were replaced by a touch screen with... two elevator call buttons.

Nothing improved; it was the same interface with the same two buttons, but now on a touchscreen that was more expensive and could crash.


Reminds me of how many of the first VoIP applications mimicked physical phones (some didn't even accept keyboard input for punching in the number; you had to use the mouse). Applications skinned like photographs of physical items => horrible UX.


And not barrier-free I would imagine.


Blind people can gently touch the wall, find the buttons, and figure out which one is up, because it's over the down button.

Try that with a touch screen.


Right, but if they add or remove floors, they don't need to change the touch screen. Eat that, normal buttons! /s


Heck, a lot of physical buttons also have braille on them


Whenever I see monitors and touchscreens, I keep an eye out for crashes and errors.

It's always interesting to see what should be a simple, dedicated-hardware device taken down by a Windows 10 update or showing some revealing error message (why does the mall's you-are-here screen use Internet Explorer?!).


My grandmother has a switch like this one [1]. The switch is about 60 years old now. The slider under the switch dims the light: far right provides maximum power to the bulbs, far left minimum power, with the bulb still lit. The button above it just turns the light on and off. It's beautiful, it works after decades, it shows its state from a distance, and it's practical. I wish it were still sold. Now I need an application on a mobile device and WiFi to achieve the same...

http://img04.olx.pl/images_tablicapl/358414699_2_261x203_sci...

http://img12.staticclassifieds.com/images_tablicapl/39291169...


You can still buy dimmer switches. Most have a rotating knob, rather than a linear slider, but here is one with a slider (which also looks much nicer than your grandmother's):

http://www.lutron.com/en-US/Products/Pages/StandAloneControl...


> My grandmother has a switch like this one [1]. The switch is about 60 years old now.

Oh boy, how fugly it is. I had the same xD

Still, it did an excellent job and was almost indestructible.


Give that thing a bit of a redesign, put up nice LED lights with a warm color, and you have the perfect 'smart' light. In Japan there is a company selling good but cheap LED fixtures with very simple and well-built remote controls, colour-temperature settings and three dimming modes. It works perfectly with existing switches, you can reset its state by switching it off and on twice in succession, and the remote's buttons glow in the dark. That's smart. Keep your wifi and app interfaces away.


Links?


here you go: http://ohyamalights.com/classifications/olcld-decorative-cei...

for some reason I can only find the remotes in Japanese sources. I suppose they're afraid of English customer support calls...

http://image.rakuten.co.jp/e-akari/cabinet/led/1892240-e.jpg


Okay, so, I program, do sysadmin work, and all sorts of highly technical stuff for a living.

If it becomes a thing where only that oven can be bought, and all ovens have those insane controls?

I'll quit eating. I swear to God, I will give up eating altogether.


I just moved into a new flat; the temperature markings around the knobs that control the hob had been worn away through overenthusiastic cleaning. You know what I did? I drew some new numbers on with a permanent pen. That's the kind of maintenance interaction I want with my appliances, not wiring up JTAG or sniffing packets out of the air.

We need a movement similar to RMS' for this kind of thing.


Like this thread, Domino's also delivers.

Don't give up eating, just give up cooking.


My association with "stop eating" was Soylent rather than Domino's.


My first thought also - Soylent is a nice 'dumb' product for when everything else is too smart to function.


All I want is a gas stove with built-in electric ignition like we have in Japan. It's simple, immediately hot, easy to repair and IMO elegant.


Try induction. As a side skill you can learn to arc weld with a potato instead of a welding rod.


What I showed in the pic is probably induction; most new stoves in Switzerland are. It's OK for directness, but nothing beats gas. I don't even need the damn knob to tell me how strong it is - I can see the flame! There's beauty and fun in cooking with something as simple as fire.


Speak with any professional chef and they will tell you that gas is what you want. The direct control is impossible to beat, induction gets close but it's still not ideal - and regular electric hobs are just miserable to use.


If you don't mind tiny apartments and the smell of hot garbage, move to New York City. Many apartments still use gas stoves.


Don't all gas stoves have electric ignition now?


And like that, "Raw Vegan" doesn't sound so extreme any more.


If that happens, just embrace the dystopia and switch to an un-food like Soylent.


People are tasty.


Breatharianism FTW


+1 for the elevator anxiety. In my case there were no buttons inside the lift!! Really, nothing. You had to select your floor BEFORE entering the lift.


That's called a destination control system, and it works that way because of the massive efficiency gains of letting the elevators work out who should go where.


We had those in the Centre Point tower, where I used to work. They were awful. As you swiped through the security barrier you'd be assigned a lift. However, if you missed the number or the barriers broke (which they often did) there was a good chance of getting in the wrong one.

There was a public restaurant on the top floor, so the lifts frequently contained panicky people riding up and down until they eventually got to the ground floor.

Also the lift systems occasionally crashed mid-journey. The lift would stop, the lights would go out and the LCD screen would blank and run through a boot sequence before everything came back up.

Stephen Fry once got stuck in one of them.


Sounds like you had a really beta version? The ones I've used (n=2) worked fine, and the lift assignments were done just outside the elevator, where you'd usually have the up/down buttons. As the elevators arrived, little screens on the outside showed the floors they would be stopping at.

These were private office buildings though, I can definitely see the public restaurant causing havoc with its endless stream of noobs...


I think they're just really old - I have a vague memory of seeing an engineer armed with a 5 1/4" floppy disk. They definitely didn't do anything as fancy as showing which floors they would be stopping at.

The building's now been turned into posh flats - there's no way I'd consider living there (apart from being almost infinitely out of my price range) without being sure they'd been replaced.


This reads like a social experiment.


What's with this? You have the floor you're on and a floor you want to go to. If someone else presses a button before the lift passes, pick them up too. Awfully conventional, but everyone understands it.


Until you're in one of those office buildings where everybody has lunch at exactly 12, meaning first you wait an eon to find an elevator where you can squeeze in, and then you ding-ding-ding-ding your way slowly down to the ground floor, stopping at every floor to show people that this one's also full!


Couldn't you just use a weight sensor to determine whether an elevator cab is over x% of its rated capacity, and make that one ineligible to respond to external calls until after some people exit?

That wouldn't require changes to the user interface, and the doors would only open if someone can actually get on.

That, plus an assumption that most trips are between the ground/garage floors and one of the higher, occupied floors should be sufficient to reduce congestion without requiring someone to push a floor button before entering the cab. Adding a clock to the elevator controller would also help, such as by stationing empty cabs at the exit floors in the mornings, and distributing them among the occupied floors in the evenings.

I can think of a lot of ways to improve elevator scheduling without changing the user interface, and even if I did change that, I could certainly provide some mechanism to select a specific floor without installing a touchscreen GUI. Wave your employee access RFID badge at the button panel to pick the floor where your cubicle is, for instance.
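
To make that concrete, here's a rough sketch of the load-aware dispatch idea in Python - the Car class, threshold, and numbers are all invented for illustration, not any real controller's logic:

    from dataclasses import dataclass

    FULL_THRESHOLD = 0.8  # ignore hall calls when over 80% of rated load

    @dataclass
    class Car:
        floor: int
        measured_load: float  # kg, from the existing weight sensor
        rated_load: float     # kg

    def eligible(car):
        return car.measured_load / car.rated_load < FULL_THRESHOLD

    def assign_hall_call(cars, call_floor):
        candidates = [c for c in cars if eligible(c)]
        if not candidates:
            return None  # every car is full; the call just waits
        # nearest eligible car answers the call
        return min(candidates, key=lambda c: abs(c.floor - call_floor))

    # a nearly-full car on floor 3 is skipped in favour of floor 7
    cars = [Car(3, 900, 1000), Car(7, 100, 1000)]
    print(assign_hall_call(cars, 4).floor)  # -> 7

The clock-based stationing of idle cars would just be another rule layered on top of this, with no change to the buttons at all.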


It's havoc when there's a crowd, as on the floor housing the common cafeteria in an office building. In a group of people waiting, someone inadvertently forgets to specify their floor and panics!

Also, a surprisingly large number of people select the wrong floor (visitors, new joiners...).


Efficiency gains are a big deal, too. Consider that even being able to get away with one fewer elevator saves space on every floor of the building. The taller the building, the more valuable it is.


Link to video describing said system: https://youtu.be/WTIXVS0620Y


Five years ago we moved into a brand-new office with this type of elevator. It had bugs but improved over time.

A traditional elevator only knows where you are. This type also knows where you want to go and can optimise for passenger throughput.


Can't they just make it so that there are still buttons inside, maybe given lower priority, with some help text explaining that it's best to choose outside?


I like physical knobs and buttons on cars too. Not only are they easier and safer to use while driving thanks to tactile feedback, but they also work when the car is off - so you can, for example, turn down the volume before starting it.


This is, for example, still very much the case with BMW and Audi. And I'm glad they keep buttons and knobs around.

The funny thing is that less prestigious brands like Citroen go all "futuristic", trying to be fancy, and in doing so remove all tactile interfaces. It's one of the reasons I would never buy a car like that.


Listen to the latest ATP. Marco explains how his Tesla's turn signal indicators don't click anymore when the central console has a problem - they stay silent until he reboots the thing with some button combo.

:-(


That is the worst part of Teslas to me - why the hell is EVERYTHING on this huge display?? Changing temperature or music volume without tactile feedback is the worst thing.


I went to Home Depot's website and sorted range+ovens by price. The 50 cheapest or so (I stopped after 2 pages) all had a knob for each burner.

One had a knob for the oven as well, but that was about the only notable variation.

So the 'functional' marketplace hasn't abandoned sense for nice looking stuff with complicated hidden controls.

Siemens still uses knobs on their gas cooktops:

http://www.siemens-home.com/ae/productlist/cooking-and-bakin...

Presumably the mechanical valve works better than an electric one.


If you haven't read it yet, check out "The Design of Everyday Things" [1], which covers just this kind of annoyance, and is a great 101 on any kind of interface design (be it physical or on a computer).

[1] https://en.wikipedia.org/wiki/The_Design_of_Everyday_Things


I'm German and never saw a light switch like in [1] in my whole life.


+1

I'm German, too, and I only ever saw these switches in contemporary settings UIs instead of checkboxes.


They are still pretty common in the US, especially in the older buildings.


I'm almost certain it's an immersion heater switch.


That stove interface is particularly annoying because it turns a single knob operation into ten finger presses - on, select element, then press power-up 8 times. I managed to find a stove with direct access to all power levels for all elements, at the cost of less cooking surface.


Regarding your first two examples: most new construction where I live (NYC) has switched over to a similar switch [0], which from my understanding is for accessibility reasons, and I suspect the second example is for the same reason. They don't require any fine motor skills whatsoever; you can pretty much just press in their general direction to toggle.

[0] https://d3jpffnds3bao3.cloudfront.net/catalog/category/cache...


Yes, those are alright - but most in Switzerland don't tilt (to indicate state). It's like no one even cares about stuff like this nowadays. I used to think of German/Swiss design as some sort of pinnacle of function and form, but not anymore.


There's a reason Germany/Austria/Switzerland don't usually have state-indicating toggle switches: toggle switches make no sense in multiway switching [0] systems, where multiple switches control the same light source.

I'm not sure how common multiway switching is in the US, but it's certainly super common in Europe, and I'd guess that's why we use non-state-indicating switches even in circuits with a single switch - so many circuits are controlled from multiple switches that there's no point in having them indicate state.

[0] https://en.wikipedia.org/wiki/Multiway_switching


Is there really not a good and robust solution to this? Like, say, a capacitor that charges for up to 20 toggles and a tiny motor that actuates all the switches that aren't being pressed manually? I'd take this over digital switches any day.


A motor inside screams "not robust", "quick to break" and "much more expensive to manufacture" to me. That's probably the reason I haven't seen anyone making such switches. Damn, how much I'd like them though.


The method I've seen on some 3-way switches is for the switch to glow when off. That's meant for finding the switch in the dark, but it happens to show state as well.


There is a solution: put a little lamp into the switch, so it lights up when it’s on.

It’s the best solution that still allows showing state, and is done commonly here.


I think a light alone is not enough. LEDs at home at night, no matter how weak, are an annoyance. I'd also put in a motion sensor that detects movement within about 1.5m of the switch and lights up the LEDs gracefully. An ambient light sensor to set the brightness would also be good. Fallback if any of the sensors break: keep the LEDs lit at the lowest setting. (Sketched below.)

IMO you can make things smart, but connectivity is only the answer once you've figured out all the other stuff. Otherwise it's like getting the internet on an 8-bit Atari.
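
A minimal sketch of that behaviour, with read_motion_distance() and read_ambient_light() as stand-ins for whatever sensors you'd actually wire up:

    import time

    LOWEST = 0.05  # fallback / night-time brightness

    def read_motion_distance():
        # distance in metres to nearest movement, or None; stubbed here
        return 0.8

    def read_ambient_light():
        # 0.0 = pitch dark .. 1.0 = bright daylight; stubbed here
        return 0.1

    def led_setpoint():
        try:
            distance = read_motion_distance()
            ambient = read_ambient_light()
        except OSError:
            return LOWEST                  # a sensor broke: stay dimly lit
        if distance is None or distance > 1.5:
            return 0.0                     # nobody nearby: LED off
        return max(LOWEST, 1.0 - ambient)  # darker room -> brighter LED

    while True:
        target = led_setpoint()
        # ramp the LED toward `target` here for a graceful fade
        time.sleep(0.2)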


We have this in the US, but we do it with toggles. Turning the circuit on with one switch and off with another leads to the curious situation of both switches being on but the circuit being dead. It's immediately obvious that this is the case, though.


Isn't that the intended behaviour? Also, on-off state indicating switches don't make much sense in such a circuit. What works is having a small (typically orange-ish) light inside the switches that indicate the state.

Anecdote: We have those in my apartment but they're wired up to be on when the circuit is off. This is implemented in what is likely the worst possible way. We haven't quite figured them out but when off, there's still juice on the line - enough to make some cheap LED bulbs flicker, but not enough to light up a "classic" light bulb or an LED with high resistance. It's great for making electricians cry about the stupidity, as I can take out one of the LEDs in the hallway (when it's off) and the others start to do a strobe effect.


The saddest part of that switch thing is that there are plenty of "stateless" switches around that still indicate light status: http://www.atticmag.com/wp-content/uploads/2014/08/tour-our-...

They work nicely for home automation purposes, since you can set them up so that the LEDs on the side match up to whatever the current light status is (dim to bright from bottom to top).


Not a big fan. I don't want LEDs giving off green light everywhere in the house at night.


When the lights are turned completely off, the LEDs on the switch turn off, too.


Ok. Granted, that's quite smart.


No it's not. We had kind-of-like-that "smart switches" in the office. To turn on the light, the appropriate button has to be pressed (among 20 - it's a kind of control panel). Once the light is turned on, a red LED turns on at the edge of the button. So it fits the proposed "turn the LED on when the light is on."

The problem: in the evening it gets dark, you want to turn on the light, you come to the dark corner with the panel where the 20 buttons are, and you have no idea which of the 20 will turn on the damn light (to add insult to injury, it's one in the middle, of course; the ones in the corners do something else entirely). And as punishment, if you hit the wrong buttons you will raise or lower the blinds, change the heating, or turn on the ceiling fans.

I asked those who installed the system: can you make the LED by the button turn on when the light is off (i.e. invert the logic)?

"Can't do, the whole module (20 buttons and the LCD display) is made in the factory and can't be tweaked."

A "smart" control panel from hell.

My other favorite comparably bad button is an actual button on the remote control for a TV-on-computer device. It was probably some generic remote customized by the company for that device. The problem: one button, easy to hit by accident, that simply blocks the whole device and leaves the software running in the background. Of course, it can't be reprogrammed.


We had something even more frustrating: no switches. Just motion detectors and a web portal. You could only control on/off and brightness through a website if you memorized the room's 16-bit hex code, and you'd only be entrusted with that if you were one of those chosen by management to be a light keeper.

If the light server crashes and the lights reset to maximum brightness while you're the only one in the office... I hope you brought your sunglasses because they're staying like that until morning.


That's not a problem with the basic stateless-switch-with-LED concept; it's a problem with somebody putting together 20 buttons and expecting that to be usable in the dark. It would be no more usable if it was 20 unlit toggle switches in the dark.


To turn on the light, you need something emitting a bit of light when it's dark, not when everything is visible. The major problem with most of these designs is too-bright LEDs. They could actually be made to shine so little that they're only recognizable in the dark.


That's just bad layout. They should put the switch to control the nearest light on the far left or right. That way you can always turn on the light by the door and control panel in the dark.


It's still sort of a solution to an invented problem, but you could have them wire in a small motion light. Or one of those battery powered stick on LEDs.


I think my solution would be a piece of tape at the top of the control panel, indicating "light switch is directly below". Or possibly replacing the damn thing with normal controls, which I have yet to find inadequate.


> replacing the damn thing with normal controls

The "damn thing" is a "smart" as it controls the light over the network cable -- there aren't even "normal" cables there with which the "normal" light switch would be possible to be installed.


Funnily enough, your second example of the switch that has no state would be perfect for my new LIFX lightbulbs (yes, I am aware of the irony of posting this in this thread).

The state of the lightbulb is decoupled from the state of the switch - the switch itself is always on [0].

[0]: Yeah, this isn't great because idle power is >0W. This is where a centralised lighting controller with mechanical switches would help, but that's more invasive than just putting in some new bulbs.


fwiw, stateless light switches are nice in that you can have many light switches controlling the same light. Not saying they're perfect but they have advantages.


If you're willing to lose state feedback anyways you can also do this with toggle switches.


That stove baffles me, if for no other reason than one slip of your finger and you can easily touch a hot burner or pot.


> In less than two years, the CPU inside the TV quickly becomes obsolete and can’t be upgraded while the display itself easily lasts a decade, and software updates are persphinctery, if they happen at all.

That's odd — the word 'persphinctery' seems to appear only in Monday Note articles. Is this some sort of trap street[1] for medium-form articles?

[1] https://en.wikipedia.org/wiki/Trap_street


That looks a bit like search'n'replace run amok, although I fail to see what the first half of s/?/sphincter/g could have been.

The end result is certainly unlovely though: rather colonoscopacetic, if you will.


I searched the same thing and decided it's probably a pun on "perfunctory" + sphincter-clench-inducing.


Is there a rule that everything exists in hashtag form? #persphinctery showed up just over a year ago:

https://twitter.com/search?q=%23persphinctery


The writer probably wished to say "perfunctory" but experienced a brain fog moment.


Any smartphone from the past few years plus an internet connection lets people file their taxes, watch videos, communicate by text/audio/video with pretty much anyone in the world, read online encyclopedias, order pretty much any physical good to their door, listen to music, translate languages, read books, take high-resolution videos, and so much more.

How do you even beat that? And what do you offer beyond that?

It feels like we're reaching the flat part of the logarithmic progress curve (a much more appropriate curve for progress than the hockey stick curve) when it comes to what personal computing is going to bring to the daily lives of consumers.

Of course, there are still many areas not explored by computing. Computer aided medical procedures and diagnoses, monitoring and upkeep of crops, and so many more fields will grow in the years and decades to come. And improvements to personal transportation, through i.e. self driving cars, is arguably a consumer technology.

But the whole IoT movement is just a parody of itself. The vast majority of people do not want internet-connected water cups or juicers or microwave ovens; and those who do soon get frustrated by the real-world logistics that come with these things (higher costs, more frequent failures, lack of interoperability, etc.). I had Philips Hue bulbs for a while, and the girlfriend I lived with at the time hated them with a passion - for understandable reasons. When it comes to turning lights on and off, you can't beat a light switch, and the same logic applies to every single item we interact with daily. For instance, the Nest we had in our apartment would randomly turn on and off, or suddenly stop being visible to the app, etc.


I'm sure that the idea of the Nest thermostat is noble, but they just work like shit. The one in our office cannot maintain a reliable temperature, and flakes out constantly. We had a fun week where someone went on vacation, and, apparently because their phone app was synced with it, the AC schedule started running in European time, rather than US Eastern...

At some point, you really just want to rip the thing out, slap the old fashioned, bimetal thermostat back in, and hit the Nest with a high-powered electromagnet.


> For instance, the Nest we had in our apartment would randomly start on and off, or suddenly stop being visible to the app, etc.

My own smart thermostat — from another company — has mostly been very good, but it randomly wants me to re-enter my (high-entropy, unmemorisable) password in their site to use it. Why can't I just connect to my thermostat and set up the authentication I want? Why can't I use a client certificate, or an SSH key, or just have a $&% token which lasts approximately forever?

It's *my* device; I should be able to do whatever I want with it. Give it (not the vendor's site) a clean API, and I can do anything.
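
For what it's worth, the "token that lasts approximately forever" option is nearly trivial to build; a sketch of what a clean local API could look like, with the port, header name, and endpoint all made up:

    import hmac, secrets
    from http.server import BaseHTTPRequestHandler, HTTPServer

    DEVICE_TOKEN = secrets.token_hex(32)  # shown once, at pairing time
    print("pair your client with token:", DEVICE_TOKEN)

    class ThermostatAPI(BaseHTTPRequestHandler):
        def do_POST(self):
            supplied = self.headers.get("X-Device-Token", "")
            if not hmac.compare_digest(supplied, DEVICE_TOKEN):
                self.send_error(401)
                return
            # e.g. POST /setpoint with the target temperature in the body
            self.send_response(204)
            self.end_headers()

    HTTPServer(("0.0.0.0", 8080), ThermostatAPI).serve_forever()

No vendor account, no password resets - the token lives on the device and on whatever clients you paired.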


But that doesn't sound like a workable monetization strategy. /s (or not)


There are certainly areas where remote connectivity is useful. Cameras are one case (although I would be skeptical of the security of most of them).

HVAC is certainly enhanced by the ability to control it remotely: first, because of the effort saved by being able to control it from anywhere in the house without seeking out the control panel or remote, and second, because you can adjust the temperature from outside the house. For those of us who don't want the house at 18 C all day long while we're out, you can set it to your desired temperature when you head home from work, which may not be at a set time.

IOT toasters though? That's just taking the piss.


I could see cases where connected toasters could be useful. It'd be a marginal amount of added value, but it would be there.

The problem is that in the current context, almost any IOT device has an additional negative value attached to whatever positive value its features give, because basically every company doing it makes overcomplicated crap.

In a perfect world, you might be able to plug in an IOT toaster and have it automatically connect to your Amazon/Google/Apple/whatever hub (with a simple "I found <yourname>'s hub, is this correct?" interface). At that point the hub would track the toaster status and the household context, and then do stuff like say, via your phone or a set of discreet speakers throughout the house, "The toast was burning, so I stopped the heat" or "You left your toaster on when nobody was home, so I turned it off".

None of this would be impossible to implement today, but it would require thinking about long-term holistic benefits instead of being able to slap "IoT device" on something to try and sell more to nerds.
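
Hub-side, those two announcements are just a couple of rules; a sketch with invented Toaster and Hub classes standing in for whatever a real hub would expose:

    from dataclasses import dataclass

    @dataclass
    class Toaster:
        heating: bool = False
        smoke_detected: bool = False

        def stop_heat(self):
            self.heating = False

    class Hub:
        def __init__(self):
            self.people_home = 0  # fed by presence detection

        def anyone_home(self):
            return self.people_home > 0

        def announce(self, msg):
            print("hub:", msg)  # speakers / phone notification

        def on_toaster_status(self, t):
            if t.smoke_detected:
                t.stop_heat()
                self.announce("The toast was burning, so I stopped the heat.")
            elif t.heating and not self.anyone_home():
                t.stop_heat()
                self.announce("You left your toaster on when nobody was "
                              "home, so I turned it off.")

    hub, toaster = Hub(), Toaster(heating=True)
    hub.on_toaster_status(toaster)  # -> "You left your toaster on..."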


But how do you leave a toaster on when nobody's home? You put bread in, push the lever, and it pops up 3 minutes later.

And it's possible for a toaster to detect burning without requiring an internet connection.


> But how do you leave a toaster on when nobody's home?

Start toast, go and grab the mail, get distracted by a neighbor, forget about the toast, go over to look at their new riding lawnmower.

> And it's possible for a toaster to detect burning without requiring an internet connection.

Nothing about the scenarios I described would require an internet connection for anything but communicating outside the house (e.g. phone notifications).


I think the point of the comment you replied to was that almost every toaster automatically shuts off after an adjustable time. Unless you have drastically misadjusted your toaster settings the worst case for a toaster is burnt toast (or perhaps some smoke if you really didn't get it right).


I see your point, but haven't we already had timed thermostats for decades? That would arguably accomplish the same thing with a lower part count and less security or manufacturer support issues.


Timed thermostats have the issue that you have to know what time you're going to be there.

Some days I'm home at 1800, other days I won't get home until 2200, and I often won't know until sometime during the day.

Now, I don't use heating, so it's not an issue for me, but I could foresee it being an issue.

The ability to use your phone as a universal remote is alluring as well. Being able to control the lights, TV, and heating all with one device is pretty cool.


Remote-control HVAC for vacation homes is a killer feature. Being able to check the temp in your ski house to see if the furnace is out, lower it if you forgot before leaving, or bump it up when the weather is particularly cold to protect your pipes is definitely worth money. Otherwise, you're asking a friend to drive there or paying your handyman to go look.


> flat part of the logarithmic progress curve

The curve you are looking for is called the logistic. A lot of apparently exponential curves are actually logistic.
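
In case it helps anyone searching: the standard logistic has the form

    f(t) = L / (1 + e^(-k(t - t0)))

(my notation, not the article's). It's nearly indistinguishable from an exponential while t is well below the midpoint t0, then flattens out as it approaches the ceiling L - which is exactly the "hockey stick that eventually saturates" story.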


A lot has to do with the fact that most of these devices are novelties. Save for smart thermostats (which can save money on the heating bill) or simple efficiency-boosting products, life is generally easier without the added complexity of an interconnected everything. A fridge is a fridge, and it's proven to work well for over 100 years. A toaster makes toast. The dishwasher washes dishes.

Once something more useful comes around, companies will invest more resources - but I mean, who wants a smart toaster? You pop some toast in there and let it do its thing. Simple.

We need some major breakthroughs in, for example, 3d printed gourmet meals or something.


> In less than two years, the CPU inside the TV quickly becomes obsolete and can’t be upgraded while the display itself easily lasts a decade, and software updates are persphinctery, if they happen at all.

I refuse to buy a smart TV. When my current TV finally dies I will probably be forced to buy one, and the first thing I will do is either disable the smart part or just refuse to set it up and use it.

I will then plug a Chromecast in. That cost me $35. That actually works with the services I pay for correctly.

Even those Android TV-based smart TVs are useless, because Google does not control them and cannot force the OEM to push Android updates.... although, Android TV does let you Chromecast to it, which is probably what I would use it for.

Although, if I wanted Android TV, I'd buy an Nvidia Shield TV and use that instead, since it actually has a reasonable amount of horsepower and supports H.265 Main10 and the Rec. 2020 colorspace in hardware (Android TV support for it is upcoming). That's true 4K support, not merely 2160p using 8-bit Rec. 709/sRGB, which is what a lot of so-called 4K devices and TVs offer. For a historical perspective, see all the TVs that could only do 720p and 1080i but not 1080p, and thus weren't actually HDTV/Blu-ray compatible at all - this is the same thing all over again.

Technology advances too quickly for a TV to ever be smart. I'd pay more for a dumb TV that has two more HDMI ports instead, ripping out any smart TV SoC, and ripping out the (extremely useless) cable tuner.

Side note: the cable tuner is useless on cable, because all the cable companies are moving to encrypting all channels or to IPTV platforms entirely, thus always requiring a box (CableCard is a dead standard and was a mistake, much as smart TV is a mistake and should be just as dead). The cable tuner is always useless on satellite. And if you're trying to do OTA, many TVs, even ones produced today, do not have sufficient sensitivity to tune into channels for many reasons (distance, obscured line of sight, reflections); even with large enough antennas, you'll get better performance (and sometimes the ONLY performance) out of a dedicated OTA box.

As for which OTA box: if you need signal performance for extreme OTA situations, Channel Master's tuners kind of suck as set-top boxes, but are often the only ones that can coherently decode a signal.


> I refuse to buy a smart TV, and when my TV finally dies, I will probably be forced to buy one, and the first thing I will do is either disable, or just refuse to setup and use, the smart part.

Heh. That's what happened to me. Got a "smart" Vizio TV 4 years ago. It had a Skype + camera option. One night I saw the camera light come on when I wasn't using it. Quickly yanked that out. Then at some point the Amazon "apps" stopped working on it. So that's when I disabled its networking, plugged in a Fire Stick, and started using it as a dumb large screen for Netflix.

Actually that is an interesting niche to play in -- sell a high-quality TV display but with lots of USB and power ports in the back so people can plug in their favorite streaming devices. It would be lighter and thinner as well. More power efficient.

Can even make a play on "this is secure, unlike other such devices". And of course "you save by not paying for extra crap you don't want to use". Advertise that to a few subreddits that do XBMC/Kodi development, and to cord cutters. Maybe get Costco on board as well.


I just bought a Philips 40" 4K computer monitor and a Chromecast. Mostly because there is a license cost to own a television in my country. This setup saves me $200-300 a year.


The BDM4065UC? I have the same one and I absolutely love it.


Can you expand on how buying a TV saves you $200 a year?


I bought a monitor, not a television. There is a $250 license fee for TV receivers.


Getting pictures on the screen via an aerial presumably incurs a tax. So don't connect an aerial, use a different system, and don't pay.


Not sure about his country, but in all the countries I've lived in, those TV fees are practically impossible to legally dodge.

All sorts of things are considered TV receivers - any smartphone, tablet, or PC (including all the *books), and of course all actual TVs. The "reasoning": you can still watch stuff online.

And there might be a sub-fee for just radio reception, in case you really don't possess any video-displaying device. Again, any of those devices count (they can receive over the internet, for example).

I don't pay them where I currently live (Switzerland) - the private company with the government mandate to collect the fees (Billag) is on questionable legal ground when imposing them (they can go up to 5000 CHF =~ 5000 USD), and they need to physically inspect your apartment before taking any action. Of course there is no force in the universe that would make me let them in. Reason: I haven't watched any broadcast TV station (or anything else) for the last 5 years. Plus we already pay our internet provider quite a hefty sum for TV channels (part of the package, unused). Even if it's not illegal, it's immoral to extract those fees.


> I refuse to buy a smart TV, and when my TV finally dies, I will probably be forced to buy one, and the first thing I will do is either disable, or just refuse to setup and use, the smart part.

You could buy a short-throw projector instead. These are usually Internet-free and have many inputs for discrete media sources, which may or may not have Internet.


> Side note: the cable tuner is useless on cable, due to all cable companies moving to encrypting all channels, or moving to IPTV platforms entirely, thus always requiring a box

Maybe on your market. In Germany, unencrypted DVB-C is alive and kicking, especially for public broadcasting which everybody has to pay anyway. (I think private broadcasters have switched to an encrypted model or are in the process, but I don't care about them because German private TV is utter crap.)


I've got a bunch of WiFi cameras that aren't able to access the internet [0], so really it's an "Intranet of Things". They sometimes drop their connection, which is expected because it's WiFi, but a lot of the time they won't reconnect properly.

I figure that if basic infrastructure stuff, like automatically reconnecting to WiFi, doesn't even work, what chance do we have of harder things like security working properly?

[0] https://news.ycombinator.com/item?id=11933851


How much did you pay for them? I've lost all faith in consumer-level gear. I just don't think the businesses prioritise any kind of QA any more.


They're definitely consumer-grade cameras, and most of them are at the low end of that. I'd say they are in the $50-100 CAD range per camera.

That said, when they do work they work a lot better than their price would suggest.


I'd probably create a Rube Goldberg-esque setup that detected when the camera dropped off WiFi and power cycled the camera (via a WiFi-enabled socket - it's IoT all the way down).
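
Roughly like this - ping the camera and bounce the plug when it drops, with toggle_plug() as a stand-in for whatever API the socket actually exposes:

    import subprocess, time

    CAMERA_IP = "192.168.1.50"  # hypothetical address

    def camera_up():
        return subprocess.call(
            ["ping", "-c", "3", "-W", "2", CAMERA_IP],
            stdout=subprocess.DEVNULL) == 0

    def toggle_plug(on):
        pass  # call your WiFi socket's API here; it's IoT all the way down

    while True:
        if not camera_up():
            toggle_plug(False)
            time.sleep(5)
            toggle_plug(True)
            time.sleep(120)  # give the camera time to boot and rejoin WiFi
        time.sleep(30)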


There's a device for that: http://resetplug.com


I've seriously thought of doing that.

Then I came to my senses and reminded myself that if these were truly important they'd be hardwired. The cameras are capable; I just have no desire to run the cables.


This is kind of how I do it. I think the Intranet of Things is awesome; it's the Internet of Things that pushes companies to make crappy products.


Worked on industrial IoT for five years, from devices to backends. It's all garbage. Nothing works, everything is wrapped in marketing doublespeak, security is a shit show, and UX is virtually non-existent.


Do you have any thoughts on why? Lack of competition, race to the bottom on prices, incompetence, or some kind of vendor lock-in?


Mostly the latter two. Industrial IoT is usually enterprise sales, so price isn't (often) an issue, and there is plenty of competition in theory, but vendor lock-in prevents easy interop. The big backend software platform vendors often get exclusive partnerships with the hardware vendors, which means that if you want to use a device with a different backend you're in for a world of pain. You also have some vendors with massive NIH syndrome, so rather than investing in MQTT or CoAP they go off and design a new protocol that nobody speaks.

At the same time the software is usually just a mix of bad to meh. A lot of companies have retrofitted themselves as SaaS industrial IoT outfits after having spent 15 years in industrial automation doing something that sort of looks like IoT but isn't. You cannot pivot that easily - tech, culture, operations, etc. I've seen factory control-plane agents (the stuff used to drive the touch screens in factories that workers monitor to see stats or operate machines) get retrofitted into massive data collection tools that are completely arcane to modify or extend.
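
For a sense of how low the bar is, publishing telemetry over the standard protocol with paho-mqtt is a handful of lines (broker address and topic made up here):

    import json, time
    import paho.mqtt.client as mqtt

    client = mqtt.Client()
    client.connect("broker.example.com", 1883)
    client.loop_start()  # handles keepalives and reconnects in the background

    while True:
        reading = {"ts": time.time(), "temp_c": 21.5}  # from your sensor
        client.publish("plant/line1/press4/telemetry", json.dumps(reading))
        time.sleep(10)

Any backend that speaks MQTT can consume that. The proprietary protocols exist for business reasons, not technical ones.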


The consumer doesn't want it?

I don't have anything IoT in my home. Why spend hundreds or thousands to replace my thermostat with an app, make all of my lights remote-controlled, motorize my blinds - and what else is there?

It's just spending a lot of money and time for hardly any perceived advantage. Combine this with the fact that the average consumer is very cash-strapped these days.


Motorized blinds should have a simple and incredibly useful killer feature for literally anyone living in a city: the blinds automatically close when you go to bed, and automatically open a little while before your alarm clock goes off.

The fact that getting something this simple set up takes technical chops and a ton of cash is a massive failure on the part of the companies making these devices.
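
For the record, the whole "smart" part fits in a dozen lines; the times and the blinds_set() motor call below are placeholders:

    import datetime as dt, time

    WAKE = dt.time(7, 0)   # alarm time
    BED = dt.time(23, 0)
    PRE_WAKE = dt.timedelta(minutes=20)  # open a little before the alarm

    def blinds_set(open_):  # stand-in for the actual motor controller
        print("blinds", "open" if open_ else "closed")

    def should_be_open(now):
        open_at = (dt.datetime.combine(now.date(), WAKE) - PRE_WAKE).time()
        return open_at <= now.time() < BED

    while True:
        blinds_set(should_be_open(dt.datetime.now()))
        time.sleep(60)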


I was asking about industrial IoT specifically as the parent mentioned experience with that.


> The consumer doesn't want it?

Is that true though? Isn't the massive influx of cash into these proving the opposite?

I see this sentiment a lot. "I don't see the point of smart things, so nobody must want them!" But that just isn't true. There are a lot of benefits to be had, and a lot of people want them.


There are two basic problems. 1) getting the devices to find each other and communicate in some reasonably secure way, and 2) controlling them in some user-friendly manner.

1) is still a mess. Things that are hooked to line power ought to talk over the power line. There are lots of standards for that, from the old X10 (1980s, low bandwidth, poor noise immunity, no security), bidirectional X10, Echelon (1990s, low bandwidth, very good noise immunity, some security), HomePlug (2000s, high bandwidth, some security), plus some proprietary systems. X10 refuses to die, and HomePlug's bandwidth is overkill for lighting. Echelon mostly gave up on the home and went on to become the standard for subway and rail automation (signs, lighting, HVAC, doors, etc.) because of the good noise immunity. They're working on a new approach to lighting, where LED lights run on 48VDC. This is a bit radical for home automation.

If you have any of these, it's useful to have a whole-house RF filter where power enters the house or apartment to isolate your network segment from everybody else on the same pole transformer. These are cheap ($6 or so) but have to be installed by an electrician. This is a big obstacle to power line networking.

Then there are the RF-based networks. WiFi, Zigbee, etc. These have range limitations and may not work through walls. Despite all the headaches of RF networking, most of the IoT vendors are going that way because they get to dump the range problem on the user.


>If you have any of these, it's useful to have a whole-house RF filter where power enters the house or apartment to isolate your network segment from everybody else on the same pole transformer.

I considered X10 a few years back, and when this issue came up I scrapped that idea.

I think it would work better if these controllers used out of band communication, but that would require running an extra set of wires.


Vendor lock-in gets short shrift in this article. Let's not forget Philips attempting to lock competitors' bulbs out of their Hue bridges.


They tried, there was a shitstorm, and they reverted that. For now, at least.

I'm a recent and quite happy Hue owner. I decided to go for the expensive good stuff because I learned the hard way that cheap consumer electronics are always shitty and not worth the hassle - all those trivial annoyances add up in your stress level. Also, I know a bit too much about the manufacturing quality of cheap Chinese LED bulbs - I don't trust them not to burn my house down.


All that build-up and it ends up basically an ad for the Amazon Echo? Disappointing. The Echo is of borderline usefulness at best. On its own it does almost nothing that the digital assistant on your phone can't already do, and there's a lot it can't do, simply because your phone is already so entrenched in your life.


I can say "Alexa, dim the lights to 50%" and it happens. I can say "OK Google, dim the lights to 50%" and nothing happens. Both ecosystems have access to my lights (I can control them with various apps), but only one has put effort into first-party integrations with IOT platforms.


OK, so your specific lighting system integrates better with Echo than with OK Google. That says something about those combinations; it doesn't say much about Echo itself.

Like I said in the comment you responded to: on its own, it's pretty borderline whether the Echo is useful at all, and it can't do many things your phone assistant can.

My point being: just because your combination works with your Echo and not your phone assistant doesn't mean the Echo is a device most people should buy.


I don't think you understand. This isn't a case of "oh, that product's better integrated with Amazon than Google". Amazon has put the resources and legwork into establishing business and development integrations with every single major connected-device maker in retail stores. All of them, in a way that nobody else has. I think it's wrong to say it "doesn't say much about Echo itself". It shows a focus on becoming the center of the smart home that nobody else has.

If a device works with Samsung SmartThings, Philips Hue, Belkin Wemo, Insteon, Lutron, Wink, Nest, HomeSeer, Almond, LIFX, GE Link, TCP, iHome, Leviton, Honeywell Lyric or TotalConnect, Ecobee, Haiku, Keen, Garageio, Z-wave or Zigbee (via a hub)... then it also works with Alexa, out of the box, because Amazon has relationships with all of them already.

You just say "Alexa, discover my devices" and she finds whatever you happen to own on your network on her own, instantly knows their names (that you gave them if applicable) and capabilities, and can address them through natural language.


I do get it. You're saying Echo has the best IoT integration of all the assistants. That may be true. But it's also possibly irrelevant depending on what combinations of IoT devices someone wants. If everything I want has HomeLink integration, then Echo's broader integration means nothing to me.

> This is something Amazon did; "it doesn't say much about Echo itself" is completely mistaken.

You're misunderstanding what I mean by that sentence, which I guess could be clearer. What I meant is: it doesn't say anything about the Echo's independent functionality, apart from any other devices. Does that help clarify?

My point is that the ability to integrate with other devices is great, but it's only one thing a digital assistant has to do. The Echo is missing things that phone assistants can do, not only because the phone has hardware capabilities the Echo lacks, but also because our phones are so integrated into our daily lives.

Anyway this discussion has really gone off the rails. My original point was this article is clearly just an Echo ad and it's a bad one at that.


I was very upset about that too. I feel lied to. Moreover, are there hard numbers about how "successful" the Echo really is?

EDIT: how did IOT end up there?


Hard numbers from Amazon? I'm sure an unlabeled graph has been released at some point showing some kind of hockey stick growth.


Yeah... yet another post promoting the Amazon Echo dressed up as something else...


> A look at the so-called Smart TV reinforces the observation about CE culture. In less than two years, the CPU inside the TV quickly becomes obsolete and can’t be upgraded while the display itself easily lasts a decade, and software updates are persphinctery, if they happen at all.

Evocative term ("persphinctery") - but what does it mean? A web search mostly turns up Monday Note.


The prefix "per" means through, so this made up word seems to mean "as though it came through the sphincter" a.k.a "crappy".

So "persphinctery" would be a synonym of "excretory", were it actually a word.



When mashing prefixes onto words in a way which is unfamiliar, it's usually customary to use a hyphen. E.g. "post-fact[0]" instead of "postfact". Strangely, to me, the former seems to imply that "post-" is a prefix while the latter seems to be a fact involving postal mail.

[0] https://granta.com/why-were-post-fact/


Given that you put the letters together, it is a word. The only question is how many understand it, and which dictionaries (if any) you can find it in.


Creative word generation is always welcome.


I think it's a corruption of "persnickety," which in my experience means "overly precise" - as in, a lock can be persnickety when dirt gets in and fouls the workings, because there's no room for grime in the mechanism.


I'll guess that the root term is "perfunctory."


I think it's like saying "extinct": https://en.wikipedia.org/wiki/Perisphinctes


Indeed. Ironically, your comment is now one of the 8 results for me.


The first thing I did was google that word; it's rather poetic.

"persphinctery", one word to capture the brokenness of waiting for updates the provider has no incentive to create.

I laughed because it's a very erudite sounding word with a very vulgar interpretation: "persphinctery" == "as they are shat out".


>In less than two years, the CPU inside the TV quickly becomes obsolete

If it's Linux based, this just means that the SoC vendor hasn't kept the board support package (BSP) up to date.

These kinds of things are hardly ever mainlined, so it becomes difficult to update drivers, kernel versions and then even software versions (think X11)


BSPs are never updated, so that's a given. But I think he was more likely talking about codecs (no H.265? too bad) or DRM.


Yeah, I think it's an invented term. It's not in my dictionary as a word.


Given the limited google search results, it would appear to be a very neo neologism.


I just had a chuckle: the Chrome tab I have open now (for this HN story) reads "The Internet of Poo" with the rest of the sentence faded out & cropped. How fitting


Related: Internet of Sh*t is a Twitter account highlighting "your best home appliances ruined by putting the internet in them".

https://twitter.com/internetofshit


It's mentioned in the article... ;)


This isn't the first time gratuitous "technologisation of stuff" has been proposed, though it seems to be proceeding rather apace.

From 1922, "Radio All the Things!": http://i.imgur.com/TSJnicdl.jpg

(Via: http://www.darkroastedblend.com/2015/01/videophones-from-fut... and originally: http://wi.mobilities.ca/grant-wythoff-aerophone-telephot-hyp...)

Or the original book if you'd prefer: Hugo Gernsback, Radio for All (1922): https://archive.org/details/radioforall00gerniala

I think I know what a radio phone might do. Radio clocks even exist (though a networked NTP timekeeper is more useful). I'm not sure I'd care to fly in a radio-controlled airplane. And I'm utterly perplexed at what a radio heater might be -- unless it's a microwave space heater, or an early instance of Nest.


The only thing in that picture I want is the radio business controller. I would set the dial to BUSINESS and wait for the money to roll in via radio.


> I think I know what a radio phone might do

It's just a mobile phone.

> I'm not sure I'd care to fly in a radio-controlled airplane.

They're unmanned and called drones.

> Radio clocks even exist (though a networked NTP timekeeper is more useful)

A radio clock costs a few cents and works pretty much everywhere. With NTP you need a full IP stack and some internet reception.


Your full IP stack and Internet reception will set you back about three-fiddy: https://www.amazon.com/ESP8266-Remote-Serial-Transceiver-Wir...


So I need to enter my WiFi password into my clock and hope that it has regular access to my WiFi?

Also you can get a whole radio clock for around the same money.


I admitted to the existence of radio clocks in my initial comment; you don't need to convince me.

I'm simply stating that the cost of a full IP stack is exceptionally minimal, and getting smaller. That's kind of the reason why we're ending up with the Internet of Shit / Internet of Threats.

Of the whole list of things from Gernsback's illustration, it's probably the most sensible.

The other lesson -- not to get fucking bogged down in minutiae and definitions -- is to realise that his "radio world" really boils down to a few distinct elements:

1. A communications medium. Gernsback uses radio. We'd probably generally use Internet, though that can run over radio links (packet radio is a thing).

2. A controlled system.

3. Some sort of control mechanism itself.

4. Feedback and response (it's not much good to send signals to an RC plane or boat if you can't monitor its location and surroundings).

5. Other facilities. The "radio phone" for example has that element that's central to most messaging and social systems: a directory. Databases, billing systems, etc.

Thing is, we're living in Gernsback's world, we just don't consider it to be a "radio" world, and to a large part it's not, though radio can and does act as part of the media linkage. The real business opportunities though seem to lie elsewhere. Mostly in advertising. Or pyramid building.


> I think I know what a radio phone might do. Radio clocks even exist

I assume a radio phone would be a walkie talkie, and those can still be useful. Further, I used to have a cheap mobile phone that had a built-in FM radio, but now I have a much more expensive smartphone that doesn't -- or at least, it's not user-accessible.

I also have a somewhat old radio alarm clock that gets the correct time via the mains power supply. Needless to say, the time goes wrong and you have to turn it off for a while then hope it gets the right time when it restarts. Usually it does, but sometimes it can be hours wrong for weeks.

I could use my internet radio as a backup, of course, but the broadcast sound arrives way ahead of the internet sound, and if the two radios show the same time as the smartphone, that looks like luck.

Seems to me we've been screwing these things up for a while, even without the internet of things ;-)


I like how the radiofax is dumping the output straight into the trash can


My guess is that it's some sort of ticker service.

Also, in an era of omnipresent paper notes, perhaps the expectation was that most notes would be discarded, with only the important ones retained?


Great catch. Thx for the LOL!


Working in consumer IoT on a regular basis... here is the main problem with adoption: every device has its own app!

Let's say eventually I want to use Echo, or SmartThings, etc. I can buy 20 devices but half of them have their own app that requires registration before I can connect it to my Echo or SmartThings hub.

The reason people hold up Apple as a potential solution is that you're seeing hints of them starting to take over registration tasks, even for cloud-connected devices.

So you might ask why each device has its own app. Because all of these companies want to "own" the customer and the associated data.

The funny part to me is that I know damn well some of them don't want any part of maintaining an application, the team associated with it, or the infrastructure.

If someone came up with a platform that was easy to integrate, managed an app, and just shared that data for free or a nominal fee, I believe they would win.



Whenever you see "Internet of Things", think "Unfixable Heartbleed Everywhere Forever".


It's true that progress is slow, partly because not enough people are tinkering and innovating in this space. We're trying to fix that by getting affordable starter kits into the hands of more developers. We saw a big gap in the market in terms of innovation at the edge, especially using cellular technologies. Technologies like Cat-M and NB-IoT should make this gap even smaller.


Honestly, it's not that hard to design controllers. It simply requires a fundamental understanding of finite state machines. Amateur or rushed coders think the if/else statement is a cheap way to program in a ruleset and often end up with rare corner cases as a result. The key is to define your states and never let the system get outside of what is expected - and if it does, make sure you can detect it and revert. This isn't rocket science. A Raspberry Pi is faster than computers from 5 years ago; don't mistake its price for a lack of quality.
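
A minimal sketch of that approach - an explicit transition table, with anything unexpected detected and reverted to a safe state (states and events invented for illustration):

    SAFE = "idle"
    TRANSITIONS = {
        ("idle", "start"): "heating",
        ("heating", "target_reached"): "holding",
        ("heating", "stop"): "idle",
        ("holding", "stop"): "idle",
    }

    class Controller:
        def __init__(self):
            self.state = SAFE

        def handle(self, event):
            nxt = TRANSITIONS.get((self.state, event))
            if nxt is None:
                # unexpected event for this state: detect it and revert,
                # instead of letting an if/else chain wander off the map
                print(f"unexpected {event!r} in state {self.state!r}; reverting")
                self.state = SAFE
            else:
                self.state = nxt

    c = Controller()
    c.handle("start")        # idle -> heating
    c.handle("door_opened")  # not modelled -> detected, reverted to idle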


And then the cloud service for your door lock, light bulb or security system gets shuttered.

#internetofbricks


The internet of low margins and devices of dubious functionality. I'm truly scared that the next dotcom crash will come from IoT not panning out for companies that jumped into it without any forethought.


Even though I completely agree with everything said in the article, it highly underestimates how hard it actually is to build a semi-complex IoT product. Our startup realized this after 2.5 years spent building a SmartThings competitor. For example, to debug something you need to check a large set of things (client, backend, hardware, embedded software, network, etc.). It will all get better, but as of today it really is hard.


Regarding the mentioned Moore's law, do we really have 1000 times more computational power on consumer devices than we had in 1996?


A quick look at Wikipedia ("List of Intel Pentium microprocessors") shows that top-of-the-line desktop CPUs ran at 200 MHz, while a current Core i7-6700K is clocked at 4 GHz. That's already a 20-fold increase. Also consider that clock rate is not everything. The number of instructions per clock has also risen, although I don't have hard numbers at hand that easily.

Also consider storage. RAM clock speed is slowly, but steadily increasing, as are cache sizes across all levels. And the biggest gains in this area are made by transitioning from HDDs to SSDs, although I don't have numbers on this one, either.

But the elephant in the room is floating-point performance. The GPU in my desktop PC (a 2015 AMD R9 Nano) does 8 teraflops. Compare with an Nvidia RIVA 128 from 1998, which did 5 gigaflops. [1] That's more than 1000 times faster in just 17 years.

[1] http://www.nvidia.com/object/RIVA_128_FAQ.html


JLG raises some excellent points and points to some excellent critics, including the Internet of Shit twitter account.

Some elements to highlight:

We had this problem in consumer electric goods, starting slightly over a century ago. Poorly-made devices could electrocute users, burn down homes (or offices or factories), would fail to work as advertised, fail to work, or otherwise disappoint. The result was that, at the instigation of insurance companies, an independent testing and certification service was created, Underwriters Laboratories. UL are actually looking at entering the IoT/IoS morass: http://www.csmonitor.com/World/Passcode/2016/0405/Can-testin...

More complex products make for more complex assessments. One of my most useful household purchases is also one of my oldest: a cast-iron frying pan bought 30 years ago. It functions as it did when I first bought it, I've used it daily for decades, and its function was immediately evident. Contrast the tablet and keyboard on which I'm typing this, neither of which have lived up to expectations (nor the manufacturers to their warranty obligations -- Samsung and Logitech respectively). With hardware, control interfaces, communications, server dependencies, bugs, security vulnerabilities, and more, IoT devices are vastly more complex than what they replace.

Poor margins make for poor products. This seems far less well understood than it ought, particularly among the HN crowd. An undercapitalised company, or one operating on a burn-rate and prayer, or some overseas firm you've never heard of, trying to crack your local market, may well choose to skimp on quality options. Vendors who care nothing other than order fulfillment and logistics (Amazon, WalMart, Best Buy) have no vested interest in product quality. Quality itself is a difficult metric to assess, particularly lifetime quality.

Product/vendor lock-in is a consequence of this. Rather than make money on the initial purchase, various forms of subscription-based services are offered, or interconnectivity is provided ... as an add-on, additional-cost feature.

Unintended consequences are a real bitch. This is an area of market failure I've been exploring which seems highly underconsidered in contemporary economics. While Akerlof's "Market for Lemons" gave us a fuller awareness of information asymmetries, I'm not aware of a general treatment for the simple inability to know the salient future outcomes of current actions. Consider Thomas Midgley, Jr.'s contributions of chlorofluorocarbons and tetraethyl lead, used in leaded gasoline. He's done more than any other engineer to put all of humanity at risk. And while lead was a known pollutant by the 1920s, the effect of CFCs on the ozone layer wasn't discovered for another 50 years. Long-lingering systemic effects are particularly pernicious.

"Smart" products only offer so much additional capability. The ability of added logic or communications to a device to increase value is limited by the mode through which that object operates. Take Google's recently announced energy savings AI applications. The multi-millionfold increase in computer processing capabilities, combined with a decade's hard work at improving AI ... can deliver a ~15% energy efficiency improvement. In a Moore's Law scenario, this is about a four-month advance in processor capabilities, and given interactions of the Jevons Paradox, the end result is likely to be increasing energy utilisation, not decreasing it. Similar results are seen for automotive and aircraft control systems. Net increases in efficiency from millionfold processer capability improvements are, at best, a factor-of-two doubling, and often far less.

There are examples of marked improvements, with the multiple factors compounding in shipping markets a good example: containerisation, standardisation, improved dockside handling, intermodal rail and trucking, invoice and manifest controls, etc., have increased the efficiencies (and decreased the costs) of shipping markedly. Of course, the result is that far more goods are shipped, and labour rates in advanced countries have fallen tremendously with falling negotiating advantage.

Complexity increases failure modes. The more interconnections and mutual dependencies a thing has, the more ways it can fail. Every individual interaction, and every combination of interactions, needs to be considered. Multiple systems with complex interactions compound this. The alternative is fully modularised and independent systems.

E.g., rather than a fully integrated "IoT" refrigerator, specific sensing, control, and communications modules might be provided to monitor power, temperature, contents, and communicate these.

Taking my tablet example -- apparently cases designed to specific device dimensions make the interchangeability of what ought to be completely separate systems complex.


It's adding a very complex and complicated machine (a computer) - one which also interacts with other very complex and complicated machines all over the world - to items which are usually very straightforward and "simply work". To provide an equally seamless, secure, and whatnot "smart" experience you have to invest a lot on top. If the additional value of the "smart" product is that low, no one pays the price that is needed, vendors operate on poor margins, and there it is: the internet of shit.

On top of that you have hardware people being responsible for software, which usually goes about as well as software people picking up a soldering iron. It might kind of work in the end, but…


Technology is no longer driven (=funded) by innovation but by the ability to produce "recurring revenue". Selling software and some support around it is no longer a viable business model. It's unicorns or nothing.

Even Microsoft has seen that coming (and responded by gradually open sourcing: C#, ... MS Azure, ... the current Windows 10 beta subsystem that runs Linux binaries, PowerShell on Linux). Has hell frozen over? No. But we think the only way to make money is by reselling user data, and the VC industry reflects that. How many tech businesses that made it big today aren't built on centrally harvesting their users' data in exchange for "free" stuff?

We've pushed all our computing to the cloud and think it's the panacea for everything. Do we really expect to apply the same business model and technology principles from the virtual world to our physical devices and get away with it? It reminds me of the old phrase: "When your only tool is a hammer, everything looks like a nail."

The problem isn't that security is so much worse in IoT than in your typical web application stack. The problem is that it isn't any better than web security! We have XSS and SQL injection in IoT, crypto built on shitty JavaScript, MiTM attacks, lack of authentication ... worse, we can re-use the same exploits (Shellshock, Heartbleed, ...) and nearly identical attack vectors! In the age of Shodan and MASSCAN we won't get away with that [1]. (ProTip (from Gartner, I think): you'll be able to send payloads with MASSCAN to a gazillion connected devices by 2020 ;-))
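
For the "same mistakes, new devices" point, the canonical SQL injection example, sketched in Python (table and field names invented for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE devices (name TEXT, owner TEXT)")
    conn.execute("INSERT INTO devices VALUES ('thermostat', 'alice')")

    def lookup_unsafe(name):
        # The injectable pattern that keeps turning up in IoT firmware:
        # user input concatenated straight into the query string.
        return conn.execute(
            "SELECT * FROM devices WHERE name = '" + name + "'").fetchall()

    def lookup_safe(name):
        # Parameterized query: the driver handles escaping.
        return conn.execute(
            "SELECT * FROM devices WHERE name = ?", (name,)).fetchall()

    print(lookup_unsafe("x' OR '1'='1"))  # dumps every row
    print(lookup_safe("x' OR '1'='1"))    # returns nothing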

Take a look at the issues of trust on the web! How many of the signing authorities in our certificate chain can we trust? How confident are we that our HTTPS connection is safe? (Cloudflare is known to MiTM[0] half the internet, so the silly green browser padlock doesn't mean much.) Yet we expect to use the same flawed trust model with IoT (where bugs hit us in our physical face) as we use on the web. We want to protect all this with Tor browsers, with even more centralization and vendor lock-in, or even with dystopian proposals like the ones from several EU countries (hello Germany) that the data should never leave the EU.

The vendors' response?

I, and many others who stepped forward to report critical bugs to IoT companies, have been ignored, accused, and blocked. I started IoT Security[5] on LinkedIn some years ago. LinkedIn is a joke of a platform, I agree, but it also lets us put all this shit right in the face of these crappy vendors. It's the perfect melting pot for marketeers and engineers ;-)

Privacy and security have gotten worse over the years, not just in IoT but on the web in general. We point the finger at the "Internet of Shit", but what do we expect if we constantly use the wrong tools for the job?

We're looking to the industry to solve this for us: an industry which hasn't yet figured out how to monetise its technology without selling our private data. VCs are as much a part of the problem as the tech companies. What could go wrong? It's like an addict seeking a cure by discussing it with their dealer.

There is so much technical debt in terms of privacy and security that these gadgets are eventually going to kill someone, and as a consequence will be regulated like anything else that has regularly killed in the past. If not now, then as soon as the first connected car is weaponized or the first smart home kills someone.

Nation states and their armies have become dependent on the Internet to keep us safe. They do a good job of reminding us that the cyber threat is real, and they're right about the threat. What we're wrong about, though, is believing that global data harvesting by shady intelligence services will keep us safe. They have their own agenda[4].

The way forward? Certainly not more centralization or cloud.

PS: If you're also sick of all this shit, please do find me ;) there's lots to talk about and little time.

[0] https://scotthelme.co.uk/tls-conundrum-and-leaving-cloudflar...

[1] https://media.defcon.org/DEF%20CON%2024/DEF%20CON%2024%20pre...

[2] https://twitter.com/ValbonneConsult/status/74958341227350835...

[3] https://twitter.com/ValbonneConsult/status/74962796346176307...

[4] https://blog.valbonne-consulting.com/2016/08/06/smartcitiess...


> or even with dystopian proposals like the ones from several EU countries (hello Germany) that the data should never leave the EU.

I could call this proposal many unpleasant things, but "dystopian"? I can definitely see why people want clearer jurisdiction over the flow of their personal data, and that intention is anything but dystopian.


How would you guarantee that packets never leave a country without balkanizing the net? It's just another proposal for more centralization and walled gardens. "E-Mail made in Germany" was all the rage in 2013 after Snowden.

http://www.thelocal.de/20131129/german-email-providers-unite...

It's an absolutely ludicrous idea: a wet dream conceived in the boardrooms of Telekom and 1und1 at a time when nobody had an answer to blanket surveillance. It's dystopian not in the sense that it would be impossible to implement, but because it's a knee-jerk reaction that wouldn't solve the underlying issues.


Waiting to see what comes out of Google's IoT projects like Brillo and Weave; they've been kicking around Google for a while now, seemingly since long before the 2015 I/O announcement. Maybe Fuchsia has something to do with it, who knows.


My Neato Botvac seems to forget its wifi settings every couple of weeks :(


I just call it IOJ: Internet of Junk.


Its real name is FAD.


IOS: Internet of Shit. As JLG notes in the article.

Io[B]T[TSoY]: Internet of (Broken) Things (That Spy on You).


Internet of Targets.


Beautiful!


Nothing about the impending IoT seems planned, standardized, or remotely designed with the consumer's best interest in mind. That being said, consumers for their part seem almost eager to throw their money away on truly dubious improvements. What isn't a bad idea from the get-go is often a good idea ruined by careless implementation.


Apart from the fridge that, once upon a time, could show you your Google calendar[0], I have yet to see any IoT device that actually requires a cloud service for its functionality.

Why can't things be built to just use local TCP/IP connections? Those work just fine on my local network, and they work just fine if I flip my VPN on when I'm not on my local network. Anything that actually needs some kind of server piece should have a server that can run locally.

[0] https://productforums.google.com/forum/#!topic/calendar/Uhfp...
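
What "a server that can run locally" could look like, as a minimal Python stdlib sketch (the /status endpoint and its fields are invented for illustration, not any vendor's API):

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class DeviceHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/status":
                # A real device would report live sensor readings here.
                body = json.dumps({"temperature_c": 4.2, "door_open": False})
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body.encode())
            else:
                self.send_error(404)

    # Listens on the LAN only; remote access goes over your VPN,
    # never through a vendor's cloud.
    HTTPServer(("0.0.0.0", 8080), DeviceHandler).serve_forever()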


> Why can't things be built to just use local TCP/IP connections? Those work just fine on my local network...

Simplicity, possibly. Vendor lock-in, probably.

I'm in the same boat. I wanted a weather station that didn't cost several hundred units of currency but also let me query it directly, instead of having to hit Weather Underground or (FSM help us) a vendor's web site with no API. Only a handful of them allow this, so I wound up rigging up what I wanted using a Raspberry Pi, a USB-connected weather station, some extra sensors, and several lines of PHP.
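
Not the parent's actual setup, but a sketch of the shape of it (Python with pyserial rather than PHP, and the line format is an assumption about the hardware):

    import serial  # pyserial

    # Assumes the USB station emits readings like "temp_c=21.4 rh=55",
    # one per line; adjust port, baud rate, and parsing to your device.
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=5) as port:
        while True:
            line = port.readline().decode(errors="replace").strip()
            if line:
                print(line)  # or append to a log the LAN can query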

Why I'd want to subject my light bulbs or coffee pot to that is beyond me.


Data-mining.

How can the smart-fridge company sell your eating habits if your fridge only tells you (and not them) how you're using it?


There are tons out there, no? The Amazon Echo is a stark example, but the Nest camera, e.g., seems to require one as well.

https://nest.com/support/article/Do-I-need-Wi-Fi-to-use-Nest...


It makes sense for the Amazon Echo to require an Internet connection.

There is nothing the Nest cam does, apart from offsite backup, that requires it to contact the company's servers. Everything else could be local-only, with a VPN for remote access.


P2P discovery and connectivity within homes is pretty spotty, so it's more reliable for all devices to initiate connections to the cloud.


Any open (as in secret-less) P2P discovery protocol will eventually be hacked and used by people who are not you.

I just wish there were an open, standard, secure home-device discovery and management protocol.
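
mDNS/DNS-SD covers the discovery half today, with exactly the no-secrets weakness described above. A minimal advertisement sketch using the python-zeroconf library (service name, address, and port are all made up):

    import socket
    from zeroconf import ServiceInfo, Zeroconf

    # Advertise a hypothetical device on the local network via mDNS.
    info = ServiceInfo(
        "_http._tcp.local.",
        "my-fridge._http._tcp.local.",
        addresses=[socket.inet_aton("192.168.1.50")],
        port=8080,
        properties={"model": "fridge-1"},
    )
    zc = Zeroconf()
    zc.register_service(info)
    try:
        input("Advertising; press Enter to stop.\n")
    finally:
        zc.unregister_service(info)
        zc.close()

Note there is no authentication anywhere in that exchange: anything on the LAN can see and answer it, which is the gap a secure management protocol would have to fill.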


I would imagine it's the same thinking that puts multiplayer in every game, and just generally wants to keep a live connection to everything, always.


This post would have been more accurate two years ago. Today IoT is usable with much less work and fewer headaches. Sure, there are still bugs, but it's a big improvement over a $40 brick. Platforms like IFTTT, Alexa, and HomeKit also make it more practical.



