Why We Are No Longer Developing for the iPad (ipbhost.com)
406 points by mason240 on April 16, 2015 | 489 comments



I think that becoming the most profitable company in the world has negatively impacted Apple, at least how I view them.

I wrote the chess program given away free with the early Apple IIs, made some good money on a Mac app in 1984, and until recently favored Mac laptops: a fanboy. That is changing. I bought an incredibly cheap and underpowered Windows 8.1 laptop in January (HP Stream 11) for $199 and like it quite a lot. I plan on getting either a Surface 3 or Surface Pro 3 soon, and will put my old MB Air on the shelf next to my Linux laptop, to be used just as needed.

Microsoft seems to be heading in a good direction, Apple less so. Things like living in a web browser and a platform-neutral IDE (I use IntelliJ for Clojure, Java, Python, PHP, and Ruby) make the OS platform less impactful on using a computer. I am looking at Surface Pro 3s and Surface 3s, price- and feature-wise, vs. Apple products and I find Apple losing ground. The comparison is especially bad for Apple re: cloud services: Office 365 is such a good deal for $100/year compared to iCloud, with way more and better services.

For software engineers, money for hardware, software, and cloud services might not mean much, but I think it is good to look at what the general public will be using.


> That is changing

Recently I challenged myself to learn more about USB, HID devices, and kernel development, so I partially reverse engineered and wrote a driver for the Xbox One controller. This is great and all, but it needs to be signed for people to use it without lowering a level of defense. Turns out you don't just need to be an Apple Developer, you also need to make a special request and regenerate your certificate, which needs to have a special extension, else it'll be rejected by the kernel signing system. Not the least bit worried, I pony up 99€ (not a dev), submit a request, and an automated mail tells me I'll have a reply within 7-10 business days. So I wait 2 months, and, slightly worried since I have no reply yet, I submit again. That was 13 days ago[0].

I used to think that GateKeeper was great technology and that things will be okay. Now I have doubts: does Apple really care about its developers? How long before the kext-dev-mode boot arg disappears and kernel space becomes inaccessible to mortals?

[0]: https://github.com/lloeki/xbox_one_controller/issues/2


> does Apple really care about its developers?

Yes, of course, or you wouldn't have the ability to do it at all. But Apple cares more about its users and the experiences those users have, and things like GateKeeper help them sell users on a safer, walled experience. As an Apple developer (in that I've paid for a license, not Apple-employed) with parents and a grandparent who have a Mac, I much prefer simply telling them to only install stuff from the App Store and not worrying about them downloading and running anything from the general Internet.


I think it's not just that the dev backdoors will go away; I think the whole OS is going away. It's pretty obvious to me that in the long run only iOS will remain. It makes zero sense to keep two semi-incompatible operating systems around, and it's just the sort of radical move Apple would do. Yes, there are a gazillion reasons to keep OS X around, but knowing Apple, none of them will matter enough.



I believe this is similar to Microsoft's driver model, where at the end of the day, they decide what's signed or not and thus what can hook into the kernel.

Stallman was prescient.


UEFI's secure boot enables Microsoft to force out other operating systems in hardware:

https://en.wikipedia.org/wiki/Unified_Extensible_Firmware_In...

(+1 for the RMS reference)


> Things like living in a web browser and a platform neutral IDE (I use IntelliJ for Clojure, Java, Python, PHP, and Ruby) make OS platform less impactful on using a computer.

Really? I'd think this would make the remaining things matter more. Things like:

- International text input: in OS X, out of the box, é is opt+e, e (or the now even easier: hold down e and choose é from the menu). In Windows, é is alt+0233 or something, or change your keyboard layout and relearn how to type apostrophes. Everything else related to international text input is also much nicer in OS X.

- Trackpad quality: I've yet to find a trackpad as nice as the ones that come on Mac laptops

- emacs keybindings in every text box! Anyone? ...Anyone? Maybe it's just me.


Not just you — love the Emacs bindings everywhere. It always throws me off on Windows when I'm trying to C-n and C-p around, getting hit with new document and printing dialogs all over the place :)


One more for the keybindings here (actually, they're more like readline keybindings -- C-h being the biggest difference). I would rather like to know how to set up those keybindings pervasively on graphical Linux applications. (The terminal and Emacs already have them, of course, but not the browser.)

On a related note, I like the command key being solely responsible for most GUI application commands (so it doesn't conflict with the above), and would probably also want to incorporate that into an ideal Linux setup. I wonder if someone makes a Linux distro with this sort of thing preconfigured.


You can set emacs keybindings in Gnome apps. I normally do it when I get a new box, but always forget how. A quick google normally gets me there, e.g. this link: http://askubuntu.com/questions/354489/emacs-gtk-keybindings-... (which apparently suggests it doesn't work in Gnome 3.6).
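For the record, the two pieces I end up applying (from memory, so treat this as a sketch; the gsettings key is the GNOME 3 one):

  # GTK3 / GNOME 3 apps:
  gsettings set org.gnome.desktop.interface gtk-key-theme "Emacs"

  # GTK2 apps, via ~/.gtkrc-2.0:
  gtk-key-theme-name = "Emacs"

That covers C-a/C-e/C-k and friends in most GTK text boxes, though individual apps (e.g. some browsers) can still override it.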


What I find intolerable on a Mac is the behavior of ctrl+left/right. I expect it to jump to previous/next word and nothing else.


That's option (/alt) left and right (and has been, IIRC, for 20 years)


I use a Windows keyboard with my Mac in the office; for me, the Alt key replaces the Ctrl key.

I don't prefer it, but it works.


I just swapped the two and it feels more natural! I prefer having the cmd key next to the spacebar.


The default Apple supporter answer would be: "Apple has had these bindings for longer than the de facto standard has been around", but Apple would do well to get with the program and make the de facto standard used by the rest of the computing world the default.


I use xkeymacs on my win boxes and I'm quite happy with it.


Woah, that's really cool! Thanks


I'm a long time Emacs user, but I certainly am not aware that OS X supports Emacs key bindings in every text box. My initial reaction is to not believe you, but I will certainly try it out tomorrow.


> OS X supports Emacs key bindings

It's a NextStep thing. You can find the complete list in /System/Library/Frameworks/AppKit.framework/Resources/StandardKeyBinding.dict. You can customize them in ~/Library/KeyBindings/DefaultKeyBinding.dict. The Cocoa API documentation has the details (https://developer.apple.com/library/mac/documentation/Cocoa/...). There's a graphical editor for the DefaultKeyBinding dictionary (http://www.cocoabits.com/KeyBindingsEditor/), but I've never used it.
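For example, a minimal DefaultKeyBinding.dict adding the M-f/M-b word motions that many Emacs users miss (a sketch; "~" means Option in this syntax, and the selectors are standard NSResponder actions):

  /* ~/Library/KeyBindings/DefaultKeyBinding.dict */
  {
      "~f" = (moveWordForward:);   /* Option-f, i.e. M-f */
      "~b" = (moveWordBackward:);  /* Option-b, i.e. M-b */
  }

Note this shadows the usual Option-key character input (ƒ, ∫) in apps that honor the dictionary.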


C-a, C-e, C-n, C-p all work.

EDIT: C-f, C-b work as well.

Are there any others?

Actually, I can tell you for a fact, I've hit M-b (∫) many times and been disappointed by it. I get why, but I'm still disappointed.


Only C-something works, no? I'm also still trying M-b / M-f regularly (with either Option or Command for Meta, which has surprising consequences every single time.)

C-t works though, and that's awesome to correct typos.


Ah, so here's the trick. However I have my modifier keys set up for my external (non-Mac) keyboard means I can't really use them. I use my Caps Lock in Emacs for the C modifier, but in OS X my Alt key is the one that works in normal text inputs.


My favorite one is C-t (transpose)


And of course there's kill C-k and yank C-y


But sadly, no C-w.


I've never used emacs - is that 'select word'? I have that system-wide by putting

  "^w" = (selectWord:);
into '~/Library/KeyBindings/DefaultKeyBinding.dict'. There's a load of information online about the customisation that can be put in this file. Here's a comment of mine with some links in it: https://news.ycombinator.com/item?id=6718604

Annoyingly, Xcode stopped paying attention to this file at some point around version 4, so now you have to maintain a separate version for that.


It is actually pretty common knowledge that OSX has Emacs keybindings by default for text entry. (At least the basic text editing bindings, don't expect a full Emacs implementation and "M-x tetris".)


Note that the emacs keybindings even work on iOS. I develop a coding app on iOS and when I use it with a bluetooth keyboard I always take advantage of the emacs-inspired shortcuts.


> - International text input: in OS X, out of the box, é is opt+e, e (or the now even easier: hold down e and choose é from the menu)

With Linux and Gnome, you can have several keyboard layouts, which are very easy to switch between (much like multiple languages on a phone keyboard). Usually I use the English International layout, one quick change (accessible from the global bar) and I've got all my accents (also speak and write in French) easily accessible. A bonus is that there's different international layouts if one doesn't suit you. é is simply ' + e. ç is alt , + c. And so on... Éïôàç
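Outside the GNOME settings UI, I believe selecting that layout is a one-liner too (a sketch from memory):

  # select the US international layout with dead keys
  setxkbmap -layout us -variant intl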

I'm sure OSX is a nice operating system, but Linux has very usable desktop options these days too. Windows, not so sure...

> - emacs keybindings in every text box! Anyone? ...Anyone? Maybe it's just me.

I think this one is only you...


International text input is the kind of thing that many Americans won't need too much.

I love the (limited) Emacs bindings. C-k is probably my all-time favorite.


> International text input is the kind of thing that many Americans won't need too much.

No, but as a non-American, I'm glad that free software has us covered...


> With Linux and Gnome, you can have several keyboard layouts, which are very easy to switch between (much like multiple languages on a phone keyboard).

It's the same on Windows.


I think that's very subjective:

- International text input: I can't remember the last time I had to use accents or special characters not in my native language; a few Greek letters for formulae, and generally I can use '/alpha' instead of α. Isn't having to use chars not in your language pretty uncommon?

- Trackpad quality: yes, I love using Macbook trackpads, but I use the mouse/trackpad very little anyway, and isn't something I find useful enough to miss much.

- emacs keybindings: sounds fantastic, but a) not many people use them and b) don't know if they'd be that great for Vim-ers like me.

Even if there are a lot of things that Mac might do better than Windows and Linux, I agree that the popularity of so many cross-platform tools and the web has nullified a huge part of it.


Re: International text input:

I only use keyboards with the English international layout, although I'm from Austria and the German layout would be the natural one. Therefore the keyboard layout in the OS is also set to English international. I do this because I program.

And still I'm very happy about the easy access for special characters, because in literally every email I have to use one of ß, ä, ö or ü.


> I do this because I program.

Really? I guess it might be different with the German keyboards, but I've had no difficulties programming on a Swedish keyboard.


> b) don't know if they'd be that great for Vim-ers like me.

They are: I am addicted to vim, and I use bash/zsh in emacs mode, so having the consistency is incredible.

Also having cmd+C for the GUI and ctrl+C for the command line is awesome. This last feature is one of the few things that kills Linux for me (ended up hacking on GTK, which turns out to have both an undocumented setting and an obscure compile time constant somewhere that ends up not working. And no, remapping every single app shortcut is not an option — I tried).


English has lots of loan words using accents including common words like café and résumé and more uncommon ones like crème brûlée. It has native although uncommon accents, e.g. in coöperation. Personally, I also commonly type in Swedish which has the extra letters å, ä and ö in its alphabet, sometimes type in German (ö, ü and ß) and occasionally use an IME for 日本語 (Japanese). IME aside, losing direct access to the above accents, all of which are muscle memory by now, is one of the major reasons I probably wouldn’t consider a Windows laptop.

(I didn’t mention metakey access to typing typographically correct characters: –, “, ”, ‘, ’, … etc)


Or you could type it in two strokes instead by typing ´ then e. Typographical punctuation should be the editor's concern when it matters.


Sure, but people rarely type them out - I might ask you to meet me at a particular cafe, but less likely at a particular café.


Isn’t it chicken and egg though? People rarely type them out because they’re very hard to type out on a Windows machine.


So how do you do è, ê and ë in OS X? In Windows, it's the French keyboard (a mere Alt-Shift away), and e+', e+{ or e+}, a small price to pay for convenience. You can't get away from double keystrokes. Except for é, which is /.


On Linux, using us-mac:

è: option/r-alt+`

ê: option/r-alt+i

ë: option/r-alt+u

(for example, option+` followed by e will do è)

I mostly use us-mac because I'm used to using option to type accents and because I can keep my en keyboard and type pt-br without problems.
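If you want to try it, I believe something like this selects that layout under X (from memory, so double-check the variant name):

  setxkbmap -layout us -variant mac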


I vastly prefer the compose key method on Linux. You almost never have to worry about how to type a character. Type compose, hit something that vaguely looks like the accent you want, and then the letter.

Compose, ", e => ë

Compose, ', e => é

Compose, ^, e => ê

I so wish this system was available more universally.
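Enabling it under X is a one-liner, e.g. binding Compose to right Alt (a sketch; most desktop environments also expose this in their keyboard settings):

  setxkbmap -option compose:ralt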


What about ç or å or other stuff like ° or ¡ and ¿?

Apple's consistent use of the "optional" characters is a lot like the old C-64 with the graphical extras available at the push of two buttons.


Without looking anything up, I'd guess "compose-,-c" and "compose-o-a", for the first two, and that turns out to be correct.

I had to try a few things for the degree symbol, but I guessed it on my third try (it's "compose-o-o"). Once I knew that, my first guess for ¿ was "compose-?-?", which also turned out to be correct.

But as the other reply said, you can always look them up if necessary, which you'd almost certainly have to do with the Mac's system of "option + some arbitrarily selected key".


> What about ç or å or other stuff like ° or ¡ and ¿?

Those are all covered:

http://fsymbols.com/keyboard/linux/compose/


Just hold e for a few seconds and choose which version you want.


It might be easier if you use it infrequently; however, accents with dead keys are much faster if you type a lot. I can't imagine typing even a paragraph while having to pause, even for a second, on every accented character (mind you, in some languages such as Greek, almost every word has an accented vowel, sometimes two).


I'm barely monolingual let alone bilingual, so I'm honestly curious: how does spell check do here? Are you able to type most words without the accents and let an auto-correct tool auto-magically correct things?


I can only speak about Greek. That doesn't work very well there, because many times a word is accented depending on its role in the sentence or its grammar (and vice versa).

So if you just apply automatic spell-correction, you have to proofread it afterwards, which is more of a hassle than accenting it manually, because it's more cognitive work.

Correct accents in Greek matter a lot. Reading incorrectly accented words makes comprehension quite a bit painful and slow; my guess is because the brain expects something, receives something else, and then automatically tries to make sense of what it is trained to know is wrong, slowing the whole process down.


Somewhat on topic: a while back I turned on international keyboard support on my iPhone. When I need to write something in French on the phone, I tap the globe icon on the keyboard, it swaps to AZERTY, and the autocorrect features start picking up French and correcting accents into the text for me. Is useful :)


"International text input: in OS X, out of the box, é is opt+e, "

In Windows too: English International uses the same key layout and activates the additional keystrokes for accents and non-English characters (e.g. Alt+e is é, or Alt+n for ñ).


Unless there's a menu that begins with the letter E, right?


Not really, but my bad: it is really AltGr+e instead of just Alt+e.


On the other hand, OS X comes with a terrible Japanese IME. The Google one that's available on OS X and Windows makes things better but until that came out Windows was just unquestionably way better for Japanese text input.


Really? I thought Kotoeri was pretty decent, although I will forever remain an ibus fan when it comes to IMEs.


It's hard to notice until you're really using it a lot, but it has a hard time with stuff like people's names that Windows does better with. I might be getting a little carried away saying it's "terrible" but I was a student in Japan for a while and the limitations became obvious and annoying to me under those circumstances.


Apple doesn't do tap-to-click too well, and doesn't do tap+tap-to-drag at all. So I have to use BetterTouchTool, and I constantly drag shit accidentally. What a pain; both Windows and Linux do these out of the box.


Tap+tap-to-drag works; enable it under Accessibility settings.


> - International text input:

Just press the key for ´ and then e and then you have é. Tada!


I have never even heard of the emacs things. Can you explain further?


Many OS X keyboard shortcuts are Emacs-inspired, though they aren't extensible in the way that Emacs is.

http://www.cultofmac.com/247681/use-these-emacs-legacy-keybo...


You can extend them actually:

http://osxnotes.net/keybindings.html


On most OSes you can also compose using ´ then e for é.


AltGr+e gives me é


It gives me €.


That's interesting, AltGr+4 gives me €, UK keyboard I guess.


There's an app for the two software issues.

Trackpads are OK, but I like my vertical mouse and touch-screens for Windows.


There is? I've looked everywhere for an international keyboard layout that doesn't change how the apostrophe key works in Windows, but I've never been able to find one.

I also don't think there's an app to make every textbox behave in a standard way in Windows or Linux. If only it were that easy.

The trackpad is sadly only nice in OS X; the Windows drivers are mediocre and nothing special. Still, though, I think it speaks volumes that Windows users want to use mice all the time while Mac trackpads are so nice that there was enough demand for Apple to release one for their desktops.


The apple trackpad rules. I find it's amazingly productive.

No Windows trackpad has even come close.

There is a nice parallel here. Take advanced Mechanical CAD applications. Pro Engineer, Solidworks, Siemens NX, etc...

Back in the 90's, there was a lot of basic UX work being done on view manipulation. It's a hard problem in some respects. Modes, focus, picks, and a complicated thing to spin around and get into.

A company called SDRC spent a bunch of money on UX research. They ended up using a few of the function keys to trigger view manipulation. And it worked no matter what mode, pick, dialog, or anything else was going on. Most importantly, it only required mouse movement. No holding one button down while moving the thing. No finger fatigue after a lot of view manipulation.

Other companies didn't do this work, and their view commands were modal, or required complex mouse button hold + move operations.

A little company making 3D input devices grew a nice, profitable business out of that poor UX design. Funny, their sales to SDRC users were nothing compared to their sales to users of most other Mechanical CAD programs...

So, getting back to that track pad on Apple and UX in general. The Apple OS + track pad has it down cold. It's functional in nearly every use case, and it's low fatigue, and it's fairly discoverable as well as usable.

In Windows land, we get better mice to improve on poor UX. We get touch screens to improve on poor UX. And we get other things to improve on poor things...

People may not agree with all of what Apple sees as a value add. Totally get that. I personally won't touch the closed down iPad and iPhone. They aren't useful to me.

But, the UX on those is very solid.

I do the vast majority of my writing, coding and other non Windows specific work on a Mac Pro, and it's because that UX is there, and I don't need anything but the machine and a nice place to work.

When I go to power up on my Windows machine, out comes the mouse, and the utilities, and, and, and...

Not the same experience at all.

All that said, touch on the Surface does provide some options, but I personally don't find extending my hand anywhere near optimal compared to a quick gesture on the trackpad that is always right there, and that always does the same thing for the same gestures...


I use the Microsoft Keyboard Layout Creator[1]. This way, I can start from the current layout associated with my keyboard and add the keys I need. It does not change anything that I'm used to; it only adds new things. This might be what you're looking for?

[1] https://msdn.microsoft.com/en-us/goglobal/bb964665.aspx


Every single Windows trackpad is shit, and the common feature of most of them is Synaptics. Windows laptop makers could probably get near double-digit market share increases from fixing their shitty trackpad situation alone.

Android and Microsoft phones and tablets' touch sensitivity is great, so it's not some secret of Apple's.


Yes, AutoHotKey can easily take care of the issue with accent keys and with the Emacs shortcuts you want to implement. A search for `autohotkey accent characters` shows many people using it for this purpose. You can implement all kinds of keyboard shortcuts with AutoHotKey.
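For example, a minimal AutoHotKey sketch (AHK v1 syntax; >! is right Alt), which leaves the apostrophe key alone:

  ; Right-Alt+e types é (U+00E9); the apostrophe key is untouched
  >!e::Send {U+00E9}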

On the topic of trackpads - maybe you could help me understand: when you use the keyboard and trackpad that are built into the laptop, is it because you're forced to (on a plane) or because you prefer it? I imagine all Mac users who say they can't ever leave the Apple trackpad are environmentally limited to using whatever is built into the laptop.

In other words - when I work, I prefer 2 or 3 monitors that are situated about 2 feet from my face and a full size keyboard with numpad (Microsoft Natural). So, I just could never see myself reaching all the way over to the laptop that is driving all of this in order to use a touch pad. But even so, I find all touch pads including Apple's to be imprecise compared to my 5 button vertical mouse. I also have auto-scrolling on Windows which OS X doesn't really do (this is where you push the middle mouse button down and then as you move the entire mouse, things scroll, until you release the button).

I tried using an Apple wireless trackpad (magic trackpad?). Maybe you use one of those with your work setup?


Every time I update Windows, or even restart my windows laptop I hope that it'll still work afterwards.

Updating has previously killed my dedicated graphics card, which turned out to be a driver issue, with Intel providing two nearly identical drivers, one of which kills it.

A while ago the fingerprint reader just stopped working after a restart. After a full hour of debugging and restarting I gave up. The next day I figured I'd try one of my debugging steps again and it started working again.

On my Mac, the worst that ever happened was a corrupted HDD. Booted into recovery, used disk utility to find and fix the problem and never had issues again.

Also, dat touchpad, nothing competes...


> Every time I update Windows, or even restart my windows laptop I hope that it'll still work afterwards.

That's strange. I'm now sitting in front of a Windows box that had W7 installed on it 3 years ago. I've upgraded to W8, then swapped the SSD in it (restored to the new SSD from image), resized the system partition many times (it's a Windows-Linux dual boot box), then upgraded to W8.1, then Windows 10. Oh, this is the 3rd GPU in the machine. I have never, ever had any problems with Windows itself throughout the 3 years. (I had problems with Linux though.)

I have an EUR 2500 Macbook Pro as well from 2013, and the Windows box has always been more responsive/usable.

Anecdotal, I know.


More anecdata: I'm unable to update to Win8.1. It keeps asking me to; I've tried twice, and it downloads everything, installs, reverts, and gives me error ID 0x80070002 - 0x20009.

Googling: http://tipsandtricksforum.com/thread-154.html

"If you connected a second drive or any external drive to your PC (a SSD or HDD) the upgrade will fail if the second (original) drive is not disconnected."

OK, I have a second drive, but it's internal and I can't disconnect it easily. I must take my notebook in just to update Windows?

Now I'm being badgered every week to update. This is not the user experience I'm looking for.


I use linux at work and run Windows 7 in a VM. It's domain-joined and my company pushes updates to it seemingly many times a month. This rears its head more often than not when I'm going home for the day. Sometimes I can't be bothered waiting for updates to install, so I throw caution to the wind and power the VM off when it says not to... Still ticking!

I run the opposite on my home PC, Windows host and linux guest. Been through multiple video cards and even an intel to amd switch. Also still running like a champ.

Anecdata :)


Twice now I've been stuck during a power failure with just a few minutes of battery in my UPS, and when I try to shutdown Windows, it goes into its "installing updates" thing. There's no way around it, and of course losing power during this is probably the most dangerous thing imaginable. Luckily, I escaped with my life on both occasions.


When I used to dual-boot Windows I never updated and always "restarted" by just holding the power button down (stupid, yes, I know, but it's the only way to reboot it as fast as I'm used to when booted into Linux). I never installed updates and it always worked fine, until I wanted to install updates; then everything broke. I assume this was just caused by waiting too long between updates. My solution was to delete the Windows partition and just use a VM.


This is why I never shut down my laptop just before going home.


Just a heads up: you're probably going to have issues trying to go from the Windows 10 tech preview to the actual release.


We'll see. MS claims I won't (http://winsupersite.com/windows-10/windows-technical-preview...). However, now in 2015 I can set up a Windows box in an hour or so thanks to Chocolatey (and to the coming OneGet).

(I have actually upgraded to 8.1 CTP I think, then continued all the way to 10)


My comment was anecdotal too, no worries ;)


The problem here is not Microsoft. I am a Surface Pro 3 owner and I have to say it is one of the best machines I have used. I wouldn't change it for anything. I agree that MS is heading in the right direction. The steps that they are taking are, in my opinion, great for everyone. I have been a Linux user for almost 10 years and Windows 10 might be the reason to switch back. We will see.

As for the driver issues, that is because Windows releases the software while the hardware is provided by several companies, and it is up to them not to screw up their drivers. Apple releases one laptop and that is it. They only have to cater for that device, which makes it a hell of a lot easier.

Windows 8 has made huge progress in every hardware aspect of the machine.


To me, as the user, whose fault it is could not matter less. I only care if the system works or not. In the event of a driver failure, knowing it's not MS's fault makes the failure no less impactful.


Do you think that Windows 8 made progress because your machine works, or for technical reasons? Because driver issues seem to be pretty common on Windows 8. And the OS offers little support in fixing those issues when Windows Update fails to get the drivers right. Nor does the OS warn you or gracefully degrade when things are not working properly. I would much prefer the approach taken by Chrome, which simply turns features off (like OpenGL) where support is lacking.


It absolutely is Microsoft's... well, not fault, exactly, but the way the ecosystem has evolved has been driven largely by choices they've made. If the ecosystem allows for buggy hardware driver releases to screw things up for customers, it's because MS made the choice, possibly by default, not to run an Apple-like ecosystem.

The reason I don't blame MS entirely for this is because it's clearly a trade-off. In return for the risk of unreliability, they got a vast increase in the number of hardware manufacturers willing to play in their ecosystem, precisely because MS can be fairly hands-off. This drives prices down, and increases the size of the market-place.

Where I do take issue with MS directly is that despite consciously being in the driving seat of an ecosystem like this for so long, it's taken them years to successfully isolate the user from buggy drivers. They control the OS, so they control the ability of a manufacturer's bug to affect the user. It's not a foregone conclusion that a screwed-up driver must inevitably lead to a bluescreen, or anything like it.


>They control the OS, so they control the ability of a manufacturer's bug to affect the user.

Ugh, kernel drivers will always have bugs and that cannot be isolated from the user all the time. As you stated, the decisions Microsoft took got them to their throne.

I've switched to OSX but I still use Windows 10 on my other laptop and Windows 8.1 on my gaming PC, and I haven't had a bluescreen since Vista. I literally haven't seen one for years. I was shocked to see it had changed appearance in a screenshot in an article about Windows 8. Meanwhile, the "upgrade" to Yosemite has completely destroyed my wifi stability, made the system sluggish (Safari even decides to stop smooth scrolling after a few hours), and boot time has pretty much doubled with the same number of apps installed (SSD too).


> Ugh, kernel drivers will always have bugs and that cannot be isolated from the user all the time.

And MS are in control of how much of a driver needs to be in the kernel. It's not a foregone conclusion that a driver bug cannot be isolated from the user. The techniques involved aren't new. Hell, I don't think they were new when Windows 95 was being written. Again, it's a trade-off; it's easier to write an OS where a driver lives by default in kernel space (certainly with reasonable performance), and where to draw that dividing line is a choice which Microsoft made. They've introduced user-mode drivers for some things, and this is a very good thing.

I don't have a dog in the OS X/Windows reliability pissing match, but part of the reason you won't have seen a bluescreen in years is because Windows 7 (apart from being more reliable from having more man-hours thrown at it, along with an updated driver model from Vista) changed the default behaviour: what would have caused a BSOD in Vista causes a reboot in 7, so you never actually see the error.


>I've switched to OSX but i still use Windows 10 on my other laptop and Windows 8.1 on my gaming PC and haven't had a bluescreen since Vista. I literally have never seen it for years.

I can reproducibly blue screen Windows 8.1 with the latest ATI drivers (as well as every driver for the last year and a half) by watching hardware accelerated Flash movies on an ATI HD 7770 GHz Edition. Swapping the card to a different ATI card fixes it. Swapping the card to an nVidia card fixes it. Swapping the 7770 GHz for another 7770 GHz does not fix it.

This only happens in Windows. It does not happen with the ATI drivers in Linux.


Microsoft were not initially in control of the ecosystem; it was IBM's PC cloned by Compaq and then a horde of others. At one point there were competing desktop environments - Desqview, OS/2.

They then faced the upgrade problem: at any point, restrictive control over drivers would mean that hardware that ran Windows N-1 would not run Windows N.

There is also the antitrust argument: control over the OEMs restricts the possibilities for competing software.

And there's a limit to what they can do. My latest driver/hardware bug was discovering that my Crucial MX100 has some horrible interaction with write caching and link power management such that it can vanish from underneath the operating system.


> Microsoft were not initially in control of the ecosystem; it was IBM's PC cloned by Compaq and then a horde of others. At one point there were competing desktop environments - Desqview, OS/2.

...

> There is also the antitrust argument: control over the OEMs restricts the possibilities for competing software.

Which of these arguments would you like to keep? You can't have both; or at least, they don't refer to overlapping time periods. And I'd take issue with either: in the first case, MS always had the option early on of carving a controlled niche rather than trying for wide appeal. That's exactly what early Apple did, and I'd argue that MS could have done precisely the same on PC hardware, if they'd chosen to. As regards the anti-trust argument, why don't we see that being levelled at Apple? An MS which chose to drive a more closed ecosystem probably wouldn't have been so utterly dominant as to run the risk of antitrust accusations, and again, that was their choice.

> They then faced the upgrade problem: at any point, restrictive control over drivers would mean that hardware that ran Windows N-1 would not run Windows N.

Again, that was their choice.

> And there's a limit to what they can do. My latest driver/hardware bug was discovering that my Crucial MX100 has some horrible interaction with write caching and link power management such that it can vanish from underneath the operating system.

I'd argue that this sort of problem affects either sort of ecosystem.


I've been running Win7 on a Dell Optiplex 980 for years, and have never had a BSOD. Except yesterday when I pressed Ctrl-Up arrow/Ctrl-Right/Ctrl-Down in Visual Studio. That was just asking for it :o)

(Ctrl-Up flips the screen).


I've been running Debian Wheezy Linux for years. A few minor issues in Word 2010 (it always opens on my top screen!) and playing GTA4. Never had a BSOD. :)


On the other hand, updating my Mac to Yosemite turned it into a slow stuttering mess - WindowServer memory leak forced me to reboot the Mac about every two hours or so or watch my desktop at < 1fps.

While Windows does have problems, the latest Apple software is mostly worse bug- and update-wise.


These kinds of issues are why I always suggest people do fresh installs of operating systems, and not upgrades.


I used to do that; then I realised how many pieces of software I have installed, tweaked, and adjusted to my preferences. Usually there is little benefit to an upgrade, so I just leave it as is (or use a rolling-release Linux distro).


I get similar memory leaks with WindowServer on a brand new MacBook Pro. The damn process is eating 4GB of RAM sometimes...


I did a fresh install from a seemingly healthy, semi-recent MBP and Yosemite's WindowServer seemed to begin a slow strangulation on my machine.

Solved it by buying a top-of-the-line machine, which I'm sure Apple is loving.


I discovered a weird interaction between f.lux and fullscreen gaming. They seemed to fight about which gamma etc to use.

It was nothing you could see, and other than a jerky in-game mouse you did not notice it. And after an hour of playing, dying, cmd+tabbing out to Safari to surf while waiting for respawn, WindowServer core dumped, logging out all users in an instant.

This is fixed in 10.10.3. The jerky mouse pointer problem still remains. I don't use f.lux anymore.


Gotta be a hardware issue of some variety. I'm running Yosemite on a MBPro Retina mid-2012 with zero issues at all.


Hmmmm, possibly my slow-as-could-be hard drive. The SSD in my newer machine really does make a HUGE difference, to state the obvious ;)

I have the feeling my old machine is using a bit of VRAM when WindowServer starts getting too heavy.


Updating my iMac to Yosemite fried the graphics card. It seems Apple is putting form way ahead of function, and as a result stuffing too hot components into too small space (the same iMac has previously fried the screen and the HDD to death).


And way back in time, Apple had even factory-disabled my second NVIDIA card on my MBP because it had created heating issues in tests. So, I installed gfx, or something, to get it working. And then, predictably, major major heating issues.

And it was easily solved! Apparently Apple (the rumor is Jobs ordered this, but who knows) didn't want the fans spinning too fast, which caused noise. So they governed the speed on those, too. Solved with SMC Fan Control.

I wasn't too happy about being somewhat deceived by their specs for my equipment, even if they thought their intentions were "good".


Thanks! I was just thinking I might take a chance, but at this point I'm actually thinking about reinstalling what came with the machine. These new operating systems are literally making me angry.


This comment just made me realize why most computers sold today wouldn't work in the vacuum of space. Even if we shielded them from radiation, they'd just fry to death.


Yeah, but they've been doing that for years. Remember all the people burning themselves with Macbooks?


Everyone has a different experience. I never had hardware problems with HP, Lenovo, or Dell notebooks. They could be built badly, something could break, but the notebook worked well. With my MacBook I had a broken HDD, which was replaced under warranty, but it took almost a month without my working tool. And 2 years later the sound just stopped working. Headphones work, but they're buggy. I don't think I can pay for the repair; I will probably tinker with it myself.

My iPhone 4S's WiFi broke just after the warranty expired. All my Mac and iPhone cables are in a terrible state.

I don't really think that Apple hardware is better than others'. Not worse, but not perfect either. The choice is between OS X and Windows, or iOS and Android. Otherwise there are many comparable machines.


> Everyone has different experience.

Yeah, I'm acutely reminded of this every time someone brings up Windows 7's predecessor. Apparently I'm the only person who ever lived on this planet who had absolutely zero problems with Windows Vista, and even liked it quite a bit.


Nah, aside from some minor annoyances that ended up getting polished in updates, I didn't have any issues with it either. When it came out I wasn't making much money, so I wasn't running top-of-the-line hardware by any means. Still, it ran OK on my PC at the time (Core 2 Duo, 1GB RAM, and some low-ish end nVidia GPU), and even my budget Newegg Special was better than the real dogs some people were buying at the time (or some of the older computers they were trying to upgrade).

Vista seemed like it was built to leverage the current hardware of the time rather than run nicely on weaker systems that companies were still selling because they could passably run WinXP (slow CPUs, 256-512MB RAM, integrated GPU). It made sense because both OSX and Linux were capable of GPU acceleration of desktop windowing, aggressive use of caching things in RAM, etc. but too many users were still purchasing the absolute cheapest thing they could find that would still boot.

Then you had all of the hardware issues caused by companies not bothering to write new drivers for old peripherals, and the software issues from applications that just assumed you were running as root/admin. I didn't have any old scanners or printers or anything that needed driver updates, and very few of my important applications hadn't been updated to work under the new UAC setup.

In many ways, I feel like most of Vista's problems came from 98SE and then XP being "good enough" for so long that a lot of companies and users weren't used to needing to choose software and hardware more carefully.


The biggest problem with Vista for most people was that it was sold on computers without enough RAM. We have a couple in our office (thankfully not used regularly) where it's 1 GB and the integrated GPU borrows a quarter of it for VRAM.

Windows Vista on 768 MB of memory is not a happy thing.


I can imagine it. I guess the experience is sort of like the one I had with a cheap Android smartphone, where the hardware was too weak to handle the OS itself (and if you tried to connect to the Internet, you basically had to pull the battery out because it went totally unresponsive).

I'm not sure why such things happen. Selling a product where the hardware is below minimum reasonable specs for the software it needs should be banned, or something.


I didn't have any problems with Vista either, and I feel your pain; most people I speak with seem to love to hate Vista.

I also have a similar feeling when somebody tells me that Windows 8 was only good for touch devices (I enjoyed it on an HP 8710w and didn't miss the touchscreen at all).


My impression of Windows 8 (for most use cases at least) has always been that it's essentially Win7 with the beginnings of some "transitional" stuff tacked on. Still, it's mostly unobtrusive unless you never moved away from the old "drill down through menus" method of launching programs from the Start menu. Vista introduced a search function (to finally catch up with OSX's Spotlight) that made finding and opening programs or files much faster. Then 7 combined the quick launch bar and the taskbar to allow pinning common apps to the bottom of your screen (more like the OSX dock). So between launching 90% of your programs from the taskbar and searching for the rest, the Start menu was mostly legacy...

...but with Win8 I've learned just how many people never stopped doing things the old way. Does a full-screen Start make sense? Not particularly. But if you're just hitting the Windows key and typing the name of what you're looking for, you only see it for a few seconds. And while you're looking at the search results, you aren't really working actively on what's in your other running apps. Otherwise, you never really see it if you don't want to use it.

I'm glad 10 has shrunk Start back down, because my minor complaint has always been that the full-screen Start was just a little "jarring", but that's about it. If nothing else, it will make the old-school Start menu die-hards (read: the faculty I deal with) happier, I guess.


> But if you're just hitting the Windows key and typing the name of what you're looking for, you only see it for a few seconds.

Which I'm probably not doing if I'm going from one program with a WIMP-optimized interface to another program with a WIMP-optimized interface; context-switching to keyboard-driven mode is a bigger UX loss than the start screen is there.


If you had a powerhouse computer it was fine. The midrange laptop I had at the time was not happy with it. It also broke compatibility with most of the games I had.


Make that two persons, I remember that it was quite ok on my machine.


Absolutely the only one issue I had with Vista was that system file copying went a little slow, and you could get 2x+ speedups by just using Total Commander instead. Apart from that, I had pretty much problem-free experience with Vista.


> Also, dat touchpad, nothing competes...

Trackpoint. Makes it very hard for me to switch from Thinkpads.

Also, I'm growing quite fond of having a touch screen and a pen for my laptop.


I've used MacBooks for years, but I still miss my Trackpoint. I cranked its speed and sensitivity up to max and trained myself to adapt to it. After that, I could do anything with just an imperceptible twitch with my fingers still on the home keys, thumb on one of three buttons. It's like the vim of mice. If I could get one as an option on my MBP, I'd do it in a second.


I can imagine, must be the opposite of actually starting to work with the Trackpoint. I tried, I really did, but I never got it to work for me.


It's funny you should say "Every time I update Windows, or even restart my windows laptop I hope that it'll still work afterwards. Updating has previously killed my dedicated graphics card, which turned out to be a driver issue. With Intel providing 2 nearly identical drivers one of which kills it."

That's funny because …

I just surrendered my MacBook Pro to an Apple Store for service yesterday because the external display mysteriously stopped working and the system won't fully boot. The on-site service tech was super competent but totally baffled and sent the machine away for Apple to replace components until things start working again.

And today Apple just released OS X 10.10.3 Supplemental Update which has description "The OS X Yosemite 10.10.3 Supplemental Update fixes a video driver issue that may prevent your Mac from starting up when running certain apps that capture video."

Hmmmmmmm.


If you have a late 2012 or early 2013 MBP you will know that it restarts every time you play a video from YouTube, because they had a faulty GPU and refused to acknowledge it. There are problems with everything, including Chromebooks, but Apple's focus on quality is making Microsoft think twice, and as a result it is lifting all boats.


Anecdotal evidence. Here's mine:

I never ever had any issues with any of my 3 desktop PCs or with my X200 in the last 5 years. With my recent ThinkPad X1 Carbon I had to boot into safe mode only once, after upgrading some Lenovo-provided driver, in almost 2 years of use.


Then again, upgrading has made the wifi on my mac mini go wonky, twice, and last week an upgrade made it stop recognizing the display's native resolution and downgraded it to 1024x768. Upgrades are always dangerous.


This is how every OSX/iOS developer feels when they update Xcode or an iOS SDK. And Apple forces you to update in most cases, so you can't avoid it. They are breaking stuff left and right.


Until an OS comes along that rivals OSX, I'll stick with Apple. I'm one of the few people I know who really doesn't put a whole lot of thought into hardware. Yes, I need my machine to run properly, but as long as there's a baseline level of performance I'm good.

Obviously I'd never be able to give up OSX for work purposes. Pretty much every piece of software I use is tied to OSX only.


What software are you using that is only on Mac? I'm surprised by this, as it seems most development software is available for all platforms these days.


This is from a dev standpoint but at my job we're using MySQL and I develop from Linux. When I want to browse or modify data with a GUI nothing quite matches OS X's Sequel Pro.

The best I've found is DBeaver (or SQuirrel). IntelliJ is working on a thing called 0xDBE, but it feels even worse.

I could easily rattle off a handful of other applications that just feel better, even if slightly, on OS X. DaisyDisk and Transmit come to mind right away.


I'm a dev that uses Windows at work for development and the Mac at home for development and at work for general use. I've been a Mac devotee since 1984.

For an IDE nothing beats the sheer joy of Visual Studio and ReSharper. I use NetBeans for Java (I prefer it over Eclipse and IntelliJ). For MySQL I use Sequel Pro on the Mac but prefer HeidiSQL on Windows. For Linux dev work I'll spin up a VM on Fusion on the Mac or VirtualBox on either.

My point? There's nothing you listed that requires you to use a Mac. I completely understand if you choose to, but there's no need to discount Windows in the process...it's a great development environment for anything but Swift and Objective-C.


Try MySQL Workbench - this is what I miss the most after switching to Postgres


> When I want to browse or modify data with a GUI nothing quite matches OS X's Sequel Pro.

I've moved the opposite way (Windows => OS X) and have used SQLYog Ultimate Edition [0] on Windows and like it a lot -- but find Sequel Pro underpowered and slower to use, and wish there was a better alternative.

0. https://www.webyog.com/product/sqlyog


I've never used Sequel Pro so I can't compare them, but I've been very happy with SQLyog, when I was working with MySQL.


If you ever need to use Postgres I recommend using the gui http://www.psequel.com/ which is a pretty decent port of Sequel Pro to the PG world.


SQLToad is unmatched on Windows IMO.


- Xcode

- Sketch

- Pixate

- Framer Studio

- iTerm

- Fluid

- Keynote

- Pages

- QuickCast

- CloudApp

Granted, I haven't looked into a lot of these to see if they've picked up Linux or Windows support. However, I doubt all of the ones I've listed are on Linux, which I'd need for sure to do any development work.

Some of these I'm sure have alternatives, but the one app I just can't do without is Sketch. Gimp doesn't compare, and wine + Photoshop hackery on Linux is a nightmare.


There's a pretty neat vector graphics app for Linux called Inkscape. I used it for some time when I worked on Linux and got used to it. Of course, I was happy when I could go back to Adobe Illustrator, but for a free app Inkscape is pretty great (and it's the only usable graphics app on Linux)


Side note, but I'm not sure why anyone would bother with wine + hackery these days. VirtualBox works perfectly fine for running any Windows apps you "can't do without". Throwing a few gigs of RAM at a Windows VM to run these apps is not really an issue if you have a modern rig.


A VM is the more reliable approach for powerful machines, but for simple use cases Wine is fine:

- Only for a few apps, like Evernote.

- Laptops on battery.

For Visual Studio I'd bet on VMs over Wine, but then I probably wouldn't be using Linux.


By "modern rig" are you including laptops? I've always been deathly afraid of running VMs on battery power; has this changed recently?


Nothing much has changed in the past couple of years. Wine is still a good go to option if your Windows application will work with it correctly.


Are you sure? Saying Xcode is only available for Mac is like saying "Visual Studio is only available for Windows, so I'll stick with Windows". I thought you'd come up with some movie maker stuff which might not be as good on Windows.

I used Xcode a few years back and it was a nightmare with the limited things it allowed me to do when compared to Visual Studio.


Really? You're going to pick out Xcode out of the whole list? I even clearly stated what the most important app I use is and it wasn't Xcode.

...however, if you're developing iOS and OSX apps you need Xcode. So I'm not sure what your point is.


Xcode is a pretty good choice for writing Mac applications. I'm not aware of many accepted alternatives.


There really is only one, AppCode, and it's not really an alternative, because usually you end up using both.


My point is that you can really get out of these forced incompatibilities if you want. There are multiple IDEs to develop the same thing on Windows, but of course Apple does not want that. That's why the author of the article had to stop developing for them; in my very personal opinion, everyone should do that, in the hope that it will force Apple to strive for a common/universal development language.


Visual Studio and Microsoft Project mainly. But then Windows users don't get XCode.


One thought I had about MS as a development machine a while back:

If you want your Unix toolchains for development, you could still have them. The Surface Pro is powerful enough to run Linux in Vagrant or another virtualization engine. So you could just run a virtual dev environment. Virtual environments are sometimes better anyway, since you can replicate your operational infrastructure.
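A minimal Vagrantfile is about all it takes (a sketch; the box name and memory size are just examples):

  # Vagrantfile: Ubuntu dev VM; the current directory is shared at /vagrant
  Vagrant.configure("2") do |config|
    config.vm.box = "ubuntu/trusty64"
    config.vm.provider "virtualbox" do |vb|
      vb.memory = 2048
    end
  end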

Windows 10 is looking good. If it gets really, really good they might close the gap with Apple for professional users. Their glasnost/perestroika trend of opening up and embracing the dev community isn't hurting their reputation either. I really think Satya Nadella is turning the company around.


I really hate using WinSCP for file transfer between Windows and the guest OS. If Windows shipped with a native sshfs-alike, I'd probably be ok, but I really prefer native Ubuntu development.


https://chocolatey.org/

  choco install git.install /GitAndUnixToolsOnPath
  choco upgrade git.install

It is a beautiful world, and my slashes work both ways!


There are lots of alternative solutions to share file systems between a VM and host ... is that all you're trying to solve? VirtualBox has it built in.


No, I'm also trying to share between my laptop and a remote server. (actually, I very rarely use VMs hosted locally)


I think you confused everyone with your terminology in your parent comment - "guest OS" is VM speak.


I use a samba share; it works a treat, and in the 3-4 years I've been running it I've never had much maintenance to do on it outside of the occasional update to the OS.
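The relevant share definition in smb.conf is tiny (a sketch; the path and user are just examples):

  [projects]
      path = /home/me/projects
      read only = no
      valid users = me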


Using (boot2)docker/docker-machine+virtualbox and mapping a volume to the host OS (which is trivial) might be viable.

I use it on OS X to do linux-only stuff (such as loop-mounting an ext4 image).
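e.g. something like this (a sketch; the image and paths are just examples, and the loop-mounting bit additionally needs --privileged):

  # share ~/src from the host at /src inside the container
  docker run -it -v ~/src:/src ubuntu:14.04 /bin/bash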


I don't have a click&go solution because I don't have this problem, but you could certainly serve a Samba share out of the guest OS or something.


What's wrong with VirtualBox shared folders? Also, Windows-native sshfs-alike is SMB.
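e.g. mapping a share to a drive letter from the Windows side (the host and share names here are made up):

  net use Z: \\192.168.56.101\projects /persistent:yes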


You can map drives from a client over Remote Desktop if that's at all helpful.


I use cygwin for ssh/scp, even if I don't need cygwin for anything else


Set up samba? (That's how I work.) What about NFS?


I run Fedora 21 w/ full screen Gnome 3 in VirtualBox, so I can use Linux desktop tools and no SCP is needed.

The touch screen even works with Gnome. And RDPing into my Surface from my desktop to use the Linux guest has been the best-performing Windows-to-Linux remote control I've used for the amount of setup time. On a gigabit network, it's really nice.


Absolutely!

I usually buy Windows machines because of price/performance, and, to be honest, because I know how to make Windows work for me and don't want to learn Mac idioms.

However, the first thing I do is install git for windows, which gives me bash to navigate the filesystem, and then vagrant/virtualbox/ubuntu for development.
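The whole dev environment then boils down to a couple of commands (box name is just an example):

  vagrant init ubuntu/trusty64   # writes a Vagrantfile
  vagrant up                     # boots the VM
  vagrant ssh                    # shell into it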


I highly recommend http://babun.github.io/ for Windows based development. It's a well configured cygwin install with many standard addons included. Helps me keep my sanity while switching between Linux, Windows and OSX.


Have you given PowerShell a try? If so, how did you feel about it compared to using bash on Windows?


I don't think it is really possible to 'give Powershell a try'. PS is extremely powerful, but there's a lot to learn. It is not a drop-in replacement for bash at all because it's so different.


That's exactly what I tend to do on a Windows machine, and I don't miss Linux at all. It feels like having a VPS (which it really is) ready at all times with awesome bandwidth.


That being said, I just returned a Surface Pro 3 for a tablet. The display is nice, the feel is great, but omg, the resolution and font issues. Forget it; you consistently feel how much you have to deal with the OS itself to get it out of your way. Until tablet PCs are ready I still have a desktop Windows box and a Mac mini and my tablet.

I'm also spoiled by the App Store concept. Windows needs to force it on developers. One-tap installs on Android and great dependency tools like Homebrew or Leiningen have made me never want to download stuff manually ever again.


The day Microsoft forces the app store on me is the day I stop developing for Windows.


I'm not really a fanboy, so maybe my expectations are lower. I think Apple are doing a good job. I admire what I think of as Apple's unique quality: the product/marketing/tech strategy.

Apple insist on charging a premium. In addition they prefer to play the above median range of the market. Apart from first-to-market "freebies" they don't play for dominant market share. They insist on that premium, at the expense of market share and other trade-offs.

They don't play for checkbox ticking. They don't adjust to appeal to mobile service companies, IT purchasing departments, or other buyers that make decisions about what other people will use. I.e., it's not corporate sales departments dictating strategy based on short-term sales.

In order to get their premium they make stuff appealing enough to a subset of people that they are willing to spend their own money, pay that premium, and get the thing that they decided they want for themselves. To make a stretched, contrived, and over-complimentary analogy: it's like a politician who wants to convince the voting public to join him in his position vs. a politician who checks the polls to find out what his positions are.

Like I said, I don't have that much loyalty to Apple. I don't buy iPhones because Androids at half the price are pretty good. I'd be interested if Apple put out a laptop with a sub-€500 price, but I'm not cranky that they don't have one. When they do happen to have a product that I buy for whatever reason, I'm usually on the fence mostly because of price, but fairly well impressed if I do buy it.

But… I suspect the €1000+ laptop market is going to keep shrinking, and it's getting harder to justify buying a MacBook. I still expect to get years out of the one I have, but I wouldn't be surprised if I opt for a €200 something when the time comes. Unless… Apple happens to be making something I'm willing to pay extra for at the time.


> I suspect the €1000+ laptop market is going to keep shrinking

Possibly, but the high-end laptop market is eating into the workstation market from the bottom, now that 32GB of RAM and decent Quadro cards are common in laptops. At work, basically all our CAD and 3D modelling people use laptops, whereas a few years ago they all had workstations. The same seems to be happening with video and music production. More and more of the gamers I know are opting for laptops over desktop computers, too.


However, a MacBook Pro is not a viable product compared with a ThinkPad W540 in terms of features/price for 3D-related work.

It's more expensive, with a less powerful GPU.


Oh, agreed. Apple has no presence in this market. My point was about the market for expensive laptops in general.


As a gamer, that is one of the reasons I never bought a Mac laptop, even though I was a big NeXTStep fan.

My home has been desktop-free since 2003, and as you say, Apple does not serve that market.


Why not use your Linux laptop more? About 6 months ago, I installed Linux Mint on my Lenovo laptop, and I have barely looked at my MBP since.


I think it's called Victory Disease. When you think you have been winning all the time and that it will continue, your way of thinking becomes polluted: less clear, biased, slow, etc.


>Microsoft seems to be heading in a good direction, Apple less so. Things like living in a web browser and a platform neutral IDE (I use IntelliJ for Clojure, Java, Python, PHP, and Ruby) make OS platform less impactful on using a computer. I am looking at Surface 3 Pros and Surface 3s, price and feature wise, vs. Apple products and I find Apple losing ground.

To keep this on the rails, I have to wonder exactly how much Apple has paid out to developers on the iOS platform vs. Microsoft on their mobile platform. The very first reason the author of the linked article cited for his decision to pull out was competition, which, combined with having the largest payouts by a large margin, speaks to the health of the platform imho. Native vs. web apps has been debated endlessly, but the doomsday scenario for Apple's approach, which has been trumpeted for many years now, has conclusively not come to pass. I suppose if you extend the time horizon out to some Nth degree that may change, but what has Apple been doing in the meantime? Making a ton of money and paying a ton of money to developers.


Interestingly, the increasing role of the web and other platform-neutral software has pushed me toward Apple products. I now care much less about the massive amount of software written only for Windows, so I'd rather just buy the Apple product that I know is going to be a reasonably solid experience from purchasing, to updating, to customer support if necessary, to even selling when I upgrade.


My Surface Pro 3 is my go-to, take-with-me device. I always know I have everything I need, whereas I obviously don't with my iPad. Hopefully the next Surface Pro will have longer battery life; it is still a few hours short.

I recently started using the Chocolatey package manager for Windows. Strange name, but it works great and makes setting up a new machine and maintaining utilities much simpler.


Chocolatey + Boxstarter = ./build.sh

To build my work environment on a brand new install all I have to do is Start -> Run -> https://boxstarter.org/package/nr/url?https://raw.githubuser... as administrator.

See the following repo for more information https://github.com/ghuntley/dotfiles-windows/blob/master/des...


> Microsoft seems to be heading in a good direction, Apple less so.

Just have a look at the enormous amount of open research available at http://research.microsoft.com and try to find the equivalent from Apple. That tells me enough.


Nothing new there. Microsoft has always been very open about their research, and Apple hasn't. Apple prefers to talk about products they will actually ship vs. things that might happen at some point in the future.


Apple simply isn't doing the research, at all. You can see the results in things like Swift (and Swift's insane bugginess).


I wonder what self-respecting scientist would want to work in a company like Apple where publication of research results is prohibited?


I was afraid of what might happen to Apple after Steve Jobs's death. He said Tim wasn't a gadget guy. I am trying to give Apple a lot of grieving room, but enough is enough. I'm particularly irritated because I'm having OS X and iOS problems simultaneously. I'm actually so impressed with Microsoft lately that I'm thinking of putting all my meager savings into their stock. This is coming from a guy who literally thought Microsoft was evil.


For general consumers, Microsoft is catching up. But Apple never really gained marketshare there because of its high prices.

For professionals (especially software developers), I don't think so. OS X is still far superior to Windows, and the apps on OS X make it worthwhile.


Well, for software development the only problem with Windows is that it isn't Unix, and a lot of the web dev world lives on Unix. In my experience, Windows 7 is much more stable than, for example, Yosemite, though I would really prefer to use some Linux distro.

OS X just seems to have a lot of problems with very basic stuff, like beachballing on various system dialogs, failing with external displays, etc.

The hardware is nice; the trackpads especially are way better than the stuff Lenovo, for example, puts out. Though I have to admit that I really don't like aluminium as a laptop material.


I think the representation on Hacker News is somewhat skewed towards OS X developers. I'd be surprised to learn that OS X has ever outstripped Windows or Linux as the platform people develop on. Certainly not as a development target, as there is a vast army of developers out there who target embedded systems, set-top boxes, microcontrollers, industrial platforms, etc.


What surprises me is the sheer number of professionals that buy a MacBook Pro and install Windows on it.


They just want the hardware. They do not want to learn how to use a different operating system.


I'm not talking about marketshare. I'm talking about superiority.

I have been using Windows religiously for 15 years. I moved to OS X last August and it's way "too" superior to Windows/Linux.


> I'm talking about superiority.

You misspelled 'personal preference' there.


I work for a medium-sized technology organisation (about 400 people), and lately you hear more and more developers talking about moving away from Mac. Most are going to Linux; some are picking up Windows. I'd say it's about an 80-20 split.

The thing I really notice is how everybody with a Mac has a machine that is newer than two years. Those who are hitting the two-year mark are experiencing hardware issues, particularly with the lighting needed for external monitors, but it seems trackpad longevity is also an issue.

For most of us, the apps on OS X are also available on other platforms. The only thing I can't get on Windows that Mac devs get is Sketch, but the designers post on InVision anyway, so it isn't a huge loss.


I have a laptop from mid-2009 that's still good (it's a white MacBook). The only problem I've had has been the battery bulging, and Apple replaced them under AppleCare with no problems. The last one was replaced out of warranty at no charge. It's slow, but everything still works. I used it as my main machine until late 2012 (when I bought an i7 Mac Mini), and it was my portable machine until late 2013. My Mac Mini has been pretty heavily used since 2012.

Everyone in my department at the university who switched to a Mac has had good longevity, most lasting about 4-5 years until they decide they want a new machine (but the Mac still works). I had two Mac Minis in the lab: one a G4 Mac Mini from 2002, and the other a first-gen Intel Mac Mini (it came with 10.4; I don't recall the year). I used the Intel Mac Mini until I graduated in 2013 and had no problems at all.

So, anecdotes are just that. My department has consistently been moving to OS X. Probably about half the professors have switched, up from just one professor who used it in 2007. Even my supervisor didn't switch until about 2010, and he's still using the same iMac he bought then. The labs mostly use Windows because the data acquisition software only runs on it; they tend to have specialized equipment whose vendors only wrote Windows drivers.


> I have a laptop from mid-2009 that's still good ... was my portable machine until late 2013.

I don't think this should count as a counterexample. You switched laptops in 2013, and while there could be many valid reasons for that (battery life, more power for development, your old laptop not supporting a new OS), it made your old laptop inadequate -- and hence the switch.


The only reason why I switched laptops was because I got a new one as part of my work. I'd still be using it otherwise.


> The thing I really notice is how everybody with a Mac has a machine that is newer than two years. Those who are hitting the two-year mark are experiencing hardware issues, particularly with the lighting needed for external monitors, but it seems trackpad longevity is also an issue.

Having worked at a company with 50+ MacBooks used for 3-4 years, it was really uncommon to see hardware failures besides batteries or power supplies. So I guess we can't generalize from one experience.


>The thing I really notice is how everybody with a Mac has a machine that is newer than two years. Those who are hitting the two-year mark are experiencing hardware issues, particularly with the lighting needed for external monitors, but it seems trackpad longevity is also an issue.

Yeah, no. I have a 10-year-old Mac (damn, 15-year-old) that still works perfectly, as well as 7- and 4-year-old ones. Not to mention that Macs, in general, retain their aftersale value better, so the claim that they commonly "experience hardware issues" does not hold up.

The reason some people get newer machines every couple of years or so is that the Mac ecosystem favors newer anyway -- you get quite updated specs every two iterations (usually the first iteration is a speed-bump upgrade), and sometimes things only work with the newer models (e.g. a feature that needs the later Bluetooth chipset and won't work on the previous Bluetooth generation, etc.).


I still have my mid-2009 MacBook Pro. Best machine ever, and I'm still using it. I replaced the optical drive with an SSD and the computer's overall performance improved a lot. The only issue is that the battery is too old and only lasts 2-3 hours.


I dunno; after running an MBP for almost five years, IMO Windows software is better. Most of it's not very pretty, but in terms of pound-for-pound productivity shortcuts and features in general, Windows designers/programmers seem to favor productivity over cheesy gimmicks (or this contemptible trend of data sparsity).


After I became more focused in the Java and .NET world, the OS started to matter less to me.

Same would apply to most modern programming stacks that come with batteries.

I used to like Windows before getting into the UNIX world, as it was a huge improvement over plain *-DOS for someone with Amiga envy.

Going back wasn't a big deal, especially given the office's Windows requirements.

I have enjoyed playing with Macs since the LC days at university, but I was always put off by the hardware combination offered at their price range.

For me the GPU/CPU combo is quite important.


> I think it is good to look at what the general public will be using.

The general public really isn't using Office 365 or Surface 3 though.


I've not heard of people running IntelliJ on a Surface. I'd be curious to know how well that works.


I often run Visual Studio and WebStorm/PhpStorm (essentially built on IntelliJ, I think?) on a Surface Pro 3. Mine's an i7 with 8GB RAM. More powerful than the 13" Ultrabook I replaced with it, and much more powerful than the MacBook I was using before the Ultrabook.


I have felt no pain on my i5 Surface Pro 3. It outperforms the decent rig my office provides. I run multiple instances of VS, leave stuff open all the time, massive Chrome tab abuse... The fans only come on while streaming video AND compiling, etc. It doesn't hiccup or grunt.

I wasn't 100% sold on the price/value ratio when I bought it, but now am totally happy I did. Great machine.


With the keyboard, it is just a Windows machine - it should be fine, but thanks: I will do some research.


Running Android Studio and VS2015 side by side with no hardware problems (development-wise) on my Surface Pro 2. (Being honest, Chrome's significantly more of a resource hog than the two combined).

My only problem was an incompatibility between Intel HAXM and a specific virtualisation technology the Surface uses to reduce boot time.

There's also a problem with bluetooth lag/packet loss when maxing out the WiFi.

I'm running an Apple Bluetooth keyboard and generic bluetooth mouse when I work from home, Type cover on the go. The Type cover is terrible for the price, but it's an OK keyboard. The touchpad leaves much to be desired.


I developed on Android with IntelliJ IDEA and an awesome Surface Pro 2 for almost a year, until a move to a new job made me switch to a MacBook Air. Amazing, amazing little thing.


I have some complaints on the Surface Pro 2, but mostly related to the TypeCover keyboard. Haven't played with SP3.


Sorry, I've gotta say: you know that things like "SP3" in Microsoft-land typically mean "Service Pack", right? It's so easy to get confused in acronym-city already, shouldn't we avoid creating more? :(


Microsoft is heading in a good direction. I do like the Surface 3 and Surface Pro 3. But the battery life of Windows laptops (including the Surface) is still not good enough. We have to admit Apple does well on MacBook battery life.


If you're looking for battery life, it'd be worth looking into the ThinkPad X250. It claims a 20-hour charge, and real-world tests [0] stand by that claim.

[0] http://www.wsj.com/articles/the-two-day-laptop-battery-is-he...


Just out of curiosity: is using IntelliJ on the HP Stream a good experience? I'm considering a similar purchase (Chromebook C720P) but am afraid of performance issues...


My gripe with Apple is all these operating system upgrades in iOS and OS X. I understand the need to change, and security is of utmost importance, but don't alienate your customer base with OSes that need hours of tweaking, after which you still have programs that don't work and missing files. I can't imagine Steve Jobs would be running the ship this way. Are their products still the best? Yes, but the margin is closing in my little world.


> Office 365 is such a good deal for $100/year compared to iCloud

o_O

Was the whole post in a sarcastic tone, or just this part? Serious question.


Office 365 for $100/year is the 5-license pack; that gives you 5 accounts, each with 10TB of OneDrive storage, plus desktop Office licenses and all the Office Online stuff.

You can get the 5-pack on Amazon for $70. I personally think that's way better than iCloud, although I mainly bought it for storage, because if you need more than 1TB it becomes by far the cheapest available option.
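
A quick back-of-the-envelope comparison; this is a sketch using the figures above, and the iCloud price is my assumption of the then-current $19.99/month 1TB tier:

    # Rough yearly cost comparison, using the commenter's figures plus
    # one assumption: iCloud's 1TB tier at $19.99/month.
    o365_per_year = 70.0                # 5-license pack via Amazon
    icloud_1tb_per_year = 19.99 * 12    # ~$239.88/year for a single 1TB
    print(icloud_1tb_per_year / o365_per_year)  # ~3.4x the price for less storage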

Not sure how the extra services from iCloud and Office 365 compare (I don't use either all that much), but both seem kinda equal, and lacking compared to Google or Dropbox.

So I'd certainly call that a nice deal.


It is fascinating how much the "micro-transaction" model on the iOS platform continues to bury everything else.

Spiderweb throwing in the towel, I would guess, has a lot more to do with their #1 point, "Competition on the App Store has risen to a frenzied level," than the others.

The top two grossing games - Clash of Clans and Game of War - receive Metacritic scores of 74 and 67, respectively.

Whereas the most recent Spiderweb release I can find - Avernum 2: Crystal Souls - sits at 81.

The organic discovery issue is only a small component of the problem when you look at the sheer amount of money that the micro guys can afford to dump into the mobile-advertising economy.

And as anybody who's spent 5 mins on app-store "optimization" knows - the better your paid, the better your organic - by a right mile.

I hate ending comments like this without a solution but the best context we have is watching Zynga get eroded / replaced on FB at some point.

But their death blow was the rise of FB's platform selling ads to the mobile OS guys mentioned above who essentially just out-zynga'd Zynga.

When you create an app economy that provides outsized rewards to the vultures, of course they'll control the board. I wonder where this winds up.

(1) http://www.metacritic.com/game/ios/clash-of-clans, http://www.metacritic.com/game/ios/game-of-war---fire-age, http://www.metacritic.com/game/pc/avernum-2-crystal-souls


> And as anybody who's spent 5 mins on app-store "optimization" knows - the better your paid, the better your organic - by a right mile.

Sorry, I guess my English is bad. I do not understand this sentence. Could you explain?


I think a few people missed the mark here (in the sub-comments). The point is, if you already have paying customers, you will get more. Users looking to download new apps discover them on the Top Paid charts more than anywhere else. The exception is getting featured, but that is a crapshoot. In other words, social proof is the strongest indicator of the long-term viability of revenue for iOS apps, and it's hard to game.

So (as others have rightfully pointed out) many app developers solve this issue in their early days by using paid ads... mainly FB ads, as their conversion-rate analysis is so ridiculously well engineered. Which means you can theoretically turn off paid search/discovery and rely on your high position on the Top Paid charts to drive you even more traffic. (Note: many of them do not turn off paid acquisition channels, as those are still highly lucrative anyway.)


> the better your paid, the better your organic - by a right mile

I interpreted it as meaning the higher your app is ranked in the 'paid' category (preferably in the top 20 or higher), the better your organic acquisition rate is going to be (logically, because you'll get all the "free" advertising that comes from being featured on the front page of the "Top Paid" list).

It's a self-reinforcing cycle...more customers, higher ranking, more customers...if you can swing it.
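
As a toy model of that loop (all numbers invented, purely to show the compounding shape):

    # Toy model: paid ads buy installs, installs lift chart rank,
    # rank drives organic installs. Every coefficient is made up.
    paid_per_day = 1000
    organic_per_rank_point = 50
    installs, rank = 0, 0
    for day in range(7):
        organic = rank * organic_per_rank_point
        installs += paid_per_day + organic
        rank = min(100, installs // 500)  # crude: more installs -> higher rank
        print(day, installs, rank)
    # Organic volume grows each day even though paid spend stays flat.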


He's saying that the more money you put into paid ads and the more conversions you pay for, the more organic (non-paid: referral, from press, or search, etc.) conversions you'll get.


Yeah, that was tricky. I think he means "the better your paid [search results], the better your organic [search results]." Meaning you can buy improved organic search results.


It means the better your paid install rate (via advertising), the greater your organic installs will be as a result (usually due to your app spamming enough unwise people on Twitter and Facebook).

This is a natural result of paid advertising driving installs which the app store uses to rank games in their categories or create recommendations for their landing pages.

Source: I worked at a f2p mobile company.


paid advertising turns into far better organic growth by a very large margin.


> It is fascinating how much the "micro-transaction" model on the iOS platform continues to bury everything else.

It's not only iOS; it's exactly the same thing on Android. Some companies also tried it on Vita and PS3, with games like Rainbow Moon and Destiny of Spirits, but ultimately failed. Interestingly, I've also been able to get people to completely drop free-to-pay games by getting them other hardware like a PS Vita or Nintendo 3DS.

So basically the only platforms where this happens are the two primary cellphone platforms.


Are we pretending PC isn't a platform?


As far as I know, free-to-pay stuff never worked on PC even remotely as well as on cellphones, because on average PC gamers are more experienced and recognize more easily when a skill-based difficulty curve morphs into a payment-based difficulty curve. So I didn't think it worth mentioning.


League of Legends, TF2, Minecraft, Dota 2, Hearthstone, World of Tanks (etc) are all wildly successful F2P PC titles.


How is Minecraft free2play or free2pay?

Also, afaik LoL, TF2 and Dota 2 sell content as microtransactions, not consumable cheats. I don't know if the same is true for Hearthstone, and I know this is true for WoT, but its lunch is being eaten by War Thunder, which also only sells content.


Hearthstone doesn't sell anything that you can't get by playing the game longer. It's a card game; they sell card packs. You can earn them without spending, but it takes a while, and your ability to do so depends on how good you are.

I wouldn't say it's pay-to-win, but players aren't on an equal footing from day 1 like in Dota 2.


I'd love to hear the substantial difference between Clash of Clans selling you gems that make your buildings finish faster and LoL selling you XP boosts.


Gladly.

First off, I need to clarify how the two work, since they work substantially differently from one another:

LoL boosts come in two variants, but the net effect is the same: XP income per time unit played is doubled. You can spend more money to have them apply to more time units played, but you can't increase the effects of the boosts. They do nothing else. (I assume rising in level also pushes one into higher-level matches against stronger players.)

Meanwhile, CoC gems do a multitude of things: they permit access to items that are otherwise inaccessible and increase competitiveness. They are very powerful multipliers, allowing construction times (which I assume span days at the extreme) to be entirely removed. Lastly, they allow the player to... play more, by restoring heroes.

So, the practical effects are:

LoL XP boosts do not increase the competitiveness of the player. In fact, they can detrimentally affect it by pushing a player into high-level matches before they have earned the necessary experience to function well in them. Further, players are actively discouraged from spending too much on them, since they may be stockpiled but not increased in power, and past a certain amount of stockpiling it's senseless to buy more.

CoC gems increase the competitiveness of players massively, in some cases (build times) applying literally infinitely large multipliers compared to the 2x multiplier of LoL XP boosts. Further, they also serve to lock the player out of playing the game unless the player spends real money to buy and spend them on healing their hero.

TL;DR:

LoL xp boosts are fixed subscription fees with a maximum cap.

CoC gems are literally consumable cheat codes designed to encourage unlimited spending.


LoL has a level cap, so after a while XP boosts are pointless. What they really sell is cosmetics and rune pages.


That sure is a lot of words trying to disguise the fundamental truth that there is basically no difference between the effects of LoL boosts and gems. Both advance your summoner/base in exchange for money. They are both time "cheats" that you pay for to avoid a grind in each game.

You say flatly that 'LoL boosts do not increase competitiveness,' but this is frankly completely false. An identical clone of myself playing on an account after $300 worth of XP and IP boosts will simply be much stronger than I am with an f2p account. Eventually I may catch up, but 'eventually' happens in Clash of Clans too.


LoL is capped at level 30. With a boost, it will take you 1 week instead of 2 to reach level 30. It doesn't have any in-game effect. Furthermore, boosts are not an item that you are trained to use as a currency.

The South Park episode about freemium games is very good at explaining how the freemium market model is based on addiction.


> It doesn't have any in-game effect.

Right, XP/IP boosts are just hats. New characters you buy are just hats. No in-game effect, lol.


XP boosts have no use after level 30, which is reached quickly and is where the real game begins.

IP boosts allow you to buy runes more quickly; this happens outside the game. Runes have in-game effects, but the difference between T2 runes (which are extremely easy to buy without spending a dime) and T3 is minor.

I played LoL for years without spending a single penny on it, and I was never under the impression that the game was purposely made boring to force me to buy in-game items.

Freemium games feel that way. At first, everything is fun, and they even give you some 'free' currency to hook you on spending; after a few hours, if you don't pay, the game becomes boring as fuck, because everything takes hours.


Basically, XP boosts are there for returning players: experienced players who want to create an alternate account, or those who move to another server.

They are there for convenience's sake, but Riot certainly goes out of its way to make creating a viable alternate account rather inconvenient. Being F2P, I guess they have to have diverse revenue sources.

All this talk about "cheats" or "increasing competitiveness" is just nonsense.


I may be wrong in stating that XP boosts do not affect competitiveness, but the main point still stands: you only get to advance twice as fast with your $300 in LoL, compared to CoC, where you can blow all of your $300 in a single day, or even an hour. There are several orders of magnitude between the two "eventuals" here. Additionally, not having XP boosts in LoL does not block you from playing, which CoC does. And lastly, not buying boosts does not lock any content away from you.

Then again, you haven't discussed in good faith from the get-go (which is funny, since I primarily game on PC), so I expect you don't actually want to grok these differences.


I have fun with low XP (the game I'm playing is still there to play). I don't have fun watching buildings build themselves.


The difference here is f2p vs. p2w; p2w means pay-to-win. There are many p2w FPS-type games where you can buy weapons that give you an edge over those without them.

In some cases, you can 'unlock' these weapons by earning in-game credits, but often the amount of grinding required equates to months of game time.

In Dota 2, LoL, and TF2, the microtransactions are generally restricted to cosmetic/vanity items. These items offer zero competitive advantage.


Did you note that you said "free-to-pay" in both comments? Nice typo; got me a smile.


That's intentional. :)

There are 2 classes of f2p games:

free2play games sell game content in small units: characters to use, classes and equipment to unlock, etc. They also make it possible to gain all of those things in a reasonable manner by simply playing.

free2pay games intentionally abuse the human psyche and subtly push people towards buying things that are, at their core, consumable cheat codes, and which are not reasonably available through normal gameplay. (They also tune the difficulty curve so this only becomes visible after the player has invested a lot of time into the game.)


I don't like the distinction between things you can get by playing and things you can't. In many games the correlation goes the other way: unlocking a character provides no benefit, while skipping grind provides benefit.

Especially when it comes to levels; there's nothing at all manipulative about selling level packs.

The more important distinction is between "unlocks more game" and "consumable boost". Keep the message focused.


Wow, I never heard of such a thing; really interesting.

Also, searching the web, I got no relevant results. Is this some term you came up with, or is there more info somewhere?


It's a slang term that I and other people I see online like to use. It's never gone mainstream, since understanding the meaning behind it requires actually putting in some time to study the psychological effects.


It’s more usually referred to as pay2win


Spiderweb can't compete with the likes of Supercell because their games are niche.

It's an old-school 2D party-based RPG; it looks dated, and they recycle the engine year after year to release more games, as their loyal fanbase is more interested in the new storylines and quests than in gameplay updates.

I love Spiderweb games, but you could spend millions marketing them and they would still be niche. I don't think the discovery issue is the only challenge here; casual management games are the Call of Duty of mobile gaming.


Recycling? More than you might think! The iOS version of Avernum 2 was a port of the desktop version released a few months earlier, which was a remake of the original version of Avernum 2 from around 2001, which was itself a remake of the 1996 "Exile II". The graphics and game engine changed a bit, but the storyline was essentially identical.


Not only are they niche, they are also pretty distinctly not Skinner boxes with a nice display. The games at the top often are.


IAPs have destroyed the gaming market on iOS. Funnily enough, there's a lot of money being thrown around and a few companies have become rich, but if you're looking for quality it's really, really difficult to find something worthwhile.

I was hoping that iOS could become a premium platform for games, with titles like Broken Sword, The Orient Express, and others. There are a few gems here and there, but most of it is IAP-infested pay-to-win. Even though I could in theory play those few good games, I've decided to give up on the iPad, as ultimately those companies won't be able to support themselves, and I expect fewer and fewer quality games.


I genuinely hate how Apple has pushed microtransactions as the only viable route for app/game demos on their platform. I want to try out an app or game for free, and then decide whether I want to purchase the full thing or not. This is how games have always worked, and this is how Steam / PS3 / Xbox work.

The infrastructure Apple provides, however, forces demos to live in the same category as microtransaction games, and as a result the best games either get lost in the sea of free stuff or completely forgo a demo and just list in the paid section. Quality content doesn't stand a chance.


Apple hasn't pushed this. If anything, they are pushing quality paid titles and trying to give them lots of visibility. The market, however, has spoken.


> Apple hasn't pushed this.

Of course they have; otherwise they would have long ago provided ways for paid apps to offer demos. There are lots of quality apps that would be rescued from the freemium category if Apple provided this as a distribution feature.


>otherwise they would have long ago provided ways for paid apps to offer demos.

Like a Lite version and a Pro version of an app? Yeah, why doesn't that exist? Oh wait, it does. The devs just choose to make a free app and then offer the pro version as an in-app purchase.

But I guess the devs doing it that way is also Apple's fault. Somehow everything is Apple's fault according to you people.


No, the goal is to have only _one_ version of an app, with a time limit or a limit on the number of times it can launch, etc., which Apple doesn't have any provision for. IAP for content is the next best thing; otherwise you have to maintain two separate apps, which is objectively terrible.
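
The gating logic itself is trivial to write; what's missing is any sanctioned, platform-level hook for it. A platform-neutral sketch of the launch-limit idea being described (all names and the storage path are hypothetical):

    import json
    import os

    # Hypothetical launch-count trial gate; a real iOS app would need
    # platform support (and policy permission) that Apple doesn't provide.
    TRIAL_FILE = os.path.expanduser("~/.myapp_trial.json")
    MAX_LAUNCHES = 10

    def launches_remaining():
        """Increment a persisted launch counter and return launches left."""
        state = {"launches": 0}
        if os.path.exists(TRIAL_FILE):
            with open(TRIAL_FILE) as f:
                state = json.load(f)
        state["launches"] += 1
        with open(TRIAL_FILE, "w") as f:
            json.dump(state, f)
        return MAX_LAUNCHES - state["launches"]

    if launches_remaining() < 0:
        print("Trial over: please purchase the full version.")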


You can have the demo experience you're describing by unlocking the substantial content as a single IAP.

It's not Hollywood's fault we got Transformers 3; it's the 50 million people who bought tickets to Transformers 2.


And now to complete the conversational loop and point out the problems of categorizing demos via IAP, I point you right back to my original comment.


You do realize that a full unlock via IAP is basically the shareware approach?

Just because something has IAP doesn't make it bad. One has to look at how the app uses IAP instead.


How can you possibly reconcile the idea that demos would somehow beat out microtransactions if only they had their own category, despite the fact that they fail living side by side with microtransactions?


beat out?

Discoverability is what the person is talking about, and it would indeed improve discoverability.


And the reason we got Transformers 3 is that the 50 million people will buy what's being made available to them and pushed.

Keep in mind that via Spotify's subscription model, only something like 20% of what users listen to is chart music. The rest is outside the normal popular genres.

So it's a chicken-and-egg discussion.


No, the 50 million people chose to consume Transformers 2 instead of John Carter; they don't just buy anything the studios shove in their face.


They buy what they are presented with and what's considered popular. How can they know up front what's good and bad?

That's how the movie industry still works; that's how the music industry used to work.

I bet Netflix shows the same tendency: more people watch non-chart movies than would if they had to pay for each one.


As to why you're downvoted: HN has become rather eager to downvote. I'd even say toxic, because a few months ago when I was downvoted, I knew I deserved it. Nowadays it's because I or someone else has an unpopular (but well-reasoned) opinion. It's frustrating, not because I care about my karma but because I'd like to have a discourse beyond agreeing with the popular opinion.


Yeah. I thought downvotes were for things that were factually wrong, trolling, or ruining a discussion, not for disagreement.

Which is why I really wish downvotes came with a requirement to justify why you are downvoting.


He's probably been downvoted because his entire chain of comments boils down to "Hurr durr, people R sheep!"


Funny.

Here I thought I was making the argument that when people are given the freedom to choose without financial implications, they will in fact choose differently than when they have to pay for it and a "mistake" means more.


People think a lot of things. Doesn't mean they're right.


No, you would normally look at the evidence or the argument. I have provided some; now it's your turn.


Nice to be downvoted without any argument as to why.


Apple is 100% guilty of this. Of course consumers will prefer free or very cheap software, but what most of them don't realise is that software is STILL very expensive to produce and maintain, so there will be a perverse incentive to extract value in other ways.

The solution is very simple: ban buying gems, virtual dollars and all that crap.


Apple has total control of the App Store, so it is their fault. If you could sideload games, it would be different, because niche games could use other means of getting onto devices (like Humble Bundle for Android).


Apple didn't do this. Companies wanting to make money since so few people actually paid for games up front did this.


Metacritic reviews don't hold much weight for me here; there's a knee-jerk mentality in much of the professional press against all freemium games that's justified in many cases, but certainly not across the board.

I don't play Clash of Clans or Game of War, but I do play Boom Beach from Supercell, and I can tell you, with zero spent in over a year, it's far and away better than any other mobile game I've seen, and I've bought almost every 4.5-star-or-better Toucharcade game for iOS.


[deleted]


Supercell is in Finland, an EU country. From my understanding Supercell actually pays much higher than prevailing wages.


Software developers are generally paid quite a bit less in Finland than in the Valley (e.g. 5k euros is quite high for a dev there), with a comparable cost of living. It's expensive in Helsinki, and you won't find better-paid developers anywhere in the world outside of the States.

Not to say the above is correct: it's no sweatshop, and the few outliers like Supercell and Rovio cannot possibly be there because they threw a lot of cheap labor at it. They're quality game companies that hit the market at the right time and knew their audience well enough to keep them.


Just to add about the employment costs in Finland: 5000 € gross salary means about 6000 € total cost with payroll-related taxes and mandatory payments included, and about 3250 € net pay per month after taxes.
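
Spelled out, a minimal sketch of the commenter's numbers (actual employer-side and tax rates vary case by case):

    # Finnish employment-cost arithmetic, using the figures above.
    gross, total_cost, net = 5000.0, 6000.0, 3250.0  # EUR/month
    print(total_cost / gross)  # 1.2  -> ~20% employer-side payroll costs
    print(net / gross)         # 0.65 -> ~35% lost to taxes and payments
    print(total_cost / net)    # ~1.85 EUR of total cost per EUR of net pay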

It's certainly no sweatshop, but I would consider it a modest cost if you account for the quality of the developers.


You're absolutely right.

Actually, Scandinavia in general has high-quality developers on the cheap compared to the States, while their quality of life seems on the whole better. Seems a win/win to invest in or open satellite offices there :)


And, for good or bad, it is also much easier to both expand/hire and close/downsize in Scandinavia than in other parts of Europe. At least in Denmark and Finland; maybe less so in Sweden. (Hiring anyone in Norway is really expensive, though.)

This flexibility to lay off people is not always a win/win for employees (considering Nokia/ALU merger right now) but I think that investors generally overestimate the bureaucratic difficulties in Scandinavian countries in this respect. Scandinavian countries are very free market economies despite being "socialist".


> Software developers are generally paid quite a bit less in Finland than in the valley

Pretty sure Silicon Valley is an outlier when it comes to wages for software developers...


Supercell is based in Finland, which is an EU member state.


Isn't Supercell Finnish? That counts as EU.


"Clash of Clans is Supercell which is a foreign company that can afford to pay their developers less money in a foreign market. Any US or EU company would have a hard time competing with them."

This really isn't true, as the costs of personnel are usually dwarfed by the costs of marketing.


Aka the cost of getting Kate Upton to run around with her boobs out asking if you want to be her hero.


Hey now, give Kate Upton credit where it's due; she also struts around battlefields and tells people to shoot monsters with catapults. That's hard fucking work.


I own a couple of Spiderweb Software games on Windows, and play them on my Windows 8.1 tablet. They almost play like they were made for a touch screen (and they run great on the ancient dual-core Atom and stupidly non-existent video card in my Thinkpad Tablet 2).

Actually the only thing I need a keyboard for is if I have selected a magic spell and want to cancel casting it. Then I need to pair my Bluetooth keyboard just to hit the escape key.

So, Jeff, let the world know that your games run unaltered on x86-based Windows tablets, even quite old ones. You don't even need to change any of your development processes to target the platform. It's already Windows.


I enjoyed the heck out of developing drivers for that thing. It was a constant uphill battle to even get correctness, let alone the pretty decent performance we got. In the end, my tablet's battery died, and it now sits in a drawer, forgotten.


Drivers for the Tablet 2? Yeah, the drivers are awful. Half the drivers I'm using are downloaded from Dell, because the Lenovo drivers are useless. Getting to the point where I could have Bluetooth and WiFi connected at the same time was a nightmare. I still use mine, but only when I don't want to start up my PC, or for something where touch feels more natural, like Hearthstone or Avernum: Escape from the Pit.


The app stores need what Steam has added a touch of lately: lists curated by anyone, highlighting good games. The App Store top lists are full of bad games and ones I am tired of seeing. You can't hide them, and they still show even if you've bought them.

Places like Touch Arcade and other curators also highlight lots of great games. There needs to be the ability to make curated lists, like a mall within the app stores, just like on Steam or the actual internet.

The markets and stores should always be open to developer submissions, and they should always have top lists of different types and categories, but they also need curated lists of games that you can browse and find within the store. This alone would put more quality games in the top lists, games that actually have gameplay and are worth it.


Sadly, this will never happen:

All of the bullshit f2p apps out there generate incredible revenue for Apple. Clash of Clans alone generates $1M/day. That's $300k/day Apple gets for doing nothing.

Apple would have to give up that free cash... Not happening.


I can't see Apple looking at it as "free cash."

The App Store generates such an insubstantial amount of cash (for Apple, as compared to their other platforms and products) that it would be completely insane for them not to focus on the long-term quality of the store — which would be a driver for their hardware sales.

I am absolutely certain that they realise this, but it just happens to be an incredibly difficult problem to solve without basically rejecting 90% of the apps that get submitted.


> That's 300k/day Apple gets for doing nothing.

Where by nothing you mean "run the App Store".


There already are curated lists on the App Store - the problem is that Apple curates them and they can be hard to get to if they aren't a list Apple is featuring.

I think the closest we've seen to curated lists is the "Indie Spotlight" that Apple has done for the past few months, where they feature an indie dev and list that dev's own games and the dev's favorite games to play. But this list refreshes every month, isn't archived, and seems to be sporadically featured on the main tab of the App Store.

If Apple were to open that ability to a select few approved groups (like Touch Arcade), that would be a huge step forward for Apple, and it would still be nowhere near what Valve has done with Steam's curated lists.


> Competition on the App Store has risen to a frenzied level.

This is a curious argument to make. The platform is too popular? Would it have made any sense at all if anyone made the same claim about, say, Windows game development?

I suspect what this really means is "there are more games than ever, and the iOS App Store is becoming increasingly difficult to rely on as a discovery mechanism, thus requiring advertising to users with other mediums", but that's kind of always been true, and it's certainly been true of all desktop platforms. Which makes me wonder why iOS can't simply be treated the same way non-mobile platforms are and rely on more traditional marketing channels?

The only other way to interpret this that I can think of is "it's becoming increasingly difficult to justify a >$5 price for a game", but I haven't seen any sign of that being true. iOS has had a "race to the bottom" mentality for prices since the app store first opened. There's always been a ton of free or $0.99 games, and a relatively low number of pricer-but-higher-quality games. I don't think this has changed in years, with perhaps the only real difference being that game developers are becoming increasingly sophisticated about how they apply F2P techniques. And as a bit of anecdata, the most impressive game I've seen on iOS in a while is a brand new one called Implosion that costs $9.99 and, at the time of this comment, has a solid 5-star rating with 311 reviews.


The problem isn't that it's popular. The problem is that it's not designed for its current popularity.

5,000 apps are added every day, yet there's no way for developers to build meaningful relationships with their customers or create trials, which makes it a very irrational platform to be on.

It's nothing like how Google deals with popularity and quality. It's a huge problem Apple will have to solve. On the Mac App Store they have completely given up.


Apple doesn't necessarily want to solve that problem.

Apple is working hard to commoditize the complements to their hardware [0]; namely the software, games and media that make their hardware useful. Software in the app store is a viciously competitive low-margin market; just like most other commodity markets. Which is why you need to treat it skeptically and work on building a name outside the app store before you are putting money into it.

Look at Twitter as an example. They give away their app to drive usage of the service, and in the early days they were totally OK with other developers making and selling twitter clients. This was because twitter clients are a complement to Twitter the service and their money comes from selling the analytics and advertising that the service enables.

0. http://www.joelonsoftware.com/articles/StrategyLetterV.html


> Apple is working hard to commoditize the complements to their hardware [0]

I've heard this argument before, and if it's true, it's suicidal. Apple will have failed to learn a critical lesson from Microsoft's dominance of the early PC era: it's the apps that matter. Even once there was a significant potential market of would-be platform "switchers" who wanted something beyond what Windows was offering, these switchers remained on Windows because the critical apps in their day-to-day usage (be that Office, CAD software, or whatever) only ran on Windows.

If Apple's view is truly so narrow that they kill the ability of their "complements" to make good money, then Apple is pushing hard to kill the goose that lays the golden eggs. Even if Apple had unquestionably the very best manufacturing processes, hardware design, and platform software design, it would all be worthless if no one other than Apple can afford to write and support high-quality software for that platform.

Now, I'll agree that from Apple's publicly observable positions with its App Stores that they (incredibly) don't seem to understand the need and necessity for maintaining a virtuous cycle between developers and their platforms.

A wild-ass guess: this may stem from a deep misunderstanding of the nature of software. Sure, a random tapping game or timer app or whatever is essentially a replaceable one-off, a fungible commodity. But software that's a fungible commodity fundamentally doesn't create any platform stickiness. If it's really so easy to recreate, it can and will trivially show up on a new platform. This is especially true in a world where many software organizations really are getting better at delivering on multiple platforms.

Beyond the fungible stuff, there's the important category of software that I increasingly view as a "living" thing rather than a static artifact. Such software requires ongoing maintenance and care. This gives it a lifespan across changes to its underlying platform(s), and lets it absorb and embody deep problem domains. I posit that these apps, no matter the genre (games, "creative", technical, etc.), are the ones that can create platform stickiness. This, in turn, implies that humans must be able to make a living supporting that software. Undermining this is like cutting off the "oxygen" to a vital part of a platform's ecosystem.


I might be wrong, but in the case of Apple it seems that to its customers it's Apple that matters, not apps. It's a brand many people want to buy because of the same mechanisms that make people buy luxury brands. The only time I remember Apple's branding not being enough was in the 90s, when graphic designers left Macs for Windows machines because Adobe software ran so much better there. It was an undeniable combination of hardware and software problems. All of them came back, and more, thanks first to the iPod and later the iPhone.

So you might be right that too many inconveniences will send faithful customers away (I gave an example) but I don't believe commoditized apps are such a problem. People buy Apple because it's Apple and only a barren app store could drive them away now.


Once upon a time, Apple invested to own 19% of Adobe, which went on to become a powerful software anchor of Apple's hardware ecosystem. Perhaps a bit too powerful for Jobs's liking, hence the modern strategies to commoditize ISVs on iOS.


> the modern strategies to commoditize ISVs on iOS

What modern strategies?

I've seen this theory proposed multiple times, that Apple wants to commoditize software, but I've never seen anyone actually demonstrate ways in which Apple is doing that, just speculation that it would be in Apple's interests.


It's more about what they don't do to help developers make money. A web search will find research papers on software ecosystems, which discuss best practices for mutually reinforcing, virtuous-circle feedback loops between platforms and developers.

Stardock's 2014 report touches on related topics, http://www.stardock.com/press/CustomerReports/Stardock2014.p...


For not helping developers make money, developers sure do make a lot of money on iOS compared to other mobile platforms.


It's not that they don't make money, but they don't allow developers to operate good businesses. Being able to give trials to users, to handle advertising in certain ways, or to have more flexibility with payments, etc.: all of these things would be beneficial for businesses to control directly, but they can't, as they're in the walled garden.

Many are making money right now, but I would suggest they don't have terribly good businesses, in that they don't own the relationship with the end user. One change of Apple's policies can put you out of business.


How about comparing instead to money made by Apple, enabled by iOS developers?


It sounds like you feel Apple owe developers a living?


Apple could attract more developers (and thus more iPad users and more corporate revenue -- reversing declining iPad growth) if developer success was more closely aligned with Apple success.


Attracting more developers does nothing to attract more users.


How about attracting financially successful (not hobbyist) developers who can drive new use cases for the platform?


I think there's lots of evidence for that theory. For example, giving away iLife, a package that contained replacements for many of the most common app types, with every Mac. Also, if you look at their pro apps, like Logic and Final Cut, Apple has consistently undercut the competition aggressively, likely as some kind of loss leader to sell Macs. For example, I remember when Apple bought Emagic (the developers of Logic), they slashed the price from something like $600 to $200, a shockingly low price at the time. All the other software makers had to respond with "lite" versions to compete in this price bracket.


What makes it hard to compete is the competition.

If some app makers start to be unable to compete, competition will drop, and profits will increase, right?


Yeah yeah. But none of that has anything to do with typing an app's entire name in the search bar and getting a ton of garbage unrelated results. Discovery being broken is not really a "commoditize the complements" play, it's just bad UX.


Dark patterns ("bad UX") and broken discovery are a means to the end of "commoditizing the complements".


Bad UX in an Apple component is not a "dark pattern". It's just, well, bad UX.

Knowing Apple, knowing its history and people that work there and the things they care about, I think Apple is incapable of deliberately producing a bad UX. When it happens, it's caused by other factors rather than being an explicit decision. In the case of App Store discovery being broken, I think it's caused by the fact that the App Store seems to be run largely the way the Music Store is run (with the addition of app review), and nobody complains about search / discovery in the Music Store, but it just so happens that the infrastructure and design around the music store doesn't work quite so well when applied to apps.


Why do you think there is no visible investment/improvement in App Store UX, from the world's premier design proponent?


Because the people that care about design don't really care about the App Store (they care about products instead), and the people that run the App Store don't really have an incentive to try and make dramatic changes in something that, from their perspective, is wildly successful. Change might break things, especially when operating on the scale that Apple is (just think of what the load must be on the App Store).


Apple executive leadership is responsible for successful change in all departments, they could make this a priority if they wanted.


That's true. And if you care about this issue, you could try appealing to Apple executive leadership (emailing tim cook, or running a PR campaign designed to get Apple executive leadership to see your arguments).


Alternately, one could invest in startups that solve this problem.


I don't know how much has come of it yet, but here's an example of some investment:

"Apple Acquired Search Startup Ottocat To Power The ‘Explore’ Tab In The App Store"

http://techcrunch.com/2015/04/06/ottocat-apple/


Yeah I hear you.

I had a reality check when I launched my mac app. Good thing it doesn't have much competition at least for now.


Trials? How much does a paid game cost; a few bucks? People really want a trial before they make a $1.99 investment? When you go to a restaurant, do you ask for a piece of steak before you actually order it? Do you get a trial of a movie before you buy a ticket? A trailer doesn't count, because apps can have trailers too. Why is this trial obsession such a big deal for such cheap products? I personally would love to see the end of the in-app purchase model. When I buy a game, I want to have the game and not have to keep feeding money into it to unlock stuff. But the market likes them, because those games appear "free."


That's part of the problem, though. Because there are no trials, the prices developers can charge for games are forced down, which in turn makes the games unsustainable, and we're back to a very skewed distribution of income from the market: a 1-9-90 distribution, and that's probably being generous, given the 5,000 apps being submitted each and every day.


One can use the shareware approach. Give away some levels and sell additional ones.


> no way for developers to build meaningful relationships with their customers

Isn't that just a nice way of saying 'forcing a meaningful relationship' onto the customer? Obviously users can seek out more information from the developer's website/Twitter/etc. if they want to. Everyone else probably doesn't want to be nagged by spam e-mails and other forms of self-promotion. I feel like trial software falls into the same gimmicky category: inevitably users wait to purchase until the trial ends, then end up feeling annoyed at being forced to pay for something they were using for free. Even more so now that so many users are used to IAP feature upgrades.


I don't think Vogel was saying the platform was too popular for users, but rather too popular for developers, resulting in too much competition.

Besides, if you've ever read his blog, you know he's been ridiculing his own marketing skills for over a decade, and has a charming way of not caring.


I didn't say "popular for users", I just said "popular". And I strongly expect that "popular for users" and "popular for developers" are extremely closely correlated anyway, at least when it comes to consumer-targeted platforms (i.e. not Linux, where every user is also a developer).

> he's been ridiculing his own marketing skills for over a decade

There's a difference between "haha I'm not good at marketing, I'm glad people are still buying my stuff" and "haha I'm not good at marketing, therefore I'll stop developing for probably the fastest-growing gaming platform ever and say it's due to competition".


Is the App Store really the fastest-growing, though? It's a ridiculously dubious claim (in addition to being pretty meaningless without context).

It reminds me of a particular XKCD comic: https://xkcd.com/1102/


> "it's becoming increasingly difficult to justify a >$5 price for a game", but I haven't seen any sign of that being true

But isn't that the crux of it? If you are charging $5 per game and you spend $100k developing it, then after Apple's 30% cut you need roughly 30k installs just to break even. When the App Store was small and you had a good chance of being the only app in your category, that was not hard to achieve. When you are just 1 of 100 other apps in your category, it is 100 times harder. If you could charge $20 or $50, which is how software was historically priced, then you could tolerate 4x-10x fewer installs, but the App Store culture has been permanently set such that it will not tolerate that. 99c apps got Apple off to a great start, but they have locked in a situation where only the largest (or most sophisticated) developers, the ones that can reliably get to the top of the charts, can stay afloat.
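For concreteness, here's that break-even arithmetic as a quick sketch (the $100k cost is the hypothetical above; the 30% is Apple's standard commission):

    import Foundation

    // Installs needed to recoup a fixed development cost at various
    // price points, after Apple's 30% cut of each sale.
    let devCost = 100_000.0
    let appleCut = 0.30

    for price in [0.99, 4.99, 19.99, 49.99] {
        let net = price * (1 - appleCut)          // developer's take per sale
        let installs = Int(ceil(devCost / net))   // break-even install count
        print("At $\(price): \(installs) installs to break even")
    }

At $4.99 that comes out to roughly 28.6k installs, which is where the ~30k figure comes from; at $19.99 it drops to about 7.2k.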


Whether a marketplace is profitable to a developer at their level is not always directly correlated with how popular that marketplace is. If the App Store is so competitive as to require a lot of up-front money to see a worthwhile return on investment, but other markets do not have that same problem, it may make sense to focus on those other markets.


How do you define "competitive" if you're not defining it in a way that correlates with "popular"?


Competitive is a question of demand versus supply; you can have super competitive small markets.

You have 1 yellow marble, 5 blue marbles, and 10 people. 5 people want a yellow marble, and five want a blue marble. Blue and yellow marbles are just as popular, but the competition for yellow marbles is much more intense.
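Or as a quick calculation, measuring competition as seekers per unit of supply:

    // Equal popularity (5 seekers each), very different competition.
    let seekers = ["yellow": 5, "blue": 5]
    let supply  = ["yellow": 1, "blue": 5]

    for (color, want) in seekers {
        let perMarble = Double(want) / Double(supply[color]!)
        print("\(color): \(perMarble) seekers per marble")
    }

Yellow comes out at 5 seekers per marble, blue at 1, despite identical popularity.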


Given that we're discussing the supply side of things, I think your example is backwards. But even if you changed it to competition for supplying the desire of yellow marbles, I still don't think it's relevant. It's certainly true that there's an incredible supply of cheap/free games on iOS (typically of very low quality, or chock-full of F2P mechanics or ads). But there's a very low supply of high-quality serious games being sold for more-than-a-cup-of-coffee prices. I think there's certainly a market for those games, the problem is just that you can't rely on the App Store for discovery (but you never really could do that anyway). So it seems the real problem is just that Jeff Vogel doesn't know how to market his stuff, and rather than actually figuring it out, or, heck, hiring someone else to do it for him, he's just giving up on the market entirely. Which is a real shame.


You're assuming he doesn't know how to market it. It's entirely possible he's run the numbers and found that for the required outlay the expected return isn't worth it, keeping in mind that his time and ability are finite, and that there may be other opportunities that are better investments of his time and money. In fact, he basically says exactly that.


As has been pointed out elsewhere in this comments section, he apparently has already admitted that he's bad at marketing his own stuff.


In the beginning, the App Store was popular (people used it) but was not competitive (there weren't a lot of apps, comparatively). Thus, many apps earned quite a bit more money than they would now, because they got a larger percentage of the available consumers.

Even if we assume the App Store today had the exact same number of consumers, there are many more apps competing for those users. The marketplace is very competitive, and the percentage of available users per app (on average) is lower.

My guess is that the App Store actually has many more consumers (or consumer dollars, which is what we care about), but the number of apps has increased much more rapidly than consumer spending (i.e. supply increased much faster than demand).


The "gold rush" mentality was always a bit of a myth. There were a few apps that really did strike it rich, but the App Store has never really been a place where you could submit an app and watch the money roll in. That's not true now, and that wasn't true when it launched either. It's just a fantasy that a lot of people have.


Whether there was a gold rush or not is irrelevant for my argument. This isn't about being the random popular app, it's about the marketplace, supply and demand.

How many flashlight apps were available 6 years ago? How many are available now? How has the average price changed? Given roughly equivalent ratings, how much research does the user have to do to figure out if one is better for their needs than another? How likely are they to do this? Are the per year proceeds of the average flashlight app higher now or back then? Even if we limit it to apps with a high ratings, how does it look? Are there other app store segments where the outcome is better or worse?


I don't understand your argument. Flashlight apps? There was a brief flurry of flashlight apps, and fart machine apps, and Apple then declared that they would stop accepting trivial clones of existing trivial functionality like flashlights / fart machines. But that's not really relevant, especially as the average price for those apps was generally $0. Those apps existed because they were easy to make, people who made them thought it was fun / funny / a learning exercise / whatever, but it was never a money-making category of apps. Ever.


I thought it was fairly self-evident that I used flashlight apps as a stand-in for any specific segment you want to isolate in the App Store.

That said, here's something to think about: I myself have paid real money for a flashlight application for my smartphone. Why? Because I wanted a flashlight app that didn't take three screens and button presses to get to light, and I was also a little leery of what the ad-supported apps were really using the network for, so I wanted one that didn't require network access. This was a few years back and on Android, but I think the point is applicable: what you think you know about a market on a cursory review may not turn out to be correct upon deeper inspection. Some people made money on these apps, either through advertising or through the initial app price.

Flashlight apps may have been a poor choice on my part, but I think the original point stands. How does the market for tower defense games look? What about platform games? Do you think the average developer in these markets makes as much as they did 5-6 years ago? Do you think the advertising cost is the same?


I can readily acknowledge that some people made some amount of money on flashlight apps. I expect that research would indicate that it was in fact "very few people" and "very little money" (relatively speaking, at least; I'm sure it's so cheap to develop a flashlight app that practically any money would offset the cost).

As for your original point, I'll grant that you have a point, but I don't necessarily agree with it. I'm finding it hard to speculate about markets where I have no actual data, but my hunch is that the market for tower defense games is larger today than it was 5-6 years ago (i.e. I think the genre has grown more popular). I don't know whether that correlates with increased revenue for developers of tower defense games, or whether the number of games has outstripped the demand for them. Although I think it's fair to expect that there are a few developers making lots of money, a bunch of developers making a moderate amount of money (that may or may not offset the cost), and a bunch of developers making very little money. It turns out the App Store in general has a very steep revenue curve, with most of the money going to relatively few developers. But I think it's always had that curve; we just didn't used to have the data available to see it as well as we do today.

I think the more important question with regards to the App Store is not "how much money are developers making?", but "how easy is it for a new entrant to make money?". And I don't know the answer to this either, but I would be interested to find out.


I think you're right about the revenue curve, I think our whole discussion has been about whether the slope has changed over time. I've heard a lot here from developers that mention that it's harder now than it was, but that's obviously not a rigorous way to go about figuring the truth of the matter.


> This is a curious argument to make. The platform is too popular? Would it have made any sense at all if anyone made the same claim about, say, Windows game development?

I'll make the explicit argument: iOS games are at the decline point of the hype cycle. Irrational amounts of effort have been expended on developing iOS games, both by individual developers bamboozled by a few success stories and hoping to strike it rich, and by venture investors bamboozled by the same stories. This has happened on other platforms (remember the great videogame crash?), though perhaps not on Windows due to some peculiarities (games there are often ports of successful console games that wouldn't cover their costs on Windows alone).

> The only other way to interpret this that I can think of is "it's becoming increasingly difficult to justify a >$5 price for a game", but I haven't seen any sign of that being true. iOS has had a "race to the bottom" mentality for prices since the app store first opened.

Maybe, but the number of developers has kept going up. Imagine 5 people are each willing to spend $500 developing a game that would have made $1000 a year ago. Sounds smart, right? But because there are 5 of them, their games make $200 each and they each lose $300.
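That toy model, spelled out (numbers are the hypothetical above, with demand assumed fixed and split evenly):

    // A fixed $1000 market split evenly among entrants, each of whom
    // spent $500: every added competitor pushes everyone toward a loss.
    let marketRevenue = 1000.0
    let devCost = 500.0

    for entrants in 1...5 {
        let revenueEach = marketRevenue / Double(entrants)
        print("\(entrants) devs: revenue \(revenueEach), profit \(revenueEach - devCost) each")
    }

One developer nets +$500; at five, each makes $200 in revenue and loses $300, exactly as above.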

> There's always been a ton of free or $0.99 games, and a relatively low number of pricer-but-higher-quality games. I don't think this has changed in years, with perhaps the only real difference being that game developers are becoming increasingly sophisticated about how they apply F2P techniques.

I think maybe the changes have crept up on you. I'm playing an F2P game at the moment for the first time in a couple of years and the depth of the content and polish is staggering, unbelievably high production values.


> remember the great videogame crash?

No. What are you referring to? I've been playing games all my life and I have no idea what you mean by "the great videogame crash".

> Maybe, but the number of developers has kept going up.

And so have the number of users. And nearly all developers are focused on cheap / F2P games. I would wager that the proportion of high-quality/more-than-a-cup-of-coffee apps has actually gone down over time, which would suggest that competition in that market is lower than ever. If you rely on the App Store for discovery then you have a problem, but that's always been true for everyone except the most successful F2P games.

> I'm playing an F2P game at the moment for the first time in a couple of years and the depth of the content and polish is staggering, unbelievably high production values.

Let me guess, Spirit Lords? I was actually really disappointed when I realized that game was neck-deep in F2P mechanics, because it was surprisingly high-quality otherwise. And yes, there are some other F2P games that are pretty polished. But the vast majority of F2P games do not have the same quality and polish that you'd expect from a game that you had to pay more than a few dollars for.


> I have no idea what you mean by "the great videogame crash"

Probably when the market was flooded with garbage quality games and the bottom fell out of it[0].

Fun fact: the reason the NES looked like a nondescript square box in the US, with the weird spring-loaded cartridge mechanism (meant to resemble a VCR), as opposed to looking like the Famicom, was the fear that after the crash Americans wouldn't spend money on a 'videogame console', but might buy something presented as an 'entertainment system' that just happened to play video games.

[0] https://en.wikipedia.org/wiki/North_American_video_game_cras...


> 1983

Ah, that's why I'm not familiar with it. That was a long time ago. I'm really only familiar with everything from the NES onwards.


You're probably too young (so am I). He's talking about the crash that wiped out Atari and others in the home market. Nintendo had to market themselves as an accessory for a toy robot just to get U.S. retailers to stock them. The single biggest example is the E.T. game, which had thousands of unsold copies shipped straight to landfills.


> Would it have made any sense at all if anyone made the same claim about, say, Windows game development?

In Windows game development you have a long history of many ways to distribute your wares, with different pros and cons. Not so much for Android or iOS. Note that similar complaints have been levelled at Steam, which is the closest thing to the iOS model.


Is there really that much of a difference between "Click [here](link-to-app-store) to buy this game!" and "Click [here](link-to-self-hosted-page) to buy this game!"? The only real difference I can think of is when software was primarily distributed via physical media, but that hasn't been the primary distribution mechanism for a while (and I would expect that nearly all software these days doesn't even have an option for physical media).


There are massive differences. For example, in the Windows world you are allowed to find out who likes your stuff and build relationships with them.


Relationships? What are you talking about? I can't recall the last time I ever felt I had a "relationship" with a game developer, on _any_ platform. The only way I can meaningfully interpret your claim is by replacing "build relationships with them" to "and market directly to them", i.e. being able to email your customers with marketing messages. And I'm glad Apple doesn't give my email address to everyone I buy an app from.

The only other interpretation I can think of is "Apple doesn't let you respond to reviews on the app store", but the ability to respond to reviews is not an industry-wide expectation. I can't think of anywhere before the rise of mobile gaming that anyone considered the idea that a game developer should be able to respond to reviews.


> I can't recall the last time I ever felt I had a "relationship" with a game developer, on _any_ platform.

The original post here was on Spiderweb Software's forums, which have been around in some incarnation or another for over a decade and hosts a community that has built up around their games. This is a company that has always relied on a close relationship with its customers.


I would think jblow knows what he's talking about here, he's delivered at least one great game to the world (to me, Braid had the perfect difficulty ramp and an awesome level of inventiveness).

As a buyer on Steam actually I didn't feel I had a direct relationship with jblow as the developer, but I would have been very happy to receive his marketing material, especially if it was programming tips!

http://www.quora.com/What-are-the-5-tips-of-a-productive-dev...

(...looking forward to The Witness)


Compare and contrast the experience of buying an iPhone with the experience of buying an iPhone app.

When you buy an app, you are basically looking at sales sheets from people you don't know making claims you can't verify. The only way out is to go find reviews about the apps from blogs you can't trust.

Now think about buying an iPhone. I've owned an iPhone for 8 years. I know intimately the support experience. I know intimately the build quality. Software quality. Product lifecycle. When a new iPhone comes out I have basically all the information I need to make a purchasing decision, without ever laying eyes on the actual product. People who don't have that information can go to a retail store where they can try the product and talk with knowledgeable and friendly people about it, and many of them turn into me in 8 years.

I want to be able to create that kind of experience with my products. Because if smartphones were sold the way that apps are today I don't think many people would buy them.


If you sold your app for $600 I bet you could create that kind of experience. And if Apple sold phones for $2 you can bet the buying experience would be a lot different.


I completely disagree.


It's not about giving your email address to everyone (I wouldn't want that either), it's about having the friction-free option of doing so. There are a number of game-makers from whom I absolutely want an email when they make something new, because I love their stuff. That doesn't happen on iOS very easily.


Why not? Every app page has a link to the developer's site, and the developer's site can offer a way to subscribe to some sort of newsletter. And games often provide links in the game itself to view information about the game developer.

Edit: I recognize that it's not particularly common for developers to try and encourage users to sign up for some sort of marketing. But I think that's just because developers haven't really considered it to be an acceptable thing to ask users of their games for, rather than any actual difficulty in doing so.


In the ~17 years I've been involved in the Descent community online, I've had conversations with the developers of Descent 1, Descent 2, Descent 3, a project that was going to become Descent 4 until Interplay pulled support, and Descent:Underground (which just successfully kickstarted last week.) In particular, the Descent:Underground group is currently improving parts of their game based on conversations with me and others who play the older games.


I'm not sure what your point is. Are you trying to argue that it's impossible for iOS developers to form communities where they can talk to their users? Because that's obviously false. Are you trying to argue that PC game developers got such communities "for free"? Because that's also obviously false.


I'm addressing your comment that "form relationships" can only be meaningfully interpreted as something other than "relationships", by noting that I have formed actual worthwhile relationships with game developers in the past and present.

My comment does not address the rest of what you're saying (in particular, I don't care at all about iOS games or their app store), only that one point.


> Relationships? What are you talking about? I can't recall the last time I ever felt I had a "relationship" with a game developer, on _any_ platform.

iD/John Carmack, David Braben, Sid Meier, Rockstar, Volition, Gearbox Software... I could keep rattling off names, but those are all examples of companies and individuals with dedicated followings who follow them from game to game or platform to platform.


Yeah, you can follow companies, but "relationship" generally implies that it's reciprocal. A one-way "relationship" is called "stalking" ;) I don't really see why the App Store affects the ability for customers to choose to follow their favorite developers.


Because it commoditizes the product. Apple wants apps to be fungible and rapidly consumed and thrown away, and that does not lead to a real relationship with a producer. You have no reason to form a relationship with the provider of something that is placed, and consumed, on the same level as a Snickers bar.

Games on PCs, Valve's best efforts notwithstanding, are not yet so reduced.


There's plenty of incentive for Apple to make apps plentiful and easy to acquire, but I strongly disagree that there's any incentive for Apple to want apps to be fungible. Why would there be? This is a claim I've seen repeated a few times without any supporting evidence. And I can't think of any reason why Apple would want apps to be fungible (more-so than any other software platform, at least; there is an incentive for there to be competition within any given category, but that's not the same thing as having the apps actually be fungible).

There are a lot of people who like having tons of new games available to try for a few minutes or hours or days, and then throw them away. But the existence of those people / that market does not mean that there isn't also a market for games that people stick with for a long time, and that are sold and maintained for a long time.


> Yeah, you can follow companies, but "relationship" generally implies that it's reciprocal.

iD and Bungie both have a long history of heavy interaction with their customers. That's rolled off for iD, but Bungie's back-and-forward seems as lively as ever.


I'm still unclear on what part of the App Store makes it impossible for companies to engage with their customers. I've played many Bungie games, but nowhere in the purchase process for any of their games was I encouraged to form a relationship with the company. Any back-and-forth with companies happens in channels other than the retail channel; twitter, forums, websites, etc. I don't understand why selling on the App Store vs, say, PSN Store, or Xbox Live Arcade, or any of the numerous Windows PC retail channels, affects the ability to engage with customers in this fashion.


> I'm still unclear

Given you're fighting with actual, successful game developers as well as adopting a take-em-all-on style with everyone else I'm not sure you want to be clear.


> I can't recall the last time I ever felt I had a "relationship" with a game developer, on _any_ platform.

It was common many years ago. And some indie developers still engage their fans, solicit advice, etc...


I am talking about things like Wasteland 2, Pillars of Eternity, and Broken Age, which happened ...

That is impossible on the App Store ...


It's impossible to build a community of users if you publish on the App Store? That's nonsense. All 3 of those games are by known developers with a pre-existing fan base and all 3 have user communities that are distinct from their retail channels. There is no reason to believe that the retail channel must provide a user community in order for a user community to exist.


But you cannot Kickstart an iOS game, because you don't know if you'll be able to deliver, due to Apple censorship.


Huh? There have been multiple successful kickstarted iOS games. The most high-profile is probably Republique (https://www.kickstarter.com/projects/486250632/republique-by...). And I have no idea what you mean by "due to apple censorship".


jblow: So follow them on Twitter? Go to their website? Seriously, you're acting like a developer can never see the outside world after they sign the developer agreement.


It's about how much friction is imposed on what. As anyone designing a "funnel" for web purchasing (or whatever) will tell you, a little friction goes a LONG way.

Yes, of course you can go out of your way to get connected to the people making a game. But it's just harder to do that on iOS than on Windows, and this has consequences in terms of the viability of these platforms for small developers. (It is by no means the only factor. The race-to-zero pricing on iOS is probably a bigger factor.)


You mean iOS developers aren't allowed to do things like have Twitter accounts, Facebook pages, or host forums?


> The only real difference I can think of is when software was primarily distributed via physical media, but that hasn't been the primary distribution mechanism for a while

It was a major distribution medium until fairly recently, even if not primary. Heck, you can still go into lots of retailers and buy physical media for PC games.


For some games, yes. But not for most of them.


So? That's not really relevant. You still have insane amounts of competition on Windows, much like you do on iOS. On either platform, marketing is a requirement.


[deleted]


> to be discoverable one has to pay the advertisement dollars to be displayed in the top spots

That's not how the iOS App Store works. You cannot pay Apple to be featured. The closest approximation you could do is pay one of those companies that try and game the ranking system by getting "fake" downloads (e.g. paying people to "buy" the game, etc) in order to get placed highly on the top charts, but that's not the same thing as being featured on the store, and it's widely regarded as a scummy move.

In any case, if you had to self-host your own game download instead of having Apple provide the download, that certainly wouldn't fix any of this. People still need to discover your game, which means you need to market it, and you can provide a link to the app store just as easily as you could provide a link to a self-hosted download.


[deleted]


I don't know what you mean.

Every "featured app" on the App Store was chosen by Apple. Nobody can pay to be featured. Apple decides what apps they want to feature on their own.


Oh, ok then, i was entirely wrong. I thought it operated like Steam. (I don't own an iDevice.)


The Windows ecosystem had a level playing field for each player. The App Store has rankings (which feed into themselves, which leads to developers having to pay people to download just to rank higher initially) and featured placement (subjective and non-transparent curation by the platform owner).


You keep saying iOS when the article isn't about iOS.

It's about developing games for the iPad.

Not iPhone.

And making the assumption that they are the same ignores realities even Apple is aware of. One could very easily find value in developing for iPhone and not iPad. And clearly, developing for iPad alone wasn't worth it.


I follow iPad gaming quite closely because it is perfect for my commutes. I would consider myself as close to a hardcore iPad gamer as it is possible to get: I read specialist sites like Pocket Tactics and spend well over a hundred dollars a year on iPad games. I had never heard of this company or its games before.

The iPad is a marketing platform, in the best and worst senses of the phrase. Looking at this company, they seem aggressively anti-marketing. Their entire presentation style and format is opposed to the kind of experience the iPad offers. They are the anti-Flipboard. The fact that someone as interested in iPad gaming as me was not even aware their games existed until now suggests it is probably not the best platform for them at all.


Spiderweb games are very highly regarded RPGs that came from the PC. I definitely know TouchArcade has covered their games a few times over the years.


I looked up the company on Pocket Tactics, the site I mentioned, after I posted this. In fact they have had numerous mentions there as well, but no real detailed coverage, no real follow-up. I don't think iPad games journalists are failing to cover companies like this; I think these kinds of companies are so anti-marketing in the presentation and design of their games that they fail to stand out or get noticed, even by people like me who might have been looking for this kind of experience.

A contrasting company would be someone like inkle with 80 Days. That was an odd iPad game that was hard to sell just on screenshots, but it was marketed brilliantly and had a vision throughout the product that was reflected in that marketing, and the buzz around it was and is huge because of it.


You are saying "anti-marketing" but what this really means is that they are putting less investment into production values than you prefer.

The whole point is that it is doubtful that they will stay solvent if they put more money into production values on that platform.

Nice graphics are very expensive. (They are more expensive than any other aspect of game development, in fact). If it seems unlikely that enough people will buy their game given that type of investment, then it may be a good choice to stay away from that.

On top of which, maybe they just don't want to spend all their time doing graphics. Maybe they want to work on the story / world / etc.


> Nice graphics are very expensive.

Alarmingly true. Good 2D graphics for a game like Avernum could easily scale into the tens of thousands, if not hundreds of thousands, of dollars.

(Source: my spreadsheets. :-/ )


I would consider production values part of marketing (a vital part). They are competing only on PC, where marketing/production values aren't an issue; the iPad is not a platform where they can do that. I don't think they are necessarily wrong to step away from the iPad from a business perspective; I'm just saying it's a very different audience and customer retention strategy, and maybe not suited to them.


No, he's saying "anti-marketing" to mean just that: marketing.

It doesn't matter how much work you want to put into your graphics, story, world, or whatever if no one knows that your game exists.


Do you think there is still a chance for a well-made, expensive-up-front iPad game (we are talking $10 to even $30) to make it based on word of mouth from hardcore gamers and GameSpot/IGN/etc.? I know that the ecosystem itself optimizes for F2P, but still.


Games which target iPad first inevitably make design and marketing compromises that harm their performance and experience on other platforms, whereas games which target other platforms first and are then adapted can typically make far less painful and less compromising changes (which can work against them or for them).

Yes, it is possible, but not easy; look at, for example, Shenandoah's Desert Fox or Battle of the Bulge. But I don't know if that is a long-term model for success. $19.99 is the highest price point you can set for a game, and then you are competing with things like FMC2015 or XCOM (heck, you can get GTA: San Andreas for $6 or so; is your game better than GTA?!?), which are full PC ports. Clever developers will, I think, make a game for more established platforms and, if successful, port it to iPad for additional revenue. The iPad has millions of active premium gamers, but a tiny number compared to any non-portable system.


Well, I just bought Space Program Manager, bought many more before that, and will pay ten bucks for Prison Architect, even more for Tiny Trek, or if RimWorld ever gets a port.

The fact is, all iOS categories and ratings are useless to me. Mostly rubbish games, built for freemium and with loads of marketing wind behind them, get visibility on the store, while real games get buried; I've had to follow third-party websites just to even know they exist.


App-store-sarcasm-on: Oh, the curated library of applications and games is failing? You have to resort to 3rd party sites to actually filter things?

At this point app stores are nothing more than extremely expensive antivirus scanners for all the applications installed. We'll see in the long term if giving up so much control is worth it for a security scan :|


Given the many malicious apps that pass through, I guess not.

Anyway, even Steam has given up, but at least they added curators before opening the Early Access floodgates.


Do you think there is still a chance for iPad games that are well-made and well-advertised on the internet, and that cost perhaps over $10, maybe even $20, to do well?

I see it this way: first you have the stuff that's top-ranking on the App Store, which is mostly F2P optimization problems. But hardcore gamers should be able to spread the word about an actually good iPad game, right? Or is the notion of an iPad game itself not hardcore enough?


I'm interested to know what changed in 8.3 that broke their game engine. It's supposed to be a minor update, after all.


A quick Google search led me to this thread: http://forums.toucharcade.com/showthread.php?t=259228

There are a few reports there of games on Unreal Engine 3 having graphical glitches and performance issues under 8.3. It looks like Unreal Engine 3 is not getting updates anymore[0], so it's possible that UE3 users are stuck. (I don't know if Spiderweb use UE3).

[0] https://www.unrealengine.com/previous-versions


I would be very, very surprised if they were using UE3, considering their games haven't seen much of a graphical update since... 1994? Really old school in literally every single way. Think Ultima Online, then take a step back from that.

However, I wouldn't be surprised if they suffered from a similar regression.


Could it be tied to the forced 64-bit support for iOS apps and updates from June 1st? https://developer.apple.com/news/?id=12172014b


64-bit changes in iOS 8.3 broke the way we were using third-party zip libraries in my iPad project. Before that, it had been completely stable since iOS 6. I am now rewriting in Swift and using Storyboards, with xcassets taking the place of zip files.
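For anyone facing the same migration, a minimal sketch of the asset-catalog side of it (the asset name here is made up for illustration):

    import UIKit

    // With images moved into an asset catalog (Images.xcassets), there is
    // no archive handling at all: UIKit picks the right scale/device
    // variant for a named asset.
    let tile = UIImage(named: "DungeonTile")

    // The old approach was roughly: unzip a bundled archive into Caches,
    // then load each file by path with UIImage(contentsOfFile:).

The trade-off is that catalog assets have to be known at build time, which may not suit engines that stream or download content.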


That will orphan apps that just want to deliver updates (e.g. app security fixes, unrelated to the OS) for older versions of iOS. Orphaned apps will need a distribution channel that can reach older and still-useful devices.


Honestly, Apple sometimes tends to break fairly important stuff even in the bugfix x.x.1 releases, especially in the iOS 7/8 versions, which have been... slightly more buggy than we'd like.


More details:

http://spiderwebforums.ipbhost.com/index.php?/topic/21507-av...

https://twitter.com/spiderwebsoft/status/588809965499850752

The tweet states "The deprecation of UIApplication and UIViewController."


So am I. I'm also interested to know why the engine can't be fixed, rather than being forced to replace it with an all-new engine.


It might be technically possible or even trivial to fix the engine, but impossible for other reasons.

They are talking of licensing a new engine. If they licensed the current one, too, it might just be a matter of a vendor that went out of business or killed the product.


Not the engine author, but from our app experience:

Tracking Apple's breakage (in minor releases, even) and fixing our code -- code that was written to the defined API -- is a cost that is rising precipitously as Apple's quality and stability keep going down.

If you have revenue from other platforms, it can make sense to just say "f it" and drop the platform. The AppStore is a shitty retail pipeline; 30% is a crazy commission for digital software distribution, the single-retailer-monopoly + appstore design is driving prices to unsustainable levels, etc.


If you're looking for a company to be the poster child of semantic versioning, Apple ain't it :)


If the engine was hybrid/webview based, that has been a mess since iOS 8 came out.



That is one of the reasons I avoid Apple as much as I can. The hardware is really good, and often the software is too. But the company politics just kills it for me.

They just don't care about the small developer (and sometimes not even about the customers). When they have an idea of how the world should be, they change everything needed and don't care about the effects on small software vendors or even some customers.

Apple once made that genius TV spot, "1984". For me, Apple has become what they accused IBM of being in 1984. The crazy thing is that the normal Apple customer does not care.


I teach both iOS/UIKit and OS X/AppKit in some of my courses and I have to rewrite my solutions every year. So much is deprecated and so much changes. If you write mildly sophisticated apps for iOS, just prepare to rewrite them every year.

Just look at the evolution of memory management from retain/release, to GC, to ARC; throw in properties, etc... Not to mention the APIs that are constantly deprecated. Even my model (non-UI) classes get overhauled.
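For anyone who missed that churn, a minimal sketch of where it landed (ARC-era Swift; the class names are invented for illustration):

    // Under ARC the compiler inserts retain/release for you; what's left
    // to manage by hand is cycle avoidance, typically via weak captures.
    class Player {
        var onLevelUp: (() -> Void)?
        func levelUp() { onLevelUp?() }
    }

    class HUD {
        let player: Player
        init(player: Player) {
            self.player = player
            // [weak self] breaks the HUD -> player -> closure -> HUD cycle
            player.onLevelUp = { [weak self] in self?.refresh() }
        }
        func refresh() { print("HUD refreshed") }
    }

Each step (retain/release, GC, ARC) changed what "correct" code looked like, which is exactly the kind of rewrite being described.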

And in the end the apps aren't doing anything better...


What would you recommend they do then? Not evolve the platform? Retain deprecated methods?

I don't get it.


Good retort. They certainly should push it forward. The continual overhaul is tiresome, though... Now they have an entirely new language! Not only will I rewrite all my solutions next year, I'll have to rewrite my notes from scratch and learn yet another language.

If all I did was develop iOS/OS X apps I would enjoy the continual improvements, but it's tough to keep up with when you also have to keep up with a variety of other technologies (like OpenGL! -- which I have used since IRIS GL).

The fact remains, rewrite your iOS apps every year.

Of course this pales compared to the crazy ADHD pace of JS frameworks -- that is truly beyond reason.


> 2. Changes in iOS 8.3 completely broke the engine we have been using for the last several years.

This. Apple has been a specialist at breaking compatibility; Microsoft has been much better about maintaining it. Backward compatibility is so important that not having it is pretty risky if you want to keep Mac developers.

Xcode is a good example of that. With every new Xcode version, things have to be reconfigured.


That's too bad, but I do understand the sentiment. I've been playing Spiderweb Software games since I was a kid, and did in fact buy one of the Avernum games for my iPad, which I quite enjoyed (though the graphics were a bit too small, and hard to click on in some places).

Still a great source of classic RPG gameplay.


Honest question: how/why is it legal for Apple to ban any other "app store" from existing? I understand the motivation to require the presence of an intermediary to combat malicious software, but it seems blatantly anti-competitive to forbid competition within the domain of providing this intermediation service.

Similarly, why haven't any major app companies challenged Apple on the 30% app store charge, and how it is inconsistently enforced? For example, rdio charges a premium if you purchase a subscription through the app store, but MLB radio does not.

I don't know much about these things, but as a user I think it's fair to conclude that Apple has not done a very good job of app store stewardship, and I personally would welcome potential new overlords.


They don't have to ban other app stores. All they have to do is require that apps that run on your device be signed by Apple's certificates... which requires the App Store ecosystem.

If you jailbreak device to be able to run unsigned code, of course this restriction goes away and you can use whatever app store you want... like Cydia, for instance.


It is illegal only when Apple owns the majority of the market share, that is, a monopoly. If iPhone users constituted more than 90% of all smartphone users, such a practice would be illegal. Right now, Apple can argue that a tightly controlled App Store is vital to their success, and any consumer who disagrees can simply switch to their competitor, Android.


> Competition on the App Store has risen to a frenzied level. As a result, sales for our games has dropped massively and the cost of advertising them has shot through the roof.

Indeed. It seems nowadays, with Apple and Google, that the company with the deepest pockets for advertisement takes it all.

Money is screwing up the evolutionary game that capitalism is trying to be.


You know something is wrong when I find myself not only agreeing with everything the author writes, but also fearful to publicly state that I agree, out of concern that it might hurt my own App Store submissions if Big Brother notices.

Every iOS update breaks our app, in stupidly unexplainable ways. It takes months of work just to maintain a steady state between app store versions, and I feel that the magnitude of work required to keep up with iOS 7 and now iOS 8 was much worse than going from 5 to 6.

Android is not much better, but I feel that iOS is slowly getting worse and it makes me sad.

Edit: Spelling


Sort of "meta," but I am getting the feeling that the whole "mobile is the future everything else is dead make everything for phones!" era is going the way of "web/cloud is the future everything else is dead write everything for the browser!"

Obviously mobile ain't going anywhere any more than the web is... I am referring to the mobile-only and everything else is dead mania. I feel like it has maybe a year or two left in it.

Question is: what is the next thing that is "the future" and will make everything else "dead"? Internet of things?


I really miss native applications on the desktop. I want to use something responsive and snappy that can work offline. Almost no productivity app beyond the usual suspects does this anymore; it's all web- or mobile-based. I'm fine with cloud syncing, just give me the native apps on the desktop that I miss!


The enterprise software "Social software is the Future" meme also thankfully seems to be dying down, but not before untold billions were invested in Yammer, IBM, Salesforce and others trying to clone Facebook and turn it into a business product.


Visible battlefield casualties tend to scare inbound bright-eyed-and-bushy-tailed recruits, until recruiters invest in cleanup crews.


Apple isn't changing; they've always wrested control of their platform away from developers. This approach is obviously working for them; this time next year we'll all be talking about "watch first web development."

I read the article and while Jeff made valid points, I wonder if he would've benefited from having a PR/marketing person vet his messaging. His article can easily be dismissed as hyperventilating about change and as a reader it's unclear what action I should take after absorbing his point of view.


> this time next year we'll all be talking about "watch first web development."

This doesn't diminish your overall point, but a little side note - Apple Watch has no browser, and it'll never have one.


This is a single twist of fate. It has no meaning except for the individual. He just doesn't want to keep up any more. His decision. But it has nothing to do with Apple.


Nothing to do with Apple? Since iOS 7, Apple has constantly been breaking apps that are already in the store and running fine. This is because they don't provide a backwards-compatible runtime. For example, your app that's compiled and working well for iOS 8.2 can have new bugs (like text fields not appearing) after the user upgrades to iOS 8.3. To reiterate: the user has not re-installed your app. They are still using the same great app compiled for 8.2, and now it doesn't work correctly. And the bug is caused by bad code from Apple in their runtime. To use an ironic comparison, if you made a Flash game in Flash 9, it would always run exactly the same way until the end of time, even after Flash 13 was released, because Adobe/Macromedia kept the runtime for each version and did not change it after release. A new version meant a new runtime engine.

Anyway, so Apple breaks your app. It's possible you don't notice this, because you don't test every feature of every app you've created on every new iOS version the day it comes out. So within a couple weeks, most of your clients are asking you why your app doesn't work, and you have to scramble to replicate the problem and then somehow fix it, sometimes having to rewrite a core part of your app. This happens over and over, and it's tiring, costs your company money, makes you look bad to your clients, and is very demoralizing. This is no one's fault but Apple's.
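The usual coping pattern, for what it's worth, is a runtime version check gating a workaround; a sketch (the layout call here is a placeholder, not the actual fix for the 8.3 text-field bug):

    import UIKit

    // Detect the OS point release that changed behavior, and branch.
    let os = ProcessInfo.processInfo.operatingSystemVersion
    let needs83Workaround = os.majorVersion == 8 && os.minorVersion >= 3

    func configure(_ field: UITextField) {
        if needs83Workaround {
            // e.g. force the layout pass the old runtime did implicitly
            field.superview?.layoutIfNeeded()
        }
    }

It works, but every one of these branches is pure overhead that exists only because the runtime shifted underneath a shipped binary.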


But it has nothing to do with Apple.

On the contrary. Part of the problem is Apple's backward compatibility, or relative lack thereof.

Apple's API stability is pure cowboy. I don't think we should pretend otherwise, just because we like their products.


> Competition on the App Store has risen to a frenzied level.

I'm Portuguese. I have an iPad, but I've never bought any app. The market is so full of good free apps that I never needed to buy one. Most of my friends don't spend much on apps either. This makes me think that the App Store gets money only from very specific countries. Will the App Store become like the Google Play Store as soon as these countries get used to getting apps for free?


Curious why a developer of low-tech-requirements games like these wouldn't use something like Unity and eliminate most of the multi-platform hurdles.


From the screenshots it appears that Unity would be a great choice. Cocos2d-x would probably be a good choice too. But the post specifically mentions the costs of licensing and learning a new engine as one of the reasons. Some people just don't want to learn new stuff.


He has been refining his games and engines using the same software since the 90s.


I think Unity would be overkill for the types of games he makes.


There was a story on here a month or two ago interviewing the founder of the company. This is basically a one-man shop, and from the few games I have played in their offerings, they look like a little more work to put together than Flappy Bird.

If it's too much of a hassle for the return he's getting on it, then it makes perfect sense to refocus efforts.


Understandably, most comments are gonna focus on App Store discovery, etc.

But one thing that stands out for me is a reliance on 3rd party frameworks and a lack of desire to learn new technology.

The problem with a framework is that when things break they can be a nightmare to fix. Work closer to the system frameworks and you face a steeper learning curve, but you gain confidence for when things go wrong.

As far as ever-evolving technology -- well, maybe it can feel daunting, but it's also exciting and fun.


Because it was a bad idea to split the developer ecosystem into two desktop/mobile parts from the start.

My ideal computer before iPads existed was a convertible 2-in-1 MacBook Air. It will never be manufactured, and therefore it will never be my main machine.

Ubuntu and Windows are working on unifying them again. Let's see what happens in the future.

I'm writing this on a Dell Inspiron 7000 2-in-1 running Ubuntu, the closest thing I could find to my dream machine.


I agree with a lot of the moaning about competition etc., but the tech argument is basically an admission of incompetence. The complaint boils down to "we used rubbish tech and didn't plan for the future", and that's not the fault of Apple or the iPad...

Poor workmen blame their tools.

It's funny how my hand-rolled stuff rarely hits these problems and is easy to fix. But then, years and years of working on every major gaming platform helps you think these things through properly in advance...

As far as platforms to adapt to go, iOS is on the very easy end of the spectrum. You should see what Microsoft has been trying to do... At least Apple lets us write native code with only minimal layers of their crap, and doesn't break too heavily the standard libraries that have served us so well for the last 40 years. Android is almost as bad... I'm convinced Google wants to kill native code for whatever insane philosophical reason they have.


The first and main reason they give:

"1. Competition on the App Store has risen to a frenzied level. As a result, sales for our games has dropped massively and the cost of advertising them has shot through the roof."

They probably had the idea that mere hard work in creating a game is rewarded in the market (any market).


Only one of those reasons is legit: you need a marketing budget to win at games on iOS.

The rest? Weak excuses imho.


I wonder if they're still developing for Android, the site and article seem to imply so.


The politics around software updates (both OSs & apps) in recent time show that whoever builds on Apple is building on quicksand.

Overall software quality began to deteriorate visibly with the introduction of OS X and hasn't gotten better in the meantime. This is not because the bugs would be hard to find or fix, but due to a deliberate policy change. And iOS is no different. Just look at both App Stores; you can see the middle finger sticking out.

They changed their stance from “We care about users” to “My way or the highway.” This works as long as the current fad is all the rage, but when it's gone there will be nothing left.

Apple has gotten too stupid for their own good, i.e. a company like any other.


> My brain no longer has the time and energy to deal with Apple forcing me to relearn how to program every few years.

Is this just kind of how developing software works on every platform?


Not at all. Most platforms are slow to add new features, and even slower to remove them. This is why we can still play videogames from 10+ years ago on modern Windows systems.

Has this held Windows back? Yes, a bit. But I for one appreciate that I don't have to buy all new software every year for it.


And even in cases where compatibility is broken, there's still the possibility on Windows or GNU/Linux or even OS X to fire up something along the lines of DOSBox or VirtualBox or somesuch and be able to recreate the necessary environment to play 15, 20, even 30-year-old games. Last I checked (which was admittedly a long time ago, seeing as my ship-jump from iOS to Android was a few years ago), you have to jailbreak an iOS device in order to achieve similar functionality thanks to Apple's rather-draconian restrictions on such things.

Apple's attitude toward iOS - move fast and break things - is in stark contrast to even OS X, where a lot of software made for OS X versions as old as 10.5 will still run relatively happily (though apparently Yosemite broke quite a few things).

That said, there's a time and a place. On my servers, for example (which are typically running OpenBSD or some GNU/Linux distro), I'm at least somewhat-okay with backwards-compatibility breaking (within reason) if it means having better performance or security. In that context, I'm relying far less on closed-source software (if I'm using closed-source software at all, which is incredibly rare for me when it comes to servers), which means that most of the software I actually rely on will at the very least usually have community-supported patches up the wazoo to support those backwards-compatibility-breaking changes (like OpenBSD's switch to a 64-bit time_t, which basically just amounted to recompiling a bunch of stuff - most of which was already done by the ports tree maintainers when creating binary packages).


Well, compared to Windows, which up until Windows 7 let you run a bunch of very old games (and even then you could still get some of them to work in compatibility mode), I'd say no.

And that's a software platform that has to run on a bunch of different hardware.

Also, your Xbox 360 games will all still work on your Xbox 360, as long as you have the hardware around. AFAIK all the Xbox 360 software updates never broke any Xbox 360 game.


To back this up, Spiderweb's original Exile series still works perfectly well on modern Windows machines. I dug up my disk for Blades of Exile a while back and spent some time playing it on my Linux machine without anything other than some very minor graphical glitches. The Mac versions haven't been playable without an emulator for a very long time. And in this case, we're talking about the most recent iteration of SW's engine (they seem to remake Exile with a more modern engine and refreshed graphics every so often; the current Avernum 2 is the second remake of Exile II). Software platform vendors should really make an effort to maintain backwards compatibility for at least a few years -- especially when the platform is so centrally controlled.


Up until recently, consoles have been a bit of a weird case, since "game" in that context usually meant something more like "game plus a baked-in operating system communicating directly with the hardware". The Xbox 360 and PS3 were among the first major systems (outside of the PC realm) to run games on top of an existing operating system. Even somewhat-recent consoles that ran proper operating systems (original Xbox, Wii, PS2, GameCube IIRC) treated them more like really complicated firmware/bootloaders; execution would be transferred to the game itself even as late as the Wii (where each game would implement certain things, like the Wii home-button menu, individually).

In such cases, most console games were thus mostly protected from software updates on the consoles themselves, but this also made it much harder for developers to patch their games after release (they'd basically have to rerelease or otherwise introduce fixed copies of the game alongside broken ones). On the plus side, it meant that the only time one would likely have to learn a new programming method/framework/etc. was when programming for a new platform.


The baked-in operating system thing is especially visible with Nintendo DS games, where the network stack and configuration UI was shipped on the cartridge. Wifi has to be configured in each individual game, and it only supports WEP, even if the game is run on a newer handheld.


>> and even then you could still get some of them to work in compatibility mode

Barring that, you still have the option of spinning up a VM running an older version of Windows to play those really old games.

I actually have to use a VM to play some recent games (in my particular case, Street Fighter X Tekken, Skullgirls, Blazblue Calamity Trigger), because NVidia's Optimus tech seems to prevent some games from running on my Lenovo Y50.

Fortunately VMWare virtualizes the GPU workload quite well. I was surprised that I could easily get >60FPS at 1080p with Street Fighter X Tekken inside a VM.


He's developing isometric role-playing games. Where you can probably happily get by with the same stuff you used 20 years ago (and probably older, minus most Abrashian optimization and hardware trickery).


No. Overall complexity is going through the roof across the computing market: microservices, game engines, cryptocurrency. Better hold on to something.


Did someone invent a whole new language for their platform and solely for that platform like Swift?


Nobody is forcing you to use Swift.


.NET



Hugged to death?


To sum up:

* "we have competition and that's hard"

* "writing code is hard"

* "no srsly, it's really hard yo"


I expect this may happen to Apple Watch sooner rather than later.


WebGL


I don't really understand. I searched out YouTube videos of his games; they are RPGs with very low-quality visuals. If his engine is dead, he could make something better anyway by just taking a week out to watch Unity tutorials. And that is without even buying one of the many RPG kits off the Asset Store.


Back in the time frame of the Leopard to Snow Leopard transition, I visited an Apple Store (for the last time), whereupon I informed a 'genius' that on Windows I had code I needed (or was required) to run every day that was older than him, and that Apple's complete disregard for backward compatibility lost me (and everyone I know) as a customer forever.


The image of you entering a likely-crowded Apple store and berating a random retail store employee because OS X doesn't support decades old software is pretty hilarious.


Good job getting upset at a retail clerk making not much more than minimum wage over CEO-level decisions?


And were you seriously running binaries from 1987 (give or take)? Because you might as well stuff that thing in DOSBox.


Progress is hard.


And not a single thing of value was lost.


I admire the developer, but the whole post feels like a cop-out blaming Apple. Too much competition? Let's give up. Engine broke from some change, don't want to try fixing it? Let's give up.

Ads cost too much? Not, 'let's stop advertising on iOS', instead, Let's give up.

I can understand as much as the next iOS developer that changes that require large workarounds due to software/hardware updates that aren't backwards compatible are a pain. But as a developer, you suck it up, modify your content, and continue (Edit: Of course, if it makes financial sense, positive ROI, etc, to keep selling on the Appstore).

Why do developers feel entitled to hardware devices that will always work with legacy code? Apple makes changes that push their technology forward to the latest and greatest, removing legacy cruft, sometimes at the expense of backwards compatibility. It makes sense that they expect their developers to keep up with the latest trends and produce content formatted to the current specs, not simply rely on old tech specs.


> Why do developers feel entitled to hardware devices that will always work with legacy code?

"Entitled" isn't really the word I would use there. It's not like developers throughout history have forced poor platform vendors to provide backwards compatibility at gunpoint. It's that most platform developers have provided backwards compatibility because doing so helps them attract developers. This has been true literally for fifty years -- see https://en.wikipedia.org/wiki/IBM_System/360 for the first platform vendor to figure it out. Backwards compatibility is part of the work you do to convince people to use your platform over the alternatives. Omitting it can make developing your platform easier, but that ease comes at the cost of forcing you to work harder to convince people to get on board.

Apple's within their rights to decide they don't need to work hard to attract developers anymore, of course, but developers are every bit as much within their rights to decide to take their products elsewhere.

Whether it's smart business for Apple to decide they don't need to work hard to attract developers is another matter, of course. History suggests that is an attitude that eventually comes back to bite you. (Ask anyone at Microsoft, they can tell you alllllll about it.)


Why Microsoft as an example? They are pretty nuts about backward compatibility, including fixing other people's bugs so their software would continue to run (the old SimCity bug). I believe I can still run Doom and, say, Ventura Publisher 3.0 on vanilla Windows 8 (although I'd prefer DOSBox, of course, at least for games).
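For the curious, the SimCity fix is usually described like this: Windows detected the game and ran the memory allocator in a special mode that didn't immediately reuse freed memory, papering over a use-after-free bug in the game. Here's a toy sketch of the idea (in Swift, since this is an Apple thread; the names, detection flag, and quarantine depth are my own invention, not the actual shim):

    import Foundation

    // Toy compatibility shim: when a known buggy legacy app is
    // detected, quarantine freed blocks instead of releasing them
    // right away, so the app's stale pointers keep "working".
    let legacyAppDetected = true            // real shims key off the process name
    var quarantine: [UnsafeMutableRawPointer] = []

    func shimFree(_ block: UnsafeMutableRawPointer) {
        guard legacyAppDetected else {
            free(block)                     // normal path: release immediately
            return
        }
        quarantine.append(block)            // keep the block alive for a while
        if quarantine.count > 64 {          // arbitrary quarantine depth
            free(quarantine.removeFirst())  // eventually release the oldest block
        }
    }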


It's true, MS cares more about back-compat than nearly anybody else. I was thinking there more of the "they'll use our platform whether they like it or not, because where else are they gonna go?" attitude that led to debacles like the Long IE6 Winter.


It makes sense to get out when it's not profitable. In this case, it's a small segment of their sales. They said so directly in their explanation. If there were no sales on iOS, would that be giving up, or just accepting it's not a profitable segment for your business?

> Why do developers feel entitled to hardware devices that will always work with legacy code?

First, I'm not sure where you are getting that sentiment from. They simply said they aren't going to support iOS devices, and listed why. They didn't complain Apple was doing something wrong.

Second, how is this in any way related to the hardware? An OS software revision caused the latest problem for them.


I think I got the sentiment from points 2 and 3. Both made it seem like the developer was blaming Apple's changes, and the effort those changes forced on the developer, for stopping iOS production. In reality, however, the real problems making the app not financially viable don't seem to hinge on Apple's changes here.

Hardware was just another instance from my own experience where Apple makes things a touch more difficult with new device sizes/specs and so on every year or so. That wasn't the case in this article, but it is something Apple developers face; come to think of it, I guess this doesn't really apply solely to Apple.


It'd be enlightening to this conversation to know what caused their engine to break, before we assign blame.


> But as a developer, you suck it up, modify your content, and continue.

Or decide life's too short for this shit and make your money doing something more fun.


> Why do developers feel entitled to hardware devices that will always work with legacy code?

Because that's how the world outside the Apple bubble works. You can still run older Windows software, you can still run older Linux software, you can still run older Android software.

Not forcing people to pay for new electronics (especially when that can represent several months of their earnings) just to keep their existing software running and services working is a GOOD thing.


I'm going to play devil's advocate and say that perhaps it's not a GOOD thing necessarily.

Maybe keeping existing software/services running in legacy mode isn't really GOOD... perhaps Apple indirectly forcing developers to keep their apps updated is an even better situation for the end user?

That's not to say we should drop backwards compatibility with every revision. But I think there are plenty of occasions where it's acceptable to expect up-to-date software running on up-to-date devices as the norm.


I didn't really get that from the post myself - I read it more as just "the amount of effort we'd need to put in to get it working again just isn't worth it given the sales we have on the platform". It really just sounds like they don't have the time/resources to put into the port.

Which is fair enough. Still a shame though given how highly regarded the Spiderweb games are.


I disagree. Life and business are always about ROI and that is a personal gauge. The developer is stating the reasons they feel the ROI has tipped not in their favour anymore. I think it's perfectly valid if disappointing from the consumer's point of view.


And it makes sense to get out if the changes to the ecosystem cost you more (money/time/effort) than the income you get from it. No reason to "suck it up" if you have less painful alternatives.

Or:

Why do hardware companies feel so entitled to force developers to change working things for free?


"But as a developer, you suck it up, modify your content, and continue."

No, you're supposed to collectively bitch, moan, and whine, and if they still don't listen, you leave, taking as many people with you as possible.

Yours is the wrong attitude. We can do so much better. Developers shouldn't gather around platforms that are constantly shifting...


The entire software industry relies on changes in hardware and breaking changes in operating systems and libraries to create opportunity. Without the constant drumbeat of change, there's no need for new anything.


If you think about this as small-scale evolution, it makes sense. On the other hand, I understand why the guy is unhappy. Apple is breaking compatibility for no good reason, mostly to promote their new shiny things (like Swift), which don't bring enough to the table to justify breaking compatibility. Somebody pointed out in this thread that it is possible to roll out a new platform that is compatible with the previous one while offering new features at the same time. To summarize: platforms are supposed to be stable and backward compatible, and should only break compatibility when it is absolutely a must. Investing in a new stack is not cheap. I feel sorry for those who are at the mercy of Apple in this regard.
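To be fair, Swift itself is an example of that model: it can drive the existing Objective-C stack directly, so adopting the new language doesn't require abandoning the old code. A minimal sketch in modern Swift (DateFormatter and NSMutableArray are long-standing Objective-C Foundation classes):

    import Foundation

    // New-language code calling the old Objective-C stack unchanged:
    // DateFormatter (NSDateFormatter) and NSMutableArray are classic
    // Foundation classes, usable from Swift without a rewrite.
    let formatter = DateFormatter()
    formatter.dateStyle = .medium
    print(formatter.string(from: Date()))

    let legacy = NSMutableArray()        // untouched Objective-C collection
    legacy.add("still works from Swift")
    print(legacy.count)                  // 1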


> Competition on the App Store has risen to a frenzied level. As a result, sales for our games has dropped massively and the cost of advertising them has shot through the roof.

So... actually make good games? I don't know this studio's games, but if you can't shine in an app store anymore just because there are more competitors, maybe your products weren't that good to begin with?


Unfortunately, what makes a "good" game and what earns money on app stores are somewhat disconnected at this time. Skinner boxes that rely on practically scummy ways of selling expensive IAPs for a basic amount of gameplay are the ones that dominate the earnings.


Which sucks because some of the games are quite good, or could be without the IAP stuff. I do enjoy Peggle and would be willing to pay a few bucks for it. Instead the only option I have is to get it for free and only play a few rounds per day until it's begging me for money. And not a one-time purchase, literally 99c per day. No thanks. And an Age of Empires style game would be awesome on mobile! But not a "it takes five hours to build this building unless you pay lots of money" style game.

It would be nice to have a platform that outlawed that kind of stuff, so real games for real prices could go back to being competitive. It's to the point where I won't even download a game that they won't let me pay for upfront (and even then, I find some $1.99 games that have stupid IAP requirements, and that pisses me off even more).


http://en.wikipedia.org/wiki/Spiderweb_Software

Seems they've gotten pretty good reviews from critics and fans alike. (I personally enjoyed Avadon, which came with one of the Humble Android Bundles.)



