A little-known iPhone feature that lets blind people see with their fingers (yahoo.com)
304 points by Tomte on March 11, 2017 | 107 comments



My daughter, who has Fragile X Syndrome, has used an iPod touch for years now. She's 14, and reads at about a grade-one level. Between the screen reader that will read articles, texts, and almost anything else on the screen, and the predictive typing and dictation, the iPod is an incredible device for her.

She has an Android phone, but it's not even close in terms of the accessibility options provided. Apple is excellent in this regard.


There's an iOS accessibility engineer who's blind [1]. It's amazing how dedicated they are to accessibility features.

[1] http://mashable.com/2016/07/10/apple-innovation-blind-engine...


There's also a blind Android accessibility engineer at Google.

The differences probably come more from the splintered Android landscape and Apple's stricter review process.

Source: met him


I don't think the review process matters, because there are plenty of apps that don't support VoiceOver at all on iOS. I think it comes down to Apple supporting it much earlier and advocating for it. More blind people started using iOS, and then support in apps started getting better.


A major difference (that the review process could influence) is that iOS apps seem to be pushed toward using native controls whenever possible (or abstraction libraries atop those controls), where Android tends to have a lot of custom controls. The native OS controls have very good accessibility, so the average iOS app (which leans heavily on native controls) should be more accessible than the average Android app (which doesn't.)


> She has an Android phone, but it's not even close in terms of the accessibility options provided.

Anything other than the price that made you/her go with an Android as the phone?


Just price. $100 for the phone (Moto E) makes it less painful if she loses it. The cost difference between the iPod and the iPhone is a bit steep so the Android phone provides emergency connectivity on the go.


I had a user once request that I add VoiceOver support to my app. I was surprised by how easy it was to add. If I recall correctly, it was just a few fields you have to fill out on your UI components. So once I learned how to use them, I've filled out those fields on every project since.
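
Something like this (a rough Swift sketch of the UIAccessibility properties on UIView; the controller and control names are invented for illustration):

  import UIKit

  class OrderViewController: UIViewController {
      let submitButton = UIButton(type: .system)
      let statusIcon = UIImageView()

      override func viewDidLoad() {
          super.viewDidLoad()

          // Native controls like UIButton are accessible by default:
          // VoiceOver reads the title you already set.
          submitButton.setTitle("Submit", for: .normal)
          submitButton.accessibilityHint = "Sends your order"

          // Custom or image-based views need the fields filled in by hand.
          statusIcon.isAccessibilityElement = true
          statusIcon.accessibilityLabel = "Order status: preparing"
      }
  }

(The same fields can also be set in Interface Builder's Identity inspector.)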


It's too bad you do actually have to fill them out to activate the functionality, rather than there being some intelligent default enabling accessibility (if clumsy) from the first time you drop a control in Interface Builder.


When I see stories like this one, read here how good the accessibility is in iPhones, and read about how good their security is for end users, I don't understand why it's trendy to tear down Apple and Tim Cook.

What a great citizen of the IT community and the world. It would be a great loss if new management came in and decided that all of this wasn't profitable.


Apple's response to Android was lawsuits over ridiculous things such as "having round corners" (and thus violating Apple's design patent). They make a great product, but they're an absolutely terrible corporate citizen.


> I don't understand why it's trendy to tear down Apple and Tim Cook.

They made their decision on what their externally-facing corporate culture would be. It's no one's fault but their own that people hold them to a higher standard than they may hold others to.

> What a great citizen of the IT community and the world.

They're leading the day-to-day normalization of the idea that end users don't get (and shouldn't have) ultimate control over their devices. Calling them a "great citizen" is an incredibly frustrating thing to read.


> They're leading the day-to-day normalization of the idea that end users don't get (and shouldn't have) ultimate control over their devices.

Or said another way, the vast majority of users don't care to have complete control over their devices and place more value on Apple's approach.


At least they're not handing this control to the NSA/CIA. Apple's also made a few important steps towards keeping data encrypted and on device, instead of being sprayed across the internet (the Mac's new Photos.app is a great example of this).


Right. 'Control' can mean a lot of things, depending on what it is you actually want a thing to do for you. Us nerds put a disproportionately high value on things being (for example) 'open' with data, but somebody else - such as the interviewee in this article - may have different needs.

I think many people feel 'empowered' using an iPhone just the way it is. It allows them control over their lives, not the technology.

Control is just the ability to direct a thing to behave in line with your intentions. I think Apple understands that more than others, even if that's not always in line with what us nerds want it to mean.


Also, I feel more in control of my Mac than my PC, now that Microsoft are putting spyware-like stuff in the updates.


> They're leading the day-to-day normalization of the idea that end users don't get (and shouldn't have) ultimate control over their devices.

In the video it looks like they've given a blind person ultimate control over their device.

(I know what you meant by "ultimate control," but don't think it's a very human thing in your context.)


I find VoiceOver handy while driving. I'm never doing anything complicated, but it's an awesome complement to Siri for getting things done with minimal distractions.

  - Easy to turn on/off with triple tap home button shortcut
  - Doesn't require looking at the screen at all
  - Double, two finger tap to play/pause podcast player or music
  - Automatically reads incoming notifications
I highly discourage trying to learn to use VoiceOver while driving. But if you're reasonably familiar with it (say, from testing iOS apps for basic compatibility), it can be useful.


Interesting. I used to use VoiceOver when doing platelet apheresis (lying down in a bed for an hour and a half) and I found that a lot of stuff I wanted to do other than play/pause needed two hands to perform reliably, like making the rotor gestures. I'm having trouble imagining what's easier and safer to do with VoiceOver while driving compared to glancing at the phone for a tenth of a second and then positioning your finger later based on what's on the screen.


VoiceOver is just better designed. I have a blind friend who never figured out how to use an Android device because of the radial menus. I had to explain to him exactly how to use them, because the way Android teaches it to users is by SHOWING A DIAGRAM ON THE SCREEN. Way to go, Google.


If you're interested in VoiceOver, AppleVis is a great community resource for learning more about it: http://www.applevis.com

If you want to see more of it in action, you can check out the Apple Design awards, where VoiceOver engineers demo apps:

https://developer.apple.com/videos/play/wwdc2015/103/ (36:30, disclaimer: I work on this app)

https://developer.apple.com/videos/play/wwdc2016/103/ (55:45)


[flagged]


The heavy downvoting you got last time you mentioned this was a signal.

Misusing a word is literally the least interesting thing about conradev's post.


I'm sad for you that trawling someone's comment history looking for downvotes is meaningful to you. Maybe it's a signal.

For the record though, since it seems important to you - there was no "heavy downvoting". Even if there had been, please consider that you don't speak for HN, and couldn't possibly know the motivation.


Your comment "here we go again" prompted the search. It's a simple search: [author:mbrookes disclaimer] - hardly trawling.

You're right, I don't speak for HN. But note that your comment here got flag-killed, and your previous comment got downvotes. Some people want to know why their comments get downvotes and flags. I'm telling you: low value English usage nitpicks get downvotes.


This is only tangentially related, but VizWiz[1] is an app developed by researchers to help blind people "see" objects around them.

The user takes a picture of the object with their phone and asks a query, and it is uploaded to a crowdsourcing platform (Mechanical Turk, I think) where workers answer the query, e.g. "What are the ingredients in this can of food?"

[1] http://www.vizwiz.org/


There is also TapTapSee [1], which gives you a description of a picture; BeSpecular [2], which sends pictures plus a question to volunteers; and Be My Eyes [3], which connects you to a volunteer by video call. My wife Vicky did an interview with the BBC about Be My Eyes. [4]

[1] http://taptapseeapp.com/ [2] https://www.bespecular.com/ [3] http://www.bemyeyes.org/ [4] http://www.bbc.co.uk/news/magazine-39056979


That's great.

"combines automatic image processing, anonymous web workers, and members of the user's social network in order to collect fast and accurate answers to their questions."


A friend is blind. The iPhone+voiceover is his primary means of conducting business. We've often talked about UX. Here's what I've learned:

Voiceover is currently an extra layer over a visual interface. So, it amplifies the gesture cost of a visual+touch UI.

After writing an NLP interface for a productivity app, I learned that most verbal interactions took less time than the visual alternative. It didn't matter if you were sighted or blind. Fitts's law meant that navigating a visual+touch UI was too slow.

So, I'm now working on a verbal+touch UI. In that regard, voiceover users are power users.


Interesting. I've reached the same conclusion and am working on my own idea. Which app are you working on?


The Achilles heel for voice-driven UI, though, is that there is no silent mode. Don't you therefore have to create both a GUI and a VUI?


Yes, I noticed my friend using VoiceOver in situations that would normally warrant silent operation. The white cane allows him some slack. I'm attempting a universal product. So, a two-part workaround:

1) use VoiceOver when headphones are connected

2) use the Taptic Engine for cues about marked events

This year, pervasive Bluetooth headsets make 1) a lot more viable.

Because sighted users may use headsets a bit less, I've also included graphic cues.

"VUI" - first time I've heard of a VUI. Like it.


I'm tired of graphics on a small screen. I believe millions would agree with me. So no GUI is not a big deal.


Hmm... perhaps a braille terminal with two to four characters on the bottom of a wrist watch?


Braille is read like English: by the shape of a word or set phrase.


I'm developing a productivity app on the Apple Watch (watchOS 3), using STT, TTS, and the Taptic Engine. It turns out that applying the same limited interaction area on the iPhone eliminates the need for "Reachability".
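
For the STT side, the Speech framework that shipped with iOS 10 handles dictation in a few lines. A rough sketch (file-based for brevity; on the watch itself the system dictation sheet is the easier route):

  import Speech

  // Sketch: transcribe a recorded audio file with SFSpeechRecognizer.
  // (A real app would keep the recognizer alive for the task's duration.)
  func transcribe(fileURL: URL) {
      SFSpeechRecognizer.requestAuthorization { status in
          guard status == .authorized else { return }
          let request = SFSpeechURLRecognitionRequest(url: fileURL)
          _ = SFSpeechRecognizer()?.recognitionTask(with: request) { result, _ in
              if let result = result, result.isFinal {
                  print(result.bestTranscription.formattedString)
              }
          }
      }
  }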

How about you?


We developed Vorail for blind people. It turns out they really like it. I've finished a prototype on Android to examine the idea of "Sound Virtual Reality". Apart from the fancy name, it plays multiple streams of audio from different directions (left, front, and right), and allows users to swipe on the screen to "move" around.
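
On iOS the directional part would be just as small. A rough sketch with AVAudioPlayer's stereo pan (file names invented; true 3D positioning would use AVAudioEnvironmentNode instead):

  import AVFoundation

  var players: [AVAudioPlayer] = []  // keep strong references so playback continues

  // Sketch: three simultaneous streams placed left, front (center), and right.
  func startScene() throws {
      let scene: [(file: String, pan: Float)] = [
          ("left-stream", -1.0), ("front-stream", 0.0), ("right-stream", 1.0)
      ]
      for (file, pan) in scene {
          guard let url = Bundle.main.url(forResource: file, withExtension: "m4a")
              else { continue }
          let player = try AVAudioPlayer(contentsOf: url)
          player.pan = pan           // -1 = hard left, 0 = center, 1 = hard right
          player.numberOfLoops = -1  // loop until stopped
          player.play()
          players.append(player)
      }
  }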


I have mixed feelings about this. On one side, I'm very happy for blind people being enabled by software and hardware.

On the other side, I'm concerned that people have to use proprietary software to enable their sight. I doubt that there is open source software for blind people of similar quality. Am I wrong? I hope there is such software.

What would RMS do if he had a choice between using a mobile phone with proprietary software to enable sight and remaining blind?


If I could add my two cents here: I am quadriplegic, and if I want to use a modern mobile phone, laptop, or desktop, I have to use Apple products. Their accessibility software is second to none and no other company comes close. I can take a new device out of the box and have an able-bodied person set it up for me in about five minutes, and from then on I need no other help.

Everything an able-bodied person can do with an iPhone, I can do with my chin. Usually with Microsoft, Android, and other open source offerings, the accessibility software is always some subset of options and not the full experience. With Apple I get to use every aspect of every device, despite the fact I can't move my arms and legs.

I would absolutely love to switch to Linux as my daily operating system, but I can't, because the development hasn't been done and it just isn't possible for me to use it as it is. So stuck with Apple I am. If Linux had the kind of support Apple does, I would switch in a heartbeat, but it doesn't, unfortunately.

They really are world leading in providing accessibility software for those of us with profound disabilities, and I've spent a decade looking.

Full disclosure: I helped beta test the last couple of versions of the accessibility software for iOS.


Preface: I'm able-bodied, have reasonable sight, etc. In other words, I'm commenting on things I'm certainly not qualified to comment on.

As someone who's used Linux for many, many, many years... I want to find fault with your conclusion that Linux and Open Source in general can't do this, but everything I've seen suggests I'm not very likely to be able to.

Even for able-bodied people, Linux isn't all that accessible. One example is poor font rendering (yes, Chrome renders nicely, but it's nowhere near consistent across all the applications I use day to day). Another is the fractured UI landscape, where apps tend to be KDE, GNOME, or Java, and each platform's idiosyncrasies leak in, leaving things like iconography, menu placement and organisation, and general UI style different in each application.

I can't help but think, until we can sort this style of issue out, we have no chance of sorting out true accessibility. I'm reminded of XKCD's competing standards comic, where the most likely outcome here is probably yet another competing "standard".


I think it's just that Apple in this case has complete control over the ecosystem so they can mandate screen reader support (there's probably a Linux utility out there which only supports GTK), has a culture of perfection (there's no way that these features provide a good RoI), and is able to charge a high enough premium to subsidise these things.

I read Free Software, Free Society and I was amazed at how RMS's economic plan was basically: 1) unis fund early development, 2) developers flesh out the ecosystem for free, 3) users start flooding in, and 4) these users magically start paying (or the hardware manufacturers do).


What this model misses is that it takes a lot more investment to make great software than just some developer time. It takes designers, artists, testers and, depending on the software, experts in specific fields. That's a huge investment in resources completely outside the software development field itself. Universities by themselves just don't have the research and investment capacity to carry advances in every single domain software can be applied to. They have other things to do too.

Back in the 60s it might have looked like the majority of effort to develop software just took some time by developers, but nowadays in many areas the developer time is a fraction of the overall effort required. It takes huge investment and if there is one thing Capitalism is good at, it's the productive and efficient deployment of capital. It's right there in the name.


Apple has control over the hardware, the operating system and their own apps but not over the entire ecosystem. This gets them far but only so far.

The big difference they've made is that they've created tools that make integration with VoiceOver so easy that it's a no-brainer to do what little work needs to be done to make an app accessible, and to increase demand for it.

They've managed to turn a burden into something that has an obvious economic benefit for app developers.

Also I'm fairly sure that there is actually a good RoI. It might not be massive but in the long term having the monopoly on phones for blind people surely pays off.


> I think it's just that Apple in this case has complete control over the ecosystem so they can mandate screen reader support

I don't believe they mandate it (how would that work for games?), but Apple provides excellent tooling and support throughout the system and it's built into all native controls so adding it to an application requires relatively little effort. See joshaidan's comment above: https://news.ycombinator.com/item?id=13846160


First, I don't think that "enabling sight" is the right term to use here. It's just making a certain device usable to the blind.

And yes, there are open source and free options available. iOS' accessibility APIs are closed to third parties and VoiceOver is the only option on that platform. However, there is free and open source screen reading software for Windows, Linux and Android.

I can't say what the state of Orca, the free Linux screen reader, is these days, but I know development is still going on. Orca itself wasn't so much of a problem in the past when I tried it; it was more that it could be a real pain to get everything working together. Think reasonable low-latency sound output for speech, driving a braille display through the BRLTTY software, getting the screen reader running at the login screen, etc. I hope that has improved by now, but I only interact with Linux through SSH sessions or a local text console these days.

Then there is NVDA for Windows, a free and open source screen reader mainly developed by two blind guys. On many fronts it has feature parity with the very expensive commercial offerings, and on certain points it even surpasses them. I use it as my daily driver.

In the past I also used a Mac nearly full-time, but VoiceOver on macOS became too buggy for my professional work. Also, updates usually only came when the OS was updated, so fixes and new features could take a while. So, long story short: an open screen reader on a closed operating system that provides stable APIs seems to be the best of both worlds for now.


RMS is an extremist. I don't think it's reasonable to be upset that there isn't high-quality free software for every possible niche.


RMS is more of a realist than people think. He doesn't believe in owning phones, but he'll gladly use someone else's who has decided to make that tradeoff. He's well-aware of the fact that the proprietary regime leads to more software in more niches. He chooses to stand as a beacon of light to those who want a better world.

He considers his efforts to have been wildly successful.


> He doesn't believe in owning phones, but he'll gladly use someone else's who has decided to make that tradeoff.

Makes sense. I don't eat meat, but I'll eat the meat of someone else who's decided to make that tradeoff.


I don't know why people always assume others' positions boil down to illogic.

People don't eat meat for many reasons. Those reasons may or may not include a moral component, and if they do, there are varying amounts of pragmatism you can add. I eat meat, but I'd happily switch over to vegetarianism if, say, I lived in India, where the vegetarian food is amazing and everyone around me was also eating vegetarian.


If God had meant us not to eat people, he wouldn't have made us of meat.


"Oh, there's a brain all right. It's just that the brain is made out of meat! That's what I've been trying to tell you."


> RMS is more of a realist than people think.

> He considers his efforts to have been wildly successful.


Be the change you wish to see in the world :)

You could look at it as a problem, or as an opportunity.

I think it's excellent that someone (in this case Apple) is setting a high standard for others (libre or not) to follow.


"Be the change you wish to see in the world :)"

See, that assumes you have infinite time and money at your disposal. But most of us have jobs and other commitments, which means we can't just drop everything to spend years of our lives researching and building something.


It assumes you actually want something to change and not just complain and be a Homer ("Can't someone else do it?").


> What would RMS do if he had a choice between using a mobile phone with proprietary software to enable sight and remaining blind?

The same he does now - not use any mobile phone.


There is TalkBack on Android, which sometimes gets a source release (with a long delay) at https://github.com/google/talkback. It, and Android accessibility in general, is still far behind Apple. Unfortunately, Linux desktop accessibility is also far less usable than on Windows and macOS. There is, however, an excellent open source screen reader for Windows called NVDA. I am using it right now. http://nvda-project.org/


People use all kinds of proprietary technologies to get help with disabilities, all the time.


If even Android doesn't come close... I think that answers the question.


Apple are investing resources far beyond the direct economic benefit into accessibility, out of moral concern (with a small PR benefit).

This is hard to replicate in the open source world where the model is "everyone contributes what they have an economic incentive to create".

On the other hand, regular command line tools are fairly accessible to blind users (I imagine, though I wouldn't like to read a man page in Braille, or by TTS), though of course the usual issues of inconsistencies between tools are magnified.


You might be interested in the results of the 2016 GOV.UK assistive technology survey. [0]

iOS VoiceOver ranked as one of the 3 most popular screen readers, second to JAWS for Windows [1] (closed source, paid) and above NVDA for Windows [2] (closed source, free, donations encouraged).

ChromeVox [3] is AFAICT the only free and open source screen reader that came up in the survey, at 1%.

- [0] https://accessibility.blog.gov.uk/2016/11/01/results-of-the-...

- [1] http://www.freedomscientific.com/Products/Blindness/JAWS

- [2] https://www.nvaccess.org

- [3] http://www.chromevox.com


NVDA is also free and open source. [1]

[1] https://github.com/nvaccess/nvda/blob/master/copying.txt


Alas, outside of OSes and infrastructure, there is little open source software that matches the quality of the closed source alternatives.


RMS would presumably do the same thing he did many decades ago, start a project to do away with the false choice.


You are assuming that Apple is making these accessibility features for capitalist reasons rather than to comply with disability and accessibility legislation.


No assumption is needed, just look at the record. Apple has been putting accessibility technology into its phones and desktop machines for over 20 years, long before it was ever mandated.

Apple has guided developers into using this technology in apps they write. As an example, Apple's VoiceOver screen reader technology uses widgets that are tagged with information about what kind of control they are. Developers can use this same tagging information when writing automated tests for their apps, which is a great way to enable developers to add assistive technology to their apps without making it difficult.
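
Concretely, the same description can serve both VoiceOver and automated UI tests (a sketch with XCUITest; the identifier strings here are invented):

  import XCTest

  // In the app target, the element is described once:
  //   sendButton.accessibilityLabel = "Send message"     // what VoiceOver speaks
  //   sendButton.accessibilityIdentifier = "send-button" // stable handle for tests

  // The UI test target then finds it by that same description.
  class MessagingUITests: XCTestCase {
      func testSendButton() {
          let app = XCUIApplication()
          app.launch()
          let send = app.buttons["send-button"]
          XCTAssertTrue(send.exists)
          send.tap()
      }
  }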


If that's the case, why isn't anyone else reaching this level of functionality? It is clear they are going above and beyond any level required by regulation.


Some people will never miss an opportunity to bash Apple. They can't help it, it's a reflex, like breathing.


The iPhone is full of wonderful affordances like this for low-vision folks.

My favorite option I've discovered is the magnifier. When it's on, you can triple-tap the home button to open the camera and turn the iPhone into a little hand lens, even if it's locked. The shutter button doesn't save the picture; it just freezes it (with some fancy optical stabilization) so you can pinch and zoom even further with your fingers. This is more convenient than swiping around on the lock screen to open the ordinary camera.

It's really helpful to read signs and restaurant menus. Settings -> General -> Accessibility -> Magnifier.


I love that part: "And here’s the kicker: She could do all of this with the screen turned off. Her phone’s battery lasted forever."


My cousin keeps breaking her phone's screen. She's terrible with actually caring for her things, and because it obviously becomes very expensive to repair or replace all the time, she uses accessibility features to deal with it instead. She has no disabilities whatsoever, except arguably an inability to physically coordinate her actions such that she might avoid dropping her phone all the time...


I had to do this on my 4S when the power button broke. I had to use the accessibility power button...


Universal Design, universal access. Doesn't matter _why_ user can't see the screen, or can only use one hand, or can't hear the phone, or....


Yep, Screen Curtain. And you can set up a gesture to easily turn it off and on (useful if you have some vision or want to show someone else, I guess).

I'm not blind, but I have played with the iPhone accessibility features and it really is surprisingly easy to navigate around the phone and apps with it.

The haptics in the iPhone 7 also let you feel buttons, which is very cool and could be put to good use in accessibility.


There's a bit in the video that displays the guy's emails, and then a list of phone numbers from his missed-call list. Hopefully he gave consent for that to be shown on camera?

Anyway, it's really cool how well the VO stuff seems to work on Apple devices, including OSX. A lot of companies tend to just do the bare minimum to meet regulatory code.


Does anyone know what the story of accessibility is like for React Native apps? Hope it's not as sad as it generally is on the web. I imagine it's about as good as native apps, but curious if anyone knows specifics.


From my experience (as a dev) it's pretty good. You have to explicitly turn off accessibility features like Dynamic Type (larger fonts based on a global user setting), and VoiceOver works just fine with the default <Text> component. Since it produces a native app, React Native seems to have better support for these things than a WebView-based framework like Cordova.

However, I am not a disabled person, just a curious dev, so I'd definitely defer to someone with more experience using accessibility tools. Just saying it looks pretty good on the surface to me.


Feel free to point me in the direction of a react native app - I would be happy to test it with Voiceover and let you know how well it works.


A blind friend of mine uses it all the time. He has a navigation app that tells him the distance to certain waypoints; it's fascinating.

Allows him to do a lot of stuff that wasn't possible before.


There was a cool talk at HOPE 2016 about this:

https://www.youtube.com/watch?v=qoevABNU5DI&list=PLcajvRZA8E...


It would be cool to have a device that instead of pixels, had bumps that raise up or down. Point the phone's camera at a scene, and you can 'see' the scene with your finger.


It would be very useful, but the display tech isn't there.

The state of the art is still two lines of text on fairly big braille devices (for ~$5k-$10k). There is some very interesting work being done at the University of Michigan [1][2] on a new mechanism based on microfluidics, but it remains to be seen if that mechanism will reach a form factor even approaching a phone.

1: https://www.engadget.com/2016/01/12/braille-tablet-display/

2: http://www.engin.umich.edu/college/about/news/stories/2015/d...


Single-line braille displays are a fair bit cheaper, though; for example, this one is $2,600: http://www.apple.com/shop/product/HJB42VC/A/humanware-braill...


I'll believe it when I see it, but apparently there is a tablet using this technology. There isn't much info on their web site. http://blitab.com/


A few years ago Disney were working on simulating bumpmaps on touch-screens [1] using dynamic variation of friction achieved using electrovibration [2], but I've not heard anything about it recently. I wonder what happened to the research.

[1] https://www.disneyresearch.com/project/3d-touch-surfaces/

[2] http://www.olivierbau.com/teslatouch.php


This is a very nice feature. I soon found a bug in it, though. It wouldn't read PDF files in iBooks. PDF support always seems to be a sorry stepchild in every operating system.


Depending on the PDF, there may not be any text there to read. As long as there is, I've found it to work quite well in iBooks for iOS 9.3.5, and will sometimes have it read a chapter at a time to me while I'm doing hand chores or otherwise unable to read from the screen directly.


There is text there, it just won't read it. Ironically, it'll correctly draw boxes around the lines if you touch it.


Sometimes PDF files are just images of text from a scan. No actual text. Sometimes I try copying and pasting from PDFs and get random garbage. They could add OCR, though.


The PDF is a scanned image with embedded OCR'd text. If you bring it up in Foxit or Acrobat, the text can be searched and selected, and in Foxit, ^6 will switch to a text view.


I read PDFs a lot on my iPhone with VoiceOver. It will of course not read the kind of PDFs that just contain an image of the page. For that you need an OCR app like the KNFB Reader mentioned in the video.


Works in PDF Expert. The free version, Documents by Readdle, might be useful for you.


Things like this make me simultaneously excited to be a dev and disappointed that I'm not working on something as life changing as this.


As a web developer, I wonder if this class of accessibility functionality renders the web accessibility guidelines [1] obsolete. Many of the guidelines have to do with organizing content and following rules that make it easier for older (inferior) screen readers to handle. But I just enabled this feature on my iPhone and browsed some sites that I know are ridiculously noncompliant, and it handled them beautifully. Even guidelines having to do with describing photos and avoiding text in images could be rendered obsolete by some of the third-party apps that perform OCR and use AI to describe photos.

[1] https://www.w3.org/WAI/intro/wcag


It definitely doesn't make the web accessibility guidelines obsolete. There's a lot more to web accessibility than just getting VoiceOver to read the text under your finger. It is essential to have descriptions for images that are links, and appropriate use of headings makes finding the relevant information far easier.


Does anyone know of an HN client that works well with Voiceover?


I tried it out a while ago and it's really easy to use. I can definitely see how someone using it day in and day out would get really speedy with it.


Idea: haptic feedback mimicking braille characters.


Sort of like... morse code?

Edit: That was sort of tongue-in-cheek, but I could see being able to run your finger around the screen and when you hit a 'dot' there's a haptic response.
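
That part would only take a touch handler plus a feedback generator. A rough Swift sketch (assuming iOS 10+ for the haptics; the dot layout is invented):

  import UIKit

  // Sketch: play a haptic tick whenever the finger enters a "dot" region.
  class DotView: UIView {
      let dots = [CGRect(x: 40, y: 100, width: 44, height: 44),
                  CGRect(x: 40, y: 160, width: 44, height: 44)]
      let tick = UIImpactFeedbackGenerator(style: .light)
      var lastHit: Int?

      override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
          guard let point = touches.first?.location(in: self) else { return }
          let hit = dots.firstIndex { $0.contains(point) }
          if let hit = hit, hit != lastHit {
              tick.impactOccurred()  // finger just crossed into a dot
          }
          lastHit = hit
      }
  }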


I started a project exactly along these lines!

https://github.com/Hendekagon/MorseCoder

-- it's very basic


Running braille text on our fingertips, maybe.


A lot of wearables have haptic feedback in them. Coupled with the phone, that could be quite powerful.


I feel like Google TalkBack can do basically the same thing. You hover over the screen and TalkBack tells you what's underneath your finger. It can even read notifications out loud. I would even go so far as to say that it can do at least 90% of what VoiceOver can.

Yes, it might not work magically in every app, but with a good choice of apps, a blind person should be able to do most of the normal activities.

If I'm wrong, please let me know.


So you've never used it, it probably doesn't work with every app, but it does a few of the same things, so it's probably the same?

That seems like it would be offensively dismissive to people who have to deal with this issue.

Apple puts a phenomenal amount of work into allowing people with all sorts of different handicaps and disabilities to be able to use their devices. I remember being in college before the iPhone was out. I had a blind classmate who had a special Nokia Symbian phone with thousands of dollars of software on it just so he could use it. All paid for by insurance, and it crashed all the time.

I don't think I've ever seen an article say a bad thing about Apple's assistive technologies. They are built-in, they don't cost extra, they basically work everywhere. Developers don't even have to put in that much work to make use of it. It's a core part of the system which means that when they decided to build watchOS it was already there and ready for people with disabilities. When they updated the Apple TV to use a version of iOS it got lots of the accessibility features as well.


Fair enough. I didn't want to sound offensive. I just wanted to hear someone else's opinion on how they differ and why one is better than the other. Completely neutral here.


This seems like a pretty good evaluation. [0]

Basically accessibility seems to be a fundamental part of Apple's UI philosophy, while on most other platforms support for it is pretty uneven. I'm sure the TalkBack team work hard at it for example, but at Apple accessibility isn't just the responsibility of one team, every team that works on UI is responsible for accessibility.

[0] http://pauljadam.com/iosvsandroida11y/


Yes it does work, it's just seriously lacking in features and responsiveness. I use both Talkback and Voiceover and can get things done far faster with Voiceover.


Even the name "Talkback" sounds awfully derivative of "Voiceover" -- it follows the same <verb><preposition> pattern but just replaces the verb and preposition with approximate synonyms.



