
Considering that autistic people and empaths tend to have remarkably rich gestural languages for communicating a wide array of non-verbal units of meaning, or even whole propositions, and further that allocentric language may present scoped indexicals (not to mention American Sign Language), we may have an opportunity to develop interestingly rich gestural/touch APIs. The raw material would be the mechanics of neurologically rooted gestures that describe, or underpin the norms of, highly complex, spontaneously emergent non-verbal communication.

For instance, would finger-flipping or self-stimulation be considered "noise" by such a system, or would the system be configurable, adaptive, or "fuzzy" enough to successfully interpret behaviors that deviate from its model of typical human movement? (I'm wondering about the intersection between these kinds of interfaces and the training, or, say, the auto-designing, of them via neural networks.)
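
Purely as a toy illustration of what "fuzzy enough" might mean here, a sketch along these lines could map gesture features to meanings only when confidence clears a configurable threshold, and otherwise leave the input unmapped rather than forcing it into a category. Everything below (GestureModel, the prototypes, the threshold) is hypothetical, not an existing API:

  # A minimal sketch, assuming gestures arrive as feature vectors and that
  # known gestures can be represented by prototype vectors. All names here
  # are hypothetical.
  import math
  from dataclasses import dataclass

  @dataclass
  class GestureModel:
      # Prototype feature vectors for known gestures (toy values).
      prototypes: dict[str, list[float]]
      # Below this similarity, the input is left uninterpreted ("unmapped").
      confidence_threshold: float = 0.8

      def interpret(self, features: list[float]) -> str | None:
          def similarity(a: list[float], b: list[float]) -> float:
              dot = sum(x * y for x, y in zip(a, b))
              norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
              return dot / norm if norm else 0.0

          best_label, best_score = None, 0.0
          for label, proto in self.prototypes.items():
              score = similarity(features, proto)
              if score > best_score:
                  best_label, best_score = label, score
          # "Fuzzy" behavior: movement that doesn't resemble any prototype
          # (e.g. stimming) is passed through as None, not forced into the
          # nearest category.
          return best_label if best_score >= self.confidence_threshold else None

  model = GestureModel(prototypes={"point": [1.0, 0.0], "wave": [0.0, 1.0]})
  print(model.interpret([0.9, 0.1]))   # -> "point"
  print(model.interpret([0.5, 0.5]))   # -> None (ambiguous, left unmapped)

A per-user threshold, or per-user prototypes learned from that person's own movement (whether by a neural network or anything else), is one way such a system could be made configurable or adaptive rather than treating idiosyncratic motion as noise.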



