imtringued | 5 months ago | on: Towards 1-bit Machine Learning Models
Your hardware already supports addition and subtraction, and the tensor cores of NVIDIA GPUs are already fast enough to keep up. The only benefit is reducing memory capacity and bandwidth requirements.
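
To make that concrete, here is a rough Python sketch (mine, not from the post; the sizes and variable names are made up) of why weights restricted to {-1, +1} turn a matrix-vector product into pure additions and subtractions, and where the memory/bandwidth saving comes from:

    # Sketch: a 1-bit-weight matvec needs no multiplies, only signed sums.
    import numpy as np

    rng = np.random.default_rng(0)
    d_in, d_out = 512, 512                     # hypothetical layer sizes
    W = rng.choice([-1, 1], size=(d_out, d_in)).astype(np.int8)
    x = rng.standard_normal(d_in).astype(np.float32)

    # With weights in {-1, +1}, each dot product is just: add the
    # activations where w == +1, subtract them where w == -1.
    y_addsub = np.where(W == 1, x, -x).sum(axis=1)

    # Reference: an ordinary float matmul gives the same result.
    y_ref = W.astype(np.float32) @ x
    assert np.allclose(y_addsub, y_ref, atol=1e-3)

    # The real win: 1 bit per weight (when packed) vs. 16 bits for fp16
    # means 16x less weight storage, and 16x less bandwidth to fetch it.
    print(f"fp16: {W.size * 2} bytes, packed 1-bit: {W.size // 8} bytes")

Note the sketch still does the sign selection at full speed in floating point; the point is that no multiplier circuit is needed for the weights, which is exactly why existing add/subtract hardware already suffices, and the 16x smaller weight tensor is where the memory and bandwidth benefit shows up.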