Division takes a data-dependent number of cycles on all x86-64 processors.
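If you want to see this for yourself, here's a minimal, untested sketch (assuming GCC or Clang on x86-64, compiled with -O2) that times a chain of dependent 64-bit divisions with small vs. large operands using the TSC. The function name cycles_per_div and the operand choices are just illustrative, and rdtsc-based counts are approximate:

    #include <stdint.h>
    #include <stdio.h>
    #include <x86intrin.h>   /* __rdtsc (GCC/Clang on x86-64) */

    __attribute__((noinline))
    static double cycles_per_div(uint64_t dividend, uint64_t divisor, int iters)
    {
        uint64_t x = dividend;
        uint64_t start = __rdtsc();
        for (int i = 0; i < iters; i++) {
            uint64_t q = x / divisor;   /* the division under test */
            x = dividend ^ (q & 1);     /* dependence chain; operand magnitude preserved */
        }
        uint64_t stop = __rdtsc();
        volatile uint64_t sink = x;     /* keep the result alive */
        (void)sink;
        return (double)(stop - start) / iters;
    }

    int main(void)
    {
        const int n = 10 * 1000 * 1000;
        printf("small operands: ~%.1f cycles/div\n", cycles_per_div(100, 7, n));
        printf("large operands: ~%.1f cycles/div\n", cycles_per_div(~0ull, 3, n));
    }

On most x86-64 parts you'd expect the small-operand case to report noticeably fewer cycles per division, which is exactly the data dependence in question.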



But the divide instruction is still O(1).


To decode? Why would the user care how long it takes to decode? They care how long it takes to run. We can provide upper bounds for exp, sin, and erfc too.



