
There are lots of cases where people use e.g. ROS on robots and Python to do inference: the ROS binary image message data gets converted into a Python list of bytes (ugh), then into a numpy array (ugh), and then fed into TensorFlow to do inference. This pipeline is extremely sub-optimal, but it's what most people probably do.
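
For the curious, here's a minimal sketch of what that pipeline tends to look like in practice. The topic name, model file, and rgb8 encoding are just assumptions for illustration; the point is the per-frame copies out of the ROS buffer into numpy and then into TensorFlow:

    # Sketch of the ROS -> bytes -> numpy -> TensorFlow path described above.
    # Assumes rospy (Python 3), a loaded Keras model, and a packed rgb8 image topic.
    import numpy as np
    import rospy
    import tensorflow as tf
    from sensor_msgs.msg import Image

    model = tf.keras.models.load_model("model.h5")  # hypothetical model path

    def callback(msg):
        # msg.data arrives as raw bytes; copying it into a numpy array and
        # reshaping per the message metadata is the first "ugh" step.
        img = np.frombuffer(msg.data, dtype=np.uint8).reshape(
            msg.height, msg.width, -1)
        # Another copy (plus dtype conversion) to build the batch TF expects.
        batch = np.expand_dims(img.astype(np.float32) / 255.0, axis=0)
        preds = model(batch)
        rospy.loginfo("inference output shape: %s", preds.shape)

    rospy.init_node("inference_node")
    rospy.Subscriber("/camera/image_raw", Image, callback)  # hypothetical topic
    rospy.spin()

Every frame goes through at least two full copies before the model ever sees it, which is exactly the overhead a proper deployment library would avoid.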

All because nobody has really provided off-the-shelf, usable deployment libraries. That Bazel stuff you need if you want to use the C++ API? Big nope. Way too cumbersome. You're trying to move from Python to C++ and they want you to install ... Java? WTF?

Also, some of the best neural net research out there has you run "./run_inference.sh" or open some abomination of a Jupyter notebook instead of an installable, deployable library. In their defense, good neural net engineers aren't expected to be good software engineers, but I'm just pointing out that there's a big gap between good neural nets and deployable neural nets.



