
With that kind of commute (assuming it's 1.5 hrs each way), OP is probably in the Vacaville/Fairfield/Davis/Sacramento area, where there are not actually a ton of great tech jobs. Some to be sure, but it's pretty different from SF.


OP says he's coming from SF. Must work in SJ or around 237.

> I personally used to commute 3 hours every day to get to work from SF


Menlo park


I live in MP, and it doesn't take nearly that long to get here from SF. Maybe Mountain View, on a bad traffic day. But nothing north of Palo Alto could take that long, IMO.


This is so cynical. Yes, plenty of people have jobs that look like this, but if you look hard enough, there are tons of corners of the internet where people (communities!) are doing weird and fun things with code.


Education isn’t a zero sum game.


University admission, though, is a zero-sum game for those on the margin, since the positions are limited.


Outer Worlds and Fallout come from two different developers.


Perhaps staff crossed over between the two companies?


This is the main reason I'm installing - I can build this into my editor pretty easily.


Previous discussion of the baseline results: https://news.ycombinator.com/item?id=11617906


I'm quite confused by Tensil: is the training of the model moved to their chips, or just the final model? If the former, are the chips locked in to whatever model they were built for, e.g. this chip only trains a multi-layer perceptron with n layers? Or are the chips re-programmable, or FPGAs? If the latter, don't most models run quite quickly on CPU after training anyway?


It seems like they're selling ASICs. It would be useful in something like an embedded system where you train a model beforehand, deploy it, and then let it go. On top of speed improvements, they're generally more efficient, so something running on battery power could last longer.


Tensil founder here: we take fully trained models; training doesn't happen on our chips. The model architecture is hardened, but the parameters can be reprogrammed.


I'm assuming it's the latter. Some applications require pretty high-speed inference, like video.


Another customer segment you could go after: I generated the star map from the day/time/location of my wedding, and themed the colors to match the wedding colors. My wife thought it was a great idea to print + frame it.
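The commenter doesn't say which tool they used, but the core calculation behind such a map is standard astronomy. As a rough stdlib-only sketch (using the common low-precision sidereal-time approximation, not any particular product's code): convert a star's catalog coordinates (right ascension/declination) plus a date, time, and observer location into altitude/azimuth, then plot every star above the horizon.

```python
import math

def julian_date(year, month, day, hour, minute, second):
    # Standard Gregorian-calendar Julian date conversion (valid for modern dates)
    if month <= 2:
        year -= 1
        month += 12
    a = year // 100
    b = 2 - a + a // 4
    jd = int(365.25 * (year + 4716)) + int(30.6001 * (month + 1)) + day + b - 1524.5
    return jd + (hour + minute / 60 + second / 3600) / 24

def star_altaz(ra_deg, dec_deg, lat_deg, lon_deg, jd):
    """Altitude and azimuth (degrees) of a star for a given UTC time and place."""
    # Greenwich mean sidereal time in degrees (low-precision formula)
    gmst = (280.46061837 + 360.98564736629 * (jd - 2451545.0)) % 360
    # Local hour angle: local sidereal time minus the star's right ascension
    ha = math.radians((gmst + lon_deg - ra_deg) % 360)
    lat = math.radians(lat_deg)
    dec = math.radians(dec_deg)
    sin_alt = (math.sin(dec) * math.sin(lat)
               + math.cos(dec) * math.cos(lat) * math.cos(ha))
    alt = math.degrees(math.asin(sin_alt))
    # Azimuth measured from north, increasing eastward
    az = math.degrees(math.atan2(
        -math.sin(ha) * math.cos(dec),
        (math.sin(dec) - sin_alt * math.sin(lat)) / math.cos(lat)))
    return alt, az % 360
```

As a sanity check, Polaris (RA ≈ 37.95°, Dec ≈ +89.26°) should come out at an altitude close to the observer's latitude at any time of night.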


This post talks a lot about creating a community. Are they doing that anywhere in particular?


I am one of the LFortran authors. We just open-sourced it recently. Do you have any recommendations on whether we should create a mailing list, or rather use something like Discourse? Or both?

Right now if you have any questions, you can open an issue at our GitLab repository:

https://gitlab.com/lfortran/lfortran


Or relatedly, Air France Flight 447, where the best thing to do would have been no immediate pilot intervention[0].

[0] https://en.wikipedia.org/wiki/Air_France_Flight_447


No. The autopilot disengaged, the plane was in the storm, the sensors failed (iced).

The best option would have been not to enter that storm at all.


It disengaged and moved them to alternate law, and one of them (the co-pilot, IIRC) didn't realize it; the pilot and co-pilot gave different stick inputs. The purpose of a single button would be to put the plane in a known, easy-to-reason-about state, even if it's not the most stable or the easiest one to fly.


> The purpose of a single button would be to put the plane in a known, easy-to-reason-about state, even if it's not the most stable or the easiest one to fly.

The "known" is the problem there. The pilots were continuously misinformed about the plane's speed by the iced-over airspeed sensors (pitot tubes). That is what the plane "knew" and what the pilots "knew" in the storm.

The autopilot handed control over to the humans, but the pilots then pulled the nose too high ("law" here means "mode of operation"):

"The pilot continued making nose-up inputs. The trimmable horizontal stabilizer (THS) moved from three to 13 degrees nose-up in about one minute, and remained in that latter position until the end of the flight."

"A second consequence of the reconfiguration into alternate law was that stall protection no longer operated. Whereas in normal law, the aircraft's flight management computers would have acted to prevent such a high angle of attack, in alternate law this did not happen. (Indeed, the switch into alternate law occurred precisely because the computers, denied reliable speed data, were no longer able to provide such protection—nor many of the other functions expected of normal law).[55] The wings lost lift and the aircraft stalled"

> one of them (the co-pilot, IIRC) didn't realize it

But the co-pilot definitely knew that the autopilot disengaged:

"The first officer, co-pilot in right seat, 32-year-old Pierre-Cédric Bonin"

"At 02:10:05 UTC the autopilot disengaged" ... "As pilot flying, Bonin took control of the aircraft via the side stick priority button and said, "I have the controls.""

