Flagler Health | Remote | Software Engineer, DevOps, SRE, Data Engineer
We’re an AI/ML start-up revolutionizing the healthcare tech space by helping physicians provide better care for their patients in MSK pain management. We dive deep into clinic data to provide meaningful insights for our clients (clinics). We’re currently scaling out our core product and need someone to handle the infrastructure (web and data) of our live production system.
Half of our engineers are from HN! We are hiring for multiple positions:
data: Fullstack Engineer (Frontend focused), (Forward deployed) Backend Engineer.
See job post for continuously updated job descriptions:
https://jobs.ashbyhq.com/flaglerhealth
--
keywords: node, typescript, javascript, ts, js, vue, twilio, webrtc, FHIR, HL7
--
Drop a resume in engineering posting for general consideration.
--
hot tip: put "HN" on "How did you find us"
While the post says that the openings are remote, on Ashby all of them seem to be marked as On-site.
It would be great if you could clarify this.
it's to cut down auto-apply applicants that don't read the job description and just click apply. We are at 10,000+ applicants like this, many of them applying to every single job posted with the same resume.
It sounds like a cop-out answer, but we also did not specify the # of years of experience required. We consider all experience levels, so a range would be meaningless (like those companies that post 100k - 900k).
This is exactly what I ask in my interviews for backend engineers. I've had good success filtering for backend engineers who view the database as part of the business logic rather than purely as storage. The "answer" to the question depends on many variables with no single right solution, so it's also an opportunity to assess the candidate's communication skills and how they tackle uncertainty.
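To make the trade-off this question probes concrete, here is a minimal sketch (my own illustration, not from the commenter): the same per-customer total computed by pulling rows into application code versus pushing the aggregation into the database.

```python
# Illustrative sketch of "database as storage" vs. "database as part of the
# business logic". Table and column names are made up for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, amount_cents INTEGER);
    INSERT INTO orders VALUES (1, 500), (1, 1500), (2, 700);
""")

# "Storage" mindset: fetch every row and aggregate in application code.
totals_in_app = {}
for customer_id, amount in conn.execute("SELECT customer_id, amount_cents FROM orders"):
    totals_in_app[customer_id] = totals_in_app.get(customer_id, 0) + amount

# "Business logic" mindset: let the engine aggregate, moving far less data
# and leaning on indexes and query planning the application can't see.
totals_in_db = dict(conn.execute(
    "SELECT customer_id, SUM(amount_cents) FROM orders GROUP BY customer_id"
))

assert totals_in_app == totals_in_db  # same answer, very different scaling behavior
```

There is no single right split; discussing when each approach breaks down is exactly the conversation the interview question is after.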
We're a seed stage AI/ML start-up revolutionizing the healthcare tech space by helping physicians provide better care for their patients in MSK pain management. We dive deep into clinic data to provide meaningful insights for our clients (clinics). We're currently scaling out our core product and need someone to handle the infrastructure and backend of our live production system.
Half of our engineers are from HN!
Technology:
Frontend & Backend stack: MongoDB, Node.js, Express.js, and Vue.js.
ML & Data Engineering stack: PySpark on Databricks (a minimal sketch follows below)
Infrastructure: AWS + YOU DECIDE!
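For a concrete picture of the data side of this stack, here is a minimal PySpark sketch of a clinic-level aggregation; the table and column names are my own invention, since the posting only names the tooling (PySpark on Databricks).

```python
# Minimal PySpark sketch of a clinic-data aggregation on a stack like the one
# above. The table name ("visits") and columns ("clinic_id", "pain_score") are
# hypothetical -- the posting only says PySpark on Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clinic-insights").getOrCreate()

# On Databricks this would typically be a Delta / Unity Catalog table.
visits = spark.read.table("visits")

insights = (
    visits
    .groupBy("clinic_id")
    .agg(
        F.count("*").alias("visit_count"),
        F.avg("pain_score").alias("avg_pain_score"),
    )
)

insights.write.mode("overwrite").saveAsTable("clinic_insights")
```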
We're an early seed stage AI/ML start-up revolutionizing the healthcare tech space by helping physicians provide better care for their patients in MSK pain management. We dive deep into clinic data to provide meaningful insights for our end clients. We're currently building out our core product and need someone to engineer the face of our product.
Technology: MVP Vue.js on the frontend; we are open to a rewrite in React. On the backend: Express, Mongoose/MongoDB. ML & Data Engineering stack: PySpark on Databricks.
I think the most interesting part of OP's story is the question of how the self-hosted solution notified the Mattermost server about a potential Russia/Belarus connection. Even if the compliance automation was faulty, it's still interesting how Mattermost found out about a Russian connection at all. (I am assuming this compliance email wasn't sent out to everyone / a larger group of people by mistake, and that the OP happened to have a Russian user.)
Mattermost CEO here,
Thanks for the question. Like many companies, we use a 3rd party service to check whether a company we're doing business with has been flagged for export compliance.
HN has a lot of people building SaaS and open core companies, so hopefully this thread is a good way to learn about export compliance, which is something we've been doing for many years, though it's gotten extra important in 2022 due to so many new sanctions showing up.
Think of it this way (a simplified, high-level view that doesn't capture all the detail, but is intended to share the aesthetic):
1. When you're an early stage company based in the U.S. starting to sell open core licenses or SaaS you typically hire a lawyer to do the legal agreements and help negotiate contracts.
2. If it's a good lawyer, they might talk about "export compliance" and how your company might need to think about doing an assessment on how your product is classified in the context of U.S. export compliance restrictions.
3. If they're a really good lawyer, they may even recommend an export compliance consultant for you to use.
4. After you get your export compliance classification, you're going to need a way to implement the right checks to ensure you're not violating U.S. export compliance laws based on your classification and your customers.
5. You quickly realize you need to buy a tool to do this--not only to check at the time of transaction, but also to alert you if the status of a customer changes (for example, if a customer is added to a list of organizations flagged by public sector organizations).
6. You look at different options, and end up purchasing one and integrating it with your other systems, including Salesforce (sales automation) and Marketo (email automation). In this case, we purchased a subscription to Descartes.
Hopefully that helps share context. Please feel free to ask other questions here.
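To picture what steps 4-6 above look like in code, here is a toy sketch; the flagged-party list, names, and functions are entirely hypothetical (this is not Descartes' actual API), and it only illustrates the shape of "check at transaction time, re-check on a schedule, alert on changes".

```python
# Toy sketch of "screen at transaction time, re-check on a schedule" from the
# steps above. Everything here (the list source, names, functions) is made up
# for illustration; it is not Descartes' actual API.
from dataclasses import dataclass

# Imagine this set is refreshed regularly from a denied/sanctioned-party feed.
FLAGGED_PARTIES = {"example sanctioned co"}

@dataclass
class Customer:
    name: str
    country: str

def is_clear(customer: Customer) -> bool:
    """Step 4: check a customer at the time of the transaction."""
    return customer.name.lower() not in FLAGGED_PARTIES

def recheck(customers: list[Customer]) -> list[Customer]:
    """Step 5: run periodically -- a customer who was fine yesterday may be flagged today."""
    return [c for c in customers if not is_clear(c)]

if __name__ == "__main__":
    c = Customer(name="Example Sanctioned Co", country="US")
    # A failed check would be routed to compliance review and, per step 6,
    # surfaced in the connected systems (e.g. Salesforce) via the purchased tool.
    print("clear to transact:", is_clear(c))  # False
```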
Ian, you might want to clarify that the only thing submitted to the 3rd party service is the company name of the customer, and that no customer logs were submitted.
Some other commenters in this thread think that you log their IP and submit it.
It's a pretty reasonable conclusion when the vendor claims to know where you're using the software, and the evidence is that the vendor claims to know where you're using the software.
My impression from the OP is that the company does not claim to operate out of Russia or Belarus. Presumably, neither would the website. Clearly there's some other method by which that third party makes that determination, and clearly that method produces false positives.
We make an enterprise Data Quality detection & monitoring tool. Our startup was recently acquired (https://bit.ly/39yGIwr) and now forms the brand new Collibra Data Quality team, incorporating the Data Quality product into Collibra's data governance & management suite. Be part of the chaotic but fun integration & expansion process in a bigger company (Collibra is a series F startup with plans to IPO soon), while still having the feel of a smaller startup. As part of the Data Quality product team, you will join the engineering team driving a key aspect of Collibra's growth strategy as the go-to enterprise solution for data management.
We are fully remote, even post-COVID. On-site is available in NYC/Atlanta (if preferred). UK/EU employment is possible, but US is preferred (we are trying to grow the US engineering team). The team itself is fully remote, so you won't be the sole remote team member.
We don't do leet code algorithm coding tests just because FAANG does. We don't care if you know how to make coin change using dynamic programming. You should be good at and interested in writing software that will actually be used and provide value to users. We have live production clients globally. Our software is used to process and analyze millions of real customer datasets using Spark on a daily basis.
We use a mixture of Java Spring for the API and DB (Postgres & CockroachDB) and Scala for Spark. The frontend is a bit messier: we are using jQuery, which is being migrated to Angular (TypeScript) and/or React.
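To give candidates a feel for what a data-quality rule is, here is an illustrative sketch. Note the post says the team's Spark code is Scala, so this PySpark version, with its hypothetical dataset and column names, only shows the flavor of the checks, not their implementation.

```python
# Illustrative only: the team's Spark code is Scala, and the dataset path and
# column names here are hypothetical. The point is just the shape of a data
# quality rule -- compute a metric over a dataset, compare it to a threshold.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("/data/customer_orders")  # hypothetical dataset

row_count = df.count()
null_emails = df.filter(F.col("email").isNull()).count()
duplicate_ids = row_count - df.select("order_id").distinct().count()

# Each "rule" is a metric plus a threshold; breaches are what get monitored/alerted.
metrics = {
    "null_email_rate": null_emails / max(row_count, 1),
    "duplicate_order_ids": duplicate_ids,
}
breaches = {name: value for name, value in metrics.items() if value > 0}
print(breaches)
```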