theogravity's comments | Hacker News


Just deleted my data. Who knows who will own it after this?


Their own privacy team told me they are bound by regulatory obligations to retain data even after you request deletion. I've notified our attorney general's office to see if anything can be done, but it might be too late. I'd love for someone who knows these "regulatory obligations" to chime in.


Interesting, as that would be a violation of the CCPA/CPRA for customers who are California residents, at least.

The "regulatory obligations" sounds like a fake excuse. Is 23andme even regulated beyond how your regular average business might be?


Federal regulations trump state codes.


Yes, they have been regulated by the FDA for a decade now.


>their own privacy team told me they are bound by regulatory obligations to retain data even after you request deletion

They have to retain data about the person who requested the deletion, which seems eminently reasonable. If in the future you sue them because you can't access the account you paid for, they have a record that you requested that account's deletion.

Similarly they obviously can't withdraw your data from the anonymized research projects they pursued.


Well, purely going on who it is of value to, probably life insurance providers or pharmaceutical companies.


Still waiting for my "deletion confirmation e-mail". Hopefully it arrives.


I've had situations where Cursor starts doing some really bizarre things after long-running cycles of unsuccessful tasks, like the death loop I've seen described in other threads.

Best way to deal with this is to just clear the embedding index from the cursor settings and rebuild it.

I've never had it get to the point where it wants to rm -rf my home directory, but now I'm a bit fearful that one day it will, since I currently have it on auto-run.


Is there a guide for how to use uv if you're a JS dev coming from pnpm?

I just want to create a monorepo with python that's purely for making libraries (no server / apps).

And is it normal to have a venv for each library package you're building in a uv monorepo?


If the libraries are meant to be used together, you can get away with one venv. If they should be decoupled, then one venv per lib is better.

There is not much to know:

- uv python install <version> if you want a particular version of python to be installed

- uv init --vcs none [--python <version>] in each directory to initialize the python project

- uv add [--dev] <package> to add libraries to your venv

- uv run <cmd> when you want to run a command in the venv

That's it, really. Anything beyond that can be learned later.
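For example, here's a rough sketch of bootstrapping one library package in a monorepo (the packages/mylib layout, the --lib flag, and pytest are my own illustrative choices, not requirements):

  uv python install 3.12                  # optional: pin a specific interpreter
  mkdir -p packages/mylib && cd packages/mylib
  uv init --vcs none --lib --python 3.12  # pyproject.toml + src/ layout
  uv add --dev pytest                     # dev-only dependency for this package
  uv run pytest                           # runs inside the package's own .venv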


There's also workspaces (https://docs.astral.sh/uv/concepts/projects/workspaces/) if you have common deps and it's possible to act on a specific member of the workspace as well.
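As a rough sketch (package names made up; the linked docs have the real details), the root pyproject.toml declares the members:

  [tool.uv.workspace]
  members = ["packages/*"]

and a member that depends on a sibling points at it like so:

  [project]
  name = "mylib-api"
  dependencies = ["mylib-core"]

  [tool.uv.sources]
  mylib-core = { workspace = true }

Then uv run --package <member> <cmd>, or just cd-ing into the member, acts on that one member.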


That's one of the bonuses I was thinking about. It's nice if you have a subset of deps you want to share, or if one dep is actually part of the monorepo, but it does require knowing more.


Thanks. Why is the notion of run and tool separate? Coming from JS, we have the package.json#scripts field and everything executes via a `pnpm run <script name>` command.


Tool?

Maybe you mean uv tool install?

In that case, it's something you don't need right now; uv tool is useful, but it's a bonus. It's for installing third-party utilities outside of the project.

There is no equivalent to scripts yet, although they are adding it as we speak.

uv run executes any command in the context of the venv (which is like node_modules); you don't need to declare commands before calling them.

e.g.: uv run python will start the Python shell.


I was looking at https://docs.astral.sh/uv/concepts/tools/#the-uv-tool-interf...

Edit: I get it now. It's like npm's `npx` command.


uvx is the npx equivalent; it's provided with uv, and it also has some nice bonuses.
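For instance (ruff here is just an arbitrary tool for illustration):

  npx prettier --check .   # JS world
  uvx ruff check .         # uv world: fetches ruff into a cache and runs it, no install step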


uv sync if you clone a GitHub repo


uv run in the freshly cloned repo will create the venv and install all deps automatically.

You can even use --extra and --group with uv run, like with uv sync. But in a monorepo, those are rarely needed.
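Concretely, something like this (the repo and the "lint" group are hypothetical):

  git clone https://github.com/acme/some-python-repo
  cd some-python-repo
  uv run pytest                      # first run creates .venv and installs the locked deps
  uv run --group lint ruff check .   # same, but also pulling in a "lint" dependency group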


Thanks for the info.

I looked at the group documentation, but it's not clear to me why I would want to use it, or where I would use it:

https://docs.astral.sh/uv/concepts/projects/layout/#default-...

(I'm a JS dev who has to write a set of python packages in a monorepo.)


sync is something you would rarely use; it's most useful for scripting.

uv run is the bread and butter of uv: it will run any command you need in the project, and it ensures the command will work by syncing all deps and making sure it can import stuff and call Python.

In fact, to run a Python script, you should do uv run python the_script.py.

It's so common that uv run the_script.py works as a shortcut.

I will write a series of articles on uv on bitecode.dev.

I will write them so that they work for non-Python devs as well.


Did you mean group and not sync?

Really looking forward to the articles!


Sorry, I misread and stayed on sync. Groups and extras are for lib makers to create sets of optional dependencies. Groups are private ones for maintainers; extras are public ones for users.
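A tiny sketch of how that looks in pyproject.toml (the names are made up):

  [project.optional-dependencies]    # extras: public, users opt in with mylib[cli]
  cli = ["click>=8"]

  [dependency-groups]                # groups: private to maintainers (PEP 735)
  dev = ["pytest>=8", "ruff"]

Users pull in an extra with pip install mylib[cli] or uv sync --extra cli; you pull in a group with uv sync --group dev (and uv syncs the dev group by default).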


Agreed, if you don't know what Datadog is then you're probably not the target audience for this product.


Do you think if I don't know what datadog is, I am not the target audience for datadog?


Kinda? There aren't that many players in this niche and datadog is the "dog".


probably


Hi, I'm the author of LogLayer (https://loglayer.dev) for TypeScript, which integrates with Datadog and its competitors. Sift looks easy to integrate with since you have a TS library and the API is straightforward.

Would you like me to create a transport for it (I'm not implying I'd be charging to do this; it'd be free)?

The benefit of LogLayer is that users would just make their log calls through the loglayer library, and it ships them to whatever transports they have defined. That's better than having them manage two separate loggers (e.g. Sift and Pino) or write their own wrapper.


Hey, loglayer looks super cool! Would love to chat and set something up, send us an email at founders@runsift.com


Sent an e-mail!


Killdozer? Although "No one else was injured or killed, in part due to timely evacuation orders"

https://en.wikipedia.org/wiki/Marvin_Heemeyer



Same, I have occupational lenses that are also focused to arm's length, and they have made a huge difference for me as well when reading things on my computer screens. They make reading small text easier, and everything feels crisp.

Using it outside of its intended distance will cause eye strain since your eyes won't be able to focus properly.

My provider calls them "computer glasses". Mine do not have blue light filtering, since I work on implementing web designs and color accuracy matters to me.

I totally recommend computer glasses for anyone who works all day looking at a computer screen.

They would be a separate prescription / lens type (as in not progressive, I think) compared to daily-use glasses. I do have to swap to my daily-use pair whenever I'm not sitting in front of a monitor.

Using my daily-use glasses for reading a computer monitor doesn't feel "right" compared to my computer glasses. There is a clear difference between them.


>Using it outside of its intended distance will cause eye strain since your eyes won't be able to focus properly.

Mine are more useful than I anticipated when I'm not using them for work. I would advise against anybody driving with the wrong pair of glasses, but I can see significantly better with my occupational lenses than without. I would not trust them at night, but during the day I can see well enough that I am not concerned about my driving. I don't intend to drive with them, but there has been the occasion here or there when I had to run somewhere quickly and forgot to swap my glasses.

It also helps that mine are progressives, so the very very top part of the lens is my "regular" prescription. I can use that to focus on something at a distance if necessary.

>They would be a separate prescription / lens type (as in not progressive, I think) compared to daily-use glasses. I do have to swap to my daily-use pair whenever I'm not sitting in front of a monitor.

Like I mentioned above, mine are both occupational and progressive. I'd like to try non-progressive occupational lenses to see if I like them better, but I'm not convinced it would be worth the money.


Same. I've driven short distances sometimes to pick up lunch or something 5-10 minutes away because I forgot to switch my glasses. It wasn't ideal but perfectly doable.

I've only done it a handful of times, though. And also I wouldn't do so at night.


> Using it outside of its intended distance will cause eye strain since your eyes won't be able to focus properly.

I don't find that at all, personally. I wear my computer glasses almost all the time in the house and just let myself not try to focus on things. If anything they seem to be better for eye strain than my normal distance lenses, for me, because my eyes do try to focus with my normal lenses since everything is supposed to be perfectly clear, whereas I know there's a good reason things aren't in focus when I'm not wearing them.

My distance glasses have progressive lenses, which may be part of that, as the strength differs depending on where in the glasses you're looking. I've been tempted to drop progressive lenses from my next pair, as I tend to take them off to read anyway; then I'd get a flat prescription like I have in my computer lenses.


Me too. My progressive lenses give me eye strain, and it is much worse at the computer. I have non-progressive lenses for work and they're much more comfortable. (Especially with my large monitor.)


Would love to speak with you for 20 mins to learn from your experience. If interested, ping me at jbornhorst [at] gmail [dot] com and I'll coordinate times.


The CSRF token is usually stored in a cookie. I guess one could try stealing the cookie assuming the CSRF token hasn't been consumed.

But if one's cookie happens to be stolen, it can be assumed the attacker already has access to your session in general anyway, making CSRF moot.


This might be useful for just checking the general content of a chapter you're interested in if it hasn't been translated yet, but it's not clear if it handles things like the varying fonts / sizes used to convey the emotion of spoken dialog, translates consistently (e.g. does it remember stylistic choices it made in earlier chapters), or handles tricky items that might be difficult to localize.

Also, how does it work? What is the tech behind it? Are you doing any of the training yourself?


Another thing I'm not sure machine translation can really "nail" is cultural context, or even little linguistic cues and other tidbits. I like when translators explain in the margins that one character is speaking in a certain register for XYZ reason, or that there's been a shift in a certain relationship signaled by a change in how they address each other, etc.

That said, I did just read a great series last night whose human translation ended right before the final two volumes, and hasn't been updated in nearly 8 years... so I may need to try some machine translation on those last two volumes just to see how things end.

