Hacker News | SimplyUnknown's comments

I have the feeling that B-splines would be a good solution for this problem. Since they have a continuous zeroth (i.e., the function itself is continuous), first, and second derivative, the motion will always be smooth and there will be no kinks. However, this may just move the problem: now you must tune the coefficients of the B-spline instead of the damping parameters (a direct mapping between the two must exist, but it may not be trivial).
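To make the idea concrete, here is a minimal sketch using SciPy's `make_interp_spline` (the keyframe times and positions are made up for illustration): a cubic B-spline through a few waypoints is C2-continuous, so position, velocity, and acceleration all vary smoothly.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# Hypothetical 1-D motion profile: positions at a few keyframe times.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
x = np.array([0.0, 0.5, 2.0, 2.5, 4.0])

# A cubic (k=3) B-spline is C2-continuous: position, velocity, and
# acceleration are all continuous, so the motion has no kinks.
spline = make_interp_spline(t, x, k=3)
velocity = spline.derivative(1)
acceleration = spline.derivative(2)

# Evaluate anywhere on [0, 4]; all three curves are smooth.
ts = np.linspace(0.0, 4.0, 9)
print(spline(ts))
print(acceleration(ts))
```

The "tuning" then happens through the choice of waypoints (or, for a non-interpolating spline, the control points) rather than damping parameters.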


Multiple reasons: while it is technically better and has more benign compression artifacts, it is also computationally more expensive, offers only limited quality improvements, is encumbered by patents, and has poor metadata and colorspace support... In the end, the benefits aren't great enough compared to JPEG to change the default format.


I really like einops. It works with NumPy, PyTorch, and Keras/TensorFlow and has easy named transpose, repeat, and einsum operations.


Same - I’ve been using einops and jaxtyping together pretty extensively recently, and it helps a lot for reading and writing multidimensional array code. Also array_api_compat: the API coverage isn’t perfect, but it’s pretty satisfying to write code that works for both PyTorch tensors and NumPy arrays.

https://docs.kidger.site/jaxtyping/

https://data-apis.org/array-api-compat/


Full paper link for the interested: https://ehdijrb3629whdb.tiiny.site


"404 Sorry, this content doesn't exist."


In medical imaging, data are often acquired at anisotropic resolution. So a pixel (or voxel in 3D) can be an averaged signal sample originating from 2 mm of tissue in one direction and 0.9 mm in another direction.
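A common preprocessing step is to resample such a volume to isotropic spacing. Here is a hedged sketch with `scipy.ndimage.zoom` (the spacings and volume shape are invented for illustration):

```python
import numpy as np
from scipy.ndimage import zoom

# Hypothetical volume: 2.0 mm slice thickness, 0.9 mm in-plane spacing.
volume = np.random.rand(40, 256, 256)      # axes: (z, y, x)
spacing = np.array([2.0, 0.9, 0.9])        # mm per voxel along each axis
target = 0.9                               # desired isotropic spacing in mm

# Zoom factor per axis = current spacing / target spacing.
factors = spacing / target
isotropic = zoom(volume, factors, order=1)  # order=1: trilinear interpolation

print(volume.shape, '->', isotropic.shape)
```

Only the z axis is stretched here (its voxels are coarser than the in-plane ones), which is the typical case for slice-based acquisitions.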


and it’s displayed with a completely different algorithm…


Conda is indeed slow. However, mamba is a drop-in replacement for conda that uses a much faster solver, which makes it a lot more palatable.


Does it use a SAT solver with better average-case behavior, or does it sacrifice full SAT solvability?


Without fully solving the dependencies, it is impossible to install packages. This is anecdotal, but I find mamba better at solving tricky dependency requirements, like a certain version of Python with a certain version of PyTorch with CUDA support and a certain protobuf version.


Not quite what you are looking for, but if you're interested in Operation Market Garden: for Dutch maps there is https://www.topotijdreis.nl, which gives you historical maps with a year slider. This can at least help one visualize how cities, villages, and topography changed through the years.


There are also tools that wrap part of topotijdreis and add other georeferenced historical maps! I recently saw one of those at https://geodienst.xyz/pastforward. I wish more people georeferenced historical maps, but it is tough.


CGP Grey also made an excellent video about it, which he dubbed the NaPoVoInterCo: https://www.youtube.com/watch?v=tUX-frlNBJY


But Chinese (or Mandarin) does not have a context-free grammar, whereas I believe that encoding a language on a Turing machine implies a context-free grammar, so this example doesn't hold.


Well, a couple of points: it's not obvious that Chinese doesn't have a context-free grammar; see the talk by David Branner, "The Grammar of Classical Chinese is Very Close to Being a Context-Free Grammar".

And a properly programmed Turing machine can parse languages that are far more complex than context-free ones.
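The textbook example of a language beyond context-free power is { aⁿbⁿcⁿ : n ≥ 0 }: no context-free grammar generates it, yet any general-purpose program (i.e., a Turing machine) recognizes it trivially. A toy recognizer:

```python
def is_anbncn(s: str) -> bool:
    """Recognize the non-context-free language { a^n b^n c^n : n >= 0 }."""
    n = len(s) // 3
    if len(s) != 3 * n:
        return False
    # All three blocks must have equal length and appear in order.
    return s == 'a' * n + 'b' * n + 'c' * n

print(is_anbncn('aaabbbccc'))  # in the language
print(is_anbncn('aabbbcc'))   # counts don't match
```

This is why "parsable by a Turing machine" says nothing about a language being context-free.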


I think maybe it's poorly phrased. As far as I can tell, their linear regression example for eq. 2 has a unique solution, but I think they state that when optimizing for cosine similarity you can find non-unique solutions. I haven't read it in detail, though.

Then again, you could argue whether that is a problem when considering very high-dimensional embeddings. Their conclusions seem to point in that direction, but I would not agree with that.
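One simple way to see where non-uniqueness can come from (this is a generic illustration, not the paper's exact setup): cosine similarity is invariant to positive rescaling, so any objective built only on it cannot pin down the norm of a solution.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(0)
target = rng.standard_normal(8)
w = rng.standard_normal(8)

# Any positive multiple of w scores identically, so an optimizer
# maximizing cosine similarity has infinitely many equally good solutions.
s1 = cosine(w, target)
s2 = cosine(3.7 * w, target)
print(s1, s2)
```

Whether this degeneracy matters in practice for high-dimensional embeddings is exactly the point under debate above.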

