Is anyone here worried about pouring hot water into Aeropress plastic? I know it's polypropylene, which is BPA-Free but could there be things that we haven't properly researched yet?
If you want to avoid plastic, Hario Switch is a great alternative. It allows you to steep and release, kinda similar to AeroPress (pressure only comes from gravity though).
The cone is made of glass and the stopper is stainless steel, but the base is made of silicone. I imagine the hot water has far less contact with the silicone than it does with the AeroPress plastic.
If you get the size 03, you can more easily brew enough coffee for 2 people.
Heating plastics can release not just BPA but also phthalates, dioxins, and other harmful compounds, which are linked to lots of nasty health issues like hormone disruption, cancer and reproductive problems.
Polypropylene is generally considered one of the safer plastics; its high heat tolerance means it doesn't readily leach harmful chemicals. But while pure polypropylene may be considered safe, additives used in manufacturing, such as colourants or fillers, could carry their own health risks.
Personally I avoid mixing food or drinks with any heated plastics, including my coffee. I use a French press, but if the AeroPress is your thing and you want to avoid plastic, I've heard good things about the Cafelat Robot, which is made almost entirely of aluminium and stainless steel.
We all know .ipynb JSON format is not a great fit for Git. The Jupyter ecosystem has come a long way in the last few years. Solving this really comes down to a few tools -
- JupyterLab Git Extension[1] for local diffs (pre-commit diffs)
- nbdime[2] / nbdev[3] for resolving .ipynb git merge conflicts
- GitHub PR code reviews with ReviewNB[4]
- Alternatively, if you don't care about cell outputs then Jupytext[5] to sync .ipynb JSON to markdown
Disclaimer: I built ReviewNB. It's a completely bootstrapped business, 5 years in the making and now used by leading DS teams at Meta, AWS, NASA JPL, AirBnB, Lyft, Affirm, AMD, Microsoft & more[6] for Jupyter Notebook code reviews on GitHub / Bitbucket.
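A quick note on [5]: besides the CLI, jupytext also exposes a small Python API, so the two-way .ipynb/markdown sync can be scripted. A minimal sketch, assuming jupytext's `read`/`write` functions; the file names are placeholders:

```python
import jupytext

# Read the JSON notebook and write a markdown twin that diffs cleanly in git
# (the markdown representation carries no cell outputs).
nb = jupytext.read("analysis.ipynb")
jupytext.write(nb, "analysis.md")

# The round trip restores a runnable .ipynb, still without outputs.
nb_back = jupytext.read("analysis.md")
jupytext.write(nb_back, "analysis_roundtrip.ipynb")
```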
> Alternatively, if you don't care about cell outputs then Jupytext[5] to sync .ipynb JSON to markdown
Notice that using markdown is a possibility for jupytext, but not the only one. More interestingly, you can also store your notebooks as plain python files, whose comments are interpreted as the markdown cells of the notebook.
This is very useful, and not only for version control: if your notebooks are python files, they can easily be executed in CI or by third parties just by launching the interpreter. You don't even need the jupyterlab dependency.
With some care, you can craft a single python file "foo.py" that can be used at the same time as
1. an executable command-line program (that happens to be written in python)
2. an importable python module
3. a jupyter notebook (to open it you need the jupytext extension of jupyter)
4. the documentation with auto-generated figures, convertible to html or to pdf using "jupyter nbconvert --execute"
5. a regular .ipynb if for some reason you want to distribute the outputs in a re-executable format
For small simple projects, to showcase, describe and illustrate an independent algorithm, we have found this structure invaluable.
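For concreteness, here is roughly what such a "foo.py" can look like in jupytext's percent format. This is only a sketch: the smoothing function, the figure and the --width flag are illustrative filler, not taken from any real project.

```python
# %% [markdown]
# # foo: a tiny demo stored as a plain Python file
# When opened via the jupytext extension, `# %% [markdown]` cells render
# as the notebook's markdown cells; `# %%` starts a code cell.

# %%
import argparse

import matplotlib.pyplot as plt
import numpy as np


# %%
def boxcar_smooth(x, width=5):
    """Moving-average smoothing, importable via `import foo`."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")


# %% [markdown]
# The guarded cell below runs when the file is executed as a script or when
# the notebook is run top to bottom, but not on a plain `import foo`.

# %%
if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Smooth a noisy sine wave")
    parser.add_argument("--width", type=int, default=5)
    args, _ = parser.parse_known_args()  # tolerate Jupyter's extra argv

    t = np.linspace(0, 2 * np.pi, 200)
    noisy = np.sin(t) + 0.2 * np.random.randn(t.size)
    plt.plot(t, noisy, label="noisy")
    plt.plot(t, boxcar_smooth(noisy, args.width), label="smoothed")
    plt.legend()
    plt.show()
```

Then `python foo.py` runs it as a program, `import foo` gives you `boxcar_smooth`, and `jupytext --to ipynb foo.py` (or opening it in JupyterLab with the jupytext extension) gives you the notebook view back.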
This is a post from my Linkedin page on my hopes for Jupyter notebooks and git. Anyone know of progress along this line?
#Jupyter notebook and git
As much as Jupyter Notebooks have been a great tool for data science, the transition to deployment and the general software-engineering friendliness of notebooks could use some work. From time to time, I have explored how others have dealt with turning notebooks into an organized codebase and outputs, but to date I have not found an approach I'm comfortable with. My ideal would be something like the 'node metadata' of [Leo Editor](https://leo-editor.github.io/leo-editor/), functioning as 'decorators' on a notebook cell for integration with git.
By this I mean using something like special markers in Python comments (since much of data science is done with Python) to map the content of a cell (or output) to a git repository. Better yet, define a special cell type for git metadata preceding a code cell. Then implement some basic git operations on the contents of a cell. Let's suppose we use @@git as the marker for git metadata in comments.
--- beginning of cell ---
# @@git %upstream%=https://github.com/pyro-ppl/pyro
# @@git %local%=~/repo/pyrodev
# @@git %branch%=burnburnburn
# @@git %file%=examples/cvae/util.py
# Here begins the contents of the util.py file
...
--- end of cell ---
An extension would implement menubar items for various git operations:
- stage - stage the cell's content as the util.py file
- checkout - check out from upstream, replace the local copy, and refresh the contents of the cell
- commit - commit the staged file specified by %file%
- status - ...
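To make the idea concrete, here is a rough sketch of how such an extension might pull the @@git metadata out of a cell's source. Nothing below is an existing API; `parse_git_metadata` and the regex are just my illustration of the proposed marker format:

```python
import re

# Matches lines like: # @@git %file%=examples/cvae/util.py
GIT_MARKER = re.compile(r"^#\s*@@git\s+%(\w+)%=(.+)$")

def parse_git_metadata(cell_source: str) -> dict:
    """Collect @@git key/value pairs from the top of a cell's source."""
    meta = {}
    for line in cell_source.splitlines():
        m = GIT_MARKER.match(line.strip())
        if m:
            meta[m.group(1)] = m.group(2).strip()
        elif line.strip():
            break  # metadata block ends at the first non-marker line
    return meta

# For the example cell above this returns:
# {"upstream": "https://github.com/pyro-ppl/pyro",
#  "local": "~/repo/pyrodev",
#  "branch": "burnburnburn",
#  "file": "examples/cvae/util.py"}
```

The stage/checkout/commit menu items would then act on `meta["local"]`, `meta["branch"]` and `meta["file"]`.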
The imagined workflow is that once a working idea, scattered throughout a notebook, has been sketched out, the user marks the notebook cells that should be mapped to files in a git repository. This could also be used in a mixed dev/data-science environment, where library code under development can be pulled right into a notebook.
Yes, there will be problems with committing code whose comments are specific to one user, which is why a special cell type makes sense. Yes, there will be problems I can't even imagine right now, but ...
Please message me if you know of a cell-based git extension for Jupyter Notebooks.
I'm just happy that Microsoft is finally charging directly for a developer product!
When they keep giving out freebies (VSCode, npm etc.), I never know which direction the product is going to evolve (e.g. unnecessarily tight integration with Azure).
With this, there's at least direct alignment between end user & the product.
Here are tools people commonly use for notebook version control with git -
[1] nbdime to view local diffs & merge changes
[2] jupytext for 2-way sync between notebook & markdown/scripts
[3] JupyterLab git extension for git clone / pull / push & see visual diffs
[4] JupyterLab GitPlus to create GitHub PRs from JupyterLab
[5] ReviewNB for reviewing & diff'ing notebook PRs / Commits on GitHub
Disclaimer: While I'm the author of the last two (GitPlus & ReviewNB), I've represented the overall landscape in an unbiased way. I've been working on this specific problem for 3+ years & regularly talk to teams who use GitHub with notebooks.
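On [1]: besides the `nbdiff` / `nbdiff-web` commands, nbdime also has a Python API if you want programmatic, cell-aware diffs. A minimal sketch, assuming `nbdime.diff_notebooks` and nbformat; the file names are placeholders:

```python
import nbformat
from nbdime import diff_notebooks

# Load two revisions of the same notebook (placeholder paths).
before = nbformat.read("analysis_old.ipynb", as_version=4)
after = nbformat.read("analysis_new.ipynb", as_version=4)

# Returns a structured, cell-aware diff instead of a raw JSON text diff,
# which is what makes notebook diffs readable in the first place.
for change in diff_notebooks(before, after):
    print(change)
```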
The problems you mention are solved by auxiliary tools in the notebook ecosystem.
- Look at nbdime & ReviewNB for git diffs
- Check out treon & nbdev for testing
- See jupytext for keeping .py & .ipynb in sync
I agree it's a bit of a pain to install & configure a bunch of auxiliary tools but once set up properly they do solve most of the issues in the Jupyter notebook workflow.
This is great. Other comments here are underestimating the power of a group. An average bounty on an issue might be just $10-$100, but there can easily be 50 people putting up a bounty for a popular issue (have you seen issues with hundreds of upvotes that are still open years later?). If we start seeing a (collective) bounty on an issue in the range of a few thousand dollars, then it's decent money for maintainers or anyone else who's motivated enough.
My guess is this is eventually going to be built into GitHub (via acquisition or otherwise).
Open source is a public good, which means open source bugfixing is likely to be undersupplied by private demand. I think this needs a kickstarter-like way for groups of people to "contribute only if others also do so".
Was just thinking the same thing about the "built into GitHub" point. Being able to easily see the current bounty directly in GitHub and add $10 to it with one click would, I imagine, make adoption blow up.
Yeah, and now extend this idea: start a bounty of $100 for the maintainer of a library/framework to merge your pull request. Completely new ways for open source projects to be paid for their hard work.
Why not just use JupyterLab with GitHub? You get all those enterprise features with GitHub, and you can augment the shortcomings with auxiliary tools, e.g. the JupyterLab git extension & ReviewNB for notebook version control.
First of all, we use your product and it's amazing. Thanks for building it. I think JL + GitHub is just one part of the equation; I'd divide it into 3 levels: Notebooks + CI, Notebooks + CI/CD, and Notebooks + CI/CD + model monitoring. At the first level, when I'm developing with JL + GitHub, I need to provide security: access to datasets, tables and resources (GPU, machine type). Currently, with cloud products, I can achieve that only partially. Since I have data in multiple clouds, I don't have a central place to fine-tune the permissions. For now it's OK, but it becomes more of an admin burden where I set up IAM access for each user.