I've been wanting to ask what oh_sigh asked for a long time, but the 'reply' option wasn't available on your pinned comment. Do you disable replies to your comments on some posts?
Since darktable introduced the scene-referred workflow, I would argue it has an extremely steep learning curve for anyone coming from display-referred software (Lightroom, Capture One, etc.).
Scene referred is very much a cine thing, and I know of no other still-image software that uses it. dt v3.6 made scene-referred the default.
Aurélien has been one of the driving forces behind the switch to scene-referred, and his hour-long explanations of the new modules can be found on his YouTube channel[1]. Boris has a YouTube channel[2] which is quite the opposite: you watch him edit in silence, and you will pick up the craft.
RawTherapee has a more traditional, display referred, approach. If you want local adjustments (which you will want), you need to use nightlies until 5.9 is released.
For anyone who cares just enough to wonder what "scene referred" and "display referred" actually mean…
scene referred does its filtering and mapping work on the pixel values in a linear space, presumably linear to "photons observed" but I could be wrong there.
display referred does its filtering and mapping on pixel values which have already been through a nonlinear function to make them appropriate for display.
The scene referred people say they tend not to make silly skin tones, they avoid "rat piss" sunsets, and they reduce artifacts like halos around blurs. (I presume that means the blurs are something like Gaussian blurs performing linear operations on the nonlinear data.)
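To make the blur/halo point concrete, here is a rough numpy sketch (mine, not darktable's code) of what happens when a blur averages already gamma-encoded values instead of linear ones:

    # Averaging (a 2-pixel "blur") in linear vs sRGB-encoded space.
    import numpy as np

    def srgb_encode(linear):
        # Approximate sRGB OETF (the nonlinear "display" encoding).
        linear = np.asarray(linear, dtype=float)
        return np.where(linear <= 0.0031308,
                        12.92 * linear,
                        1.055 * linear ** (1 / 2.4) - 0.055)

    def srgb_decode(encoded):
        # Inverse of the sRGB OETF, back to linear light.
        encoded = np.asarray(encoded, dtype=float)
        return np.where(encoded <= 0.04045,
                        encoded / 12.92,
                        ((encoded + 0.055) / 1.055) ** 2.4)

    black, white = 0.0, 1.0   # linear light values of two neighbouring pixels

    # Scene-referred: average the linear values -- a physically correct mix of light.
    linear_mix = (black + white) / 2                             # 0.5

    # Display-referred: average the already-encoded values, then see how much
    # light that result actually represents.
    display_mix = (srgb_encode(black) + srgb_encode(white)) / 2  # 0.5 encoded
    print(linear_mix, srgb_decode(display_mix))                  # 0.5 vs ~0.21

The encoded-space average is noticeably darker than the true mix (about 0.21 vs 0.5 in linear terms), which is roughly where the dark fringing and halos around high-contrast edges come from.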
> scene referred does its filtering and mapping work on the pixel values in a linear space, presumably linear to "photons observed" but I could be wrong there.
You are correct. If you see Y light in real life, and that corresponds to a pixel value of 1000, 2Y light should correspond to 2000.
> display referred does its filtering and mapping on pixel values which have already been through a nonlinear function to make them appropriate for display.
The nonlinear function usually maps values into the medium's range from 0 to 100% brightness (100% being, for instance, pure white on a printed piece of paper or pure white on a display). Scene-referred photos technically have no upper bound (there is clipping, but you can just expose less and increase the exposure in editing to effectively get the same thing).
Of note: you have to convert a scene-referred color space into a display-referred color space at some point anyway...
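To make that conversion concrete, here's a minimal sketch of the final scene-referred to display-referred step; a plain 1/2.2 power stands in here for the real display transfer function:

    import numpy as np

    scene = np.array([0.05, 0.18, 0.9, 2.5, 8.0])   # linear light, no upper bound

    bounded = np.clip(scene, 0.0, 1.0)    # naive version: everything brighter than
                                          # "display white" simply clips
    display = bounded ** (1 / 2.2)        # nonlinear encoding for the display medium

    print(display)                        # every value now fits the 0..1 display range

In darktable's scene-referred pipeline, a tone-mapping module such as filmic does this job and replaces the hard clip with a smoother roll-off.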
I have to admit I still don't get Filmic. I have a basic grasp of it, but I am much more comfortable with shadow and highlight adjustments. I have watched pretty much all the tutorials I could find, but I am struggling big time. I have no idea how other people feel about it, but I am starting to wonder if the scene-referred workflow is more an intellectual exercise and less a practical tool.
Filmic mimics the nonlinear dynamic-range response of the film medium to push the contrast and give the shot a dramatic look.
It also highlights how old film struggles compared to the much more capable CMOS sensors we have today.
I use filmic rather frequently to get the look / emotional response for a shot; however, it's not always suitable. Sometimes the lighting is not favorable for it, and sometimes the shot at hand is not well suited to it for other reasons.
Shadows, highlights, exposure, etc. are corrective tools, while filmic tends to be an artistic one.
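For anyone who wants to see the shape rather than read about it, here's a small sketch using John Hable's well-known filmic operator. It is not darktable's filmic rgb module, but it shows the same idea: a toe for shadows, roughly linear midtones, and a shoulder that rolls highlights off instead of clipping them:

    import numpy as np

    def hable(x, A=0.15, B=0.50, C=0.10, D=0.20, E=0.02, F=0.30):
        # Hable's "Uncharted 2" filmic curve.
        x = np.asarray(x, dtype=float)
        return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

    def filmic_tonemap(linear, white_point=11.2):
        # Normalise so that white_point (in linear terms) maps exactly to display white.
        return hable(linear) / hable(white_point)

    scene = np.array([0.01, 0.18, 1.0, 4.0, 11.2])   # linear values, grey card at 0.18
    print(filmic_tonemap(scene))                     # smooth roll-off, no hard clipping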
None of what you said makes any sense. For people like me, here is an explanation of the scene-referred workflow:
"The scene-referred workflow places an emphasis on performing image processing in the linear scene-referred part of the pixelpipe. This helps to reduce artifacts and color shifts that can result from processing non-linear pixel values and, by decoupling the image processing from the characteristics of a specific display, it makes it easier to adapt your work in the future to new display media, such as high dynamic range displays."
The gist is whether you do your color grading in the camera's original color space (scene referred) or after transforming to the display color space (display referred).
> you do your color grading in the camera's original color space
What that actually means is that you do your color grading in a linear color space (or something that roughly maps to one). Linear meaning the data number of a pixel is supposed to be proportional to its exposure, and the exposure is theoretically unbounded (since you can have anywhere between zero and infinite light). A display color space (for photography this is more in line with a "print media color space") has a maximum and a minimum brightness, and all scene brightnesses get mapped between the two.
The math is a lot cleaner with linear color spaces.
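A tiny illustration of both points (the unbounded range and the cleaner math); the numbers are made up:

    import numpy as np

    scene = np.array([0.8, 1.6, 3.2])    # linear; two values brighter than display white

    # Scene-referred: exposure is a plain multiply, and highlight detail above 1.0
    # survives a 2-stop reduction.
    print(scene * 0.25)                       # [0.2, 0.4, 0.8] -- three distinct tones

    # If the values had already been clipped to the display range, that detail is gone.
    print(np.clip(scene, 0.0, 1.0) * 0.25)    # [0.2, 0.25, 0.25] -- highlights crushed flat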
However, I wish that more modern cameras and lenses were supported.
Darktable depends on the nearly-abandoned lensfun library [1] which doesn't support many cameras and lenses, and has open pull requests that have inexplicably not been merged despite being open for years [2].
Also, the process of contributing calibrations to lensfun is a mess. First, the only way to gain access to create calibrations is to email a single developer, Torsten Bronger [3]. Second, it requires a highly manual and technical workflow primarily documented via a screencast video [4]. Oh, and you may possibly have to compile a custom version of Hugin by copying a patch from the documentation file.
So it is no wonder that few volunteers are willing to contribute calibrations!
By the way, this is not meant as a jab against Dr Bronger or any other maintainers of lensfun. I am deeply appreciative of the work that these guys have done. It's just that, despite the great work, it's not nearly scalable enough to keep the project going in an effective way.
Yeah, I often do the calibrations in Python (OpenCV for distortions, and rawpy plus a flat frame for vignetting), spit out a TIFF, and then do the rest in darktable. Lensfun is complicated, so I never tried to go through that. Libraries are supposed to simplify things, and this one doesn't.
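Roughly, the pipeline looks like this (a simplified sketch, not my actual script: the file names are placeholders, and camera_matrix / dist_coeffs are assumed to come from an earlier cv2.calibrateCamera run):

    import cv2
    import numpy as np
    import rawpy
    import tifffile

    def decode_linear(path):
        # Demosaic a raw file to linear 16-bit RGB (no gamma, no auto-brightening).
        with rawpy.imread(path) as raw:
            return raw.postprocess(gamma=(1, 1), no_auto_bright=True, output_bps=16)

    image = decode_linear("shot.ARW").astype(np.float32)
    flat = decode_linear("flat_frame.ARW").astype(np.float32)

    # Vignetting: divide by the normalised flat frame.
    flat_gain = flat / flat.max()
    corrected = image / np.clip(flat_gain, 1e-4, None)

    # Distortion: undistort with intrinsics from a previous chessboard calibration.
    camera_matrix = np.load("camera_matrix.npy")   # 3x3 intrinsics
    dist_coeffs = np.load("dist_coeffs.npy")       # k1, k2, p1, p2, k3
    undistorted = cv2.undistort(corrected, camera_matrix, dist_coeffs)

    # Hand the result to darktable as a linear 16-bit TIFF.
    tifffile.imwrite("for_darktable.tif",
                     np.clip(undistorted, 0, 65535).astype(np.uint16))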
Adobe letting Lightroom "Classic" sit and rot for years, along with increasing the price substantially, has really opened the market up to competition. I'm currently using PhotoLab 4, but that's mostly because, workflow-wise, it is "good enough" while offering killer features like DeepPrime and ClearView+ (which neither Lightroom Classic nor Darktable has an answer to yet).
Long term, though, I suspect Darktable will be the final victor if they can find a consistent group of contributors. The biggest limitation right now is the lack of proprietary plugin support; for example, if they had PureRaw plugin support, I could use Darktable instead of PhotoLab 4 today.
I am also on PhotoLab and have been very happy with it; the killer features for me are the spot removal tool and the perpetual license model. Darktable would have been able to satisfy both of those, but when I tried it years ago, I had to go back to the menu to select the tool for each spot I wanted to remove, and it was just not workable. Darktable had many interface annoyances of this sort, and I am not sure if it has improved since.
I've just recently been combining Photolab with Darktable.
DeepPRIME really is a killer feature. It produces images that are basically noise-free without turning the whole image into a smooth featureless blob.
But I also really like Darktable's Filmic RGB module and colour management (you can output P3 images and actually make use of the wide gamut displays that nearly every modern device has).
So I've taken to this process:
- Cull my images, putting everything but the ones that make the cut into a "Rejected" folder. I used FastRawViewer at first but I've warmed to Digikam.
- Run DeepPRIME and DxO's optical corrections with Photolab and export a DNG.
- Do final adjustments and exports in Darktable.
Basically, Photolab allows me to use Darktable but skip Lensfun (which others have mentioned is a bit of a weak point in Darktable) and push my shadows as far as I want without noise.
I agree, the tools it provides are very powerful. I also really appreciate that it's less opinionated than other photo editors; it makes me consider what I want the final image to look like before I start working on it.
I recently started on my photography journey and am just getting started with darktable. My biggest challenge is the multiple modules with apparently very similar functionality. It has been very hard to figure out what I should do to get to the end goal I have in mind. FWIW, most of my work is underwater photography, and what I really want is to adjust exposure and white balance and to de-saturate the aqua and blue in masked areas, but I got stuck with filmic rgb and a couple of other modules, none of which is easy for a beginner.
I use Darktable for almost everything. Except for a few occasional UI freezes it's amazing. Some samples of photos I have taken and processed in DarkTable
For me, RawTherapee has a much more intuitive UI that you don't have to learn much; you just pick it up. On the downside, it has a bit fewer features. One feature I'm missing in particular is retouch, but apparently it's already there in non-stable builds and will be released in the next version. Darktable also didn't have a matching color profile for my Sony camera, which RawTherapee had out of the box. RawTherapee seems much faster and more responsive after adding some filters and edits to a photo. I'm not a professional, btw.
I've used it and I loved it, but then I upgraded my camera, and this software does not support the new Canon CR3 RAW files yet... Sadly, I'll switch to Lightroom until it does...
It looks like the ability to read the CR3 format may be available in freshly-compiled distributions of darktable soon, at least for some cameras. It relies on the exiv2 image library, which added support[1] for CR3 in v0.27.4.
That said, there are some notes in the darktable issue[2] that suggest that the darktable rawspeed library also requires some updates[3].
There seem to be many darktable tutorials on YouTube, so which ones do people recommend? I've watched a couple of intro ones but am looking for good ones on digital workflows.
I wonder if they’ve finally fixed the performance in macOS. Earlier versions had a redrawing issue where the GUI was being constantly updated upon mouse move…
VScode is at least free as in beer (with VScodium being free-free, if a little less convenient to use). Lightroom is a paid program that requires getting into Adobe's whole subscription mess.
Darktable still has a GUI. It isn't nearly as barebones or discovery-hostile as vim (I love vim but there's a learning curve). I used darktable to touch up some photos, and was able to stumble through as a complete novice.
IMO a better comparison would be Blender vs <insert your favorite proprietary 3d modeling tool here>.
It is free to try; my recommendation to anyone who is curious about it is to give it a try and see if it works for you.
I use Darktable exclusively for all of my photographic work, and have for years. It continues to improve year after year and the backward compatibility has been wonderful; anything I edited in the earliest days with Darktable still renders the same today.
Darktable 3.6 - https://news.ycombinator.com/item?id=27720908 - July 2021 (73 comments)
Darktable 3.2.1 - https://news.ycombinator.com/item?id=24156113 - Aug 2020 (47 comments)
Darktable 3.0 - https://news.ycombinator.com/item?id=21874528 - Dec 2019 (120 comments)
Darktable 3.0 Approaching with Many New Features - https://news.ycombinator.com/item?id=21440612 - Nov 2019 (1 comment)
Darktable 2.4.0 released - https://news.ycombinator.com/item?id=16012499 - Dec 2017 (74 comments)
Darktable 2.2.0 released - https://news.ycombinator.com/item?id=13261849 - Dec 2016 (82 comments)
Why don't you provide a Windows build? - https://news.ycombinator.com/item?id=12094038 - July 2016 (3 comments)
Darktable 2.0 released - https://news.ycombinator.com/item?id=10789390 - Dec 2015 (44 comments)
A look at Darktable 2.0 - https://news.ycombinator.com/item?id=10640753 - Nov 2015 (10 comments)
Why there are no darktable builds for Windows - https://news.ycombinator.com/item?id=9883018 - July 2015 (1 comment)