> The technology that’s replacing the green screen [...]
This only replaces matting (which includes blue screen) in a very specific type of photography for now: namely, when the background is out of focus.
You aren't. In fact, as a non-native speaker I find it difficult not to misread, since I didn't even know what matting was. And "real-time" and "human" go perfectly well with "mating".
What tech? There is no source on their github and natural image matting has been an area of constant research for 20 years. You can search for bayesian matting, poisson matting and closed form matting for early influential papers.
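For context, all of those classic methods (Bayesian, Poisson, closed-form) are ways of attacking the same underlying compositing equation, I = alpha*F + (1-alpha)*B, where matting means recovering alpha (and F) from the observed image I. A minimal NumPy sketch of that equation, using made-up toy values purely for illustration, might look like:

```python
import numpy as np

# Compositing equation behind all matting methods: I = alpha*F + (1-alpha)*B.
# Matting is the inverse problem: given I (and sometimes B), recover alpha and F.
# Toy 2x2 single-channel "image"; the values are illustrative only.
alpha = np.array([[1.0, 0.5],
                  [0.25, 0.0]])   # 1 = pure foreground, 0 = pure background
F = np.full((2, 2), 0.9)         # assumed foreground color
B = np.full((2, 2), 0.1)         # assumed background color

I = alpha * F + (1.0 - alpha) * B  # observed composite image

# When the background is known (green/blue screen, or "background matting"
# approaches), alpha can be recovered directly wherever F != B:
alpha_recovered = (I - B) / (F - B)
print(np.allclose(alpha, alpha_recovered))  # True
```

The hard research problem the papers above tackle is the case where F, B, and alpha are all unknown, which is why priors (Bayesian), gradient constraints (Poisson), or local color-line models (closed-form) are needed.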
It would be Freudian if there were some subconscious tendency at play to misread it. But here, even though "mating" is sexual, the misreading is easy and effortless because "matting" is an obscure industry term that looks almost identical to "mating".
Could be tiredness as well. When I saw the headline on the front page earlier, I read the word correctly. When I alt+tabbed to it just now, being tired, I did a double take because I read it as "mating".
The real problem is not having a first-party solution to use our damn phones as webcams across PCs/Macs natively, without resorting to flaky IP-cam software or virtual loopback devices, which are laggy when they work at all.
I've been working on a macOS and iOS app to do just this, if you have an older iPhone lying around. https://telecast.camera/
I've been using it to record multicam music performances/practice sessions as I didn't have an expensive DSLR. The latency of other solutions was too high to easily sync guitar strumming and audio-in from an audio interface.
I'd love feedback! The linked network is of particular interest, as I'd like to add support for background replacement in the future too.
If you want to transmit the camera stream wirelessly in real time that effectively makes the phone an IP cam, with all the flakiness that often entails. It's not an easy problem.
I'm also not quite sure which problem is solved by using your phone as a webcam. Actual webcams are quite cheap and come with far better mounting hardware than a phone. And for freehand videos like those in the paper, the real-time requirement isn't very common; when it is present, it's usually sufficient to do the processing on the phone and stream directly from there. There has to be something I'm not seeing?
Part of it is that webcams have been nearly impossible to buy since the pandemic began, but apart from that, webcam image quality is terrible, even for expensive webcams in the hundreds-of-euros range.
A recent iPhone will wipe the floor with any webcam I've come across, short of an SLR or mirrorless camera.
For what it's worth, the NDIHX software works well on iPhone and Mac through OBS, and is at least available on Windows.
I'd like Macs to have an official loopback device so video effects wouldn't have to be implemented in every program separately. For example an overlay that worked in Facetime/Zoom/Teams/etc.
Isn't that sort of how SnapCamera and the like work today? You give the physical camera to Snap, which does the video processing and then creates a virtual camera device that you can select as your camera in Zoom. It's not exactly as you describe (and FaceTime doesn't respect/offer it), but Zoom doesn't have to implement the effects and can still use the device.
Yes, but it's fragile. Zoom has disabled whatever permission is needed for a virtual camera in the past, and I think Skype requires modifying the binary to enable it. The OBS virtual camera seems to work well now, but that's a recent change and I'm not sure I trust it to keep working on new versions of macOS.
I agree! I've recently been wondering whether there's any reason why, for example, Android devices with a camera don't implement the USB Video Class specification, other than that nobody cared enough.
Even low-end Android phones take selfie video of far better quality than any built-in laptop webcam I've used. And unless you spend €200 on a top-end webcam, you won't get near that quality. At €200 you can actually get a pretty decent phone like the OnePlus Nord, which also has a decent camera.
> We first plan to publish an online image/video matting demo along with the pre-trained model in Dec. 2020.
> We then plan to release the code of supervised training and unsupervised SOC in Jan. 2021.
> We finally plan to open source the PHM-100 validation benchmark in Feb. 2021.
This sounds awesome so don't take my comment as a criticism, but it is interesting that it highlights a difference between academics and non-academics - academics often publish papers before code, whereas non-academics/hackers publish code, and then probably never get around to writing the documentation :)
How Green Screen Worked Before Computers
https://www.youtube.com/watch?v=msPCQgRPPjI
Hollywood's History of Faking It | The Evolution of Greenscreen Compositing
https://www.youtube.com/watch?v=H8aoUXjSfsI
Zoran Perisic's 'Zoptic' front-projection system used in Superman: The Movie
https://www.fxphd.com/fxblog/effects-of-days-past-making-sup...
https://www.youtube.com/watch?v=mbXC16p8tNc
The technology that’s replacing the green screen
https://www.youtube.com/watch?v=8yNkBic7GfI
Mandalorian 'Stagecraft' technology
https://www.slashfilm.com/the-mandalorian-stagecraft-photos/