Why can’t you buy a good webcam? (vsevolod.net)
783 points by murkt on Dec 22, 2020 | 773 comments



The reason you can’t buy a good webcam is the same reason you can’t buy a high quality monitor outside of LG’s unreliable Apple collab.

The lazy conglomerates who sell these peripherals often don’t actually produce the parts in them. They simply rebrand commodity cameras and IPS panels in a crap plastic housing and slap their logo on it.

Then they give the product a hilariously user-hostile product name, like “PQS GRT46782-WT” as an extra f-you to the user.

They don’t care about you because they have no ongoing relationship with you, and their executives mistakenly see their own products as commodities.

Combine this with the fact that most home users don’t care about good quality or even know what it is, and you have the current situation.

A friend once described the peripheral market as “Assholes selling crap to idiots.”


> Then they give the product a hilariously user-hostile product name, like “PQS GRT46782-WT” as an extra f-you to the user.

That is also a strategy to prevent product comparisons and unbiased reviews. They quickly cycle through product names and sell a certain product no. only in a limited geographical area.

Doesn't matter if a consumer org/magazine/someone on reddit/your friend/etc. does a review. The product will be out of market by the time you read it, or will not be sold in your country. The similar looking product you find on the shelf might be the same, or it might have something completely different inside.


That's exactly what I have been telling folks about Apple vs. other laptops. Apple only has a handful of laptop models for sale and they don't even change that much across generations. Furthermore they seem to exhibit hardly any manufacturing variations within each generation. That means that if there is a problem (and yes, there were a few big ones), everyone is affected the same, everyone is screaming about it, the majority of customers are not corporate customers, and eventually a class action lawsuit is set up and Apple will often (grudgingly) offer to fix/replace broken units for free, like what happened with Staingate or the Butterfly switches.

Now compare that to buying a model from Dell or Lenovo, where the current product lineup is already 2-3x the size, the models are sometimes discontinued, sometimes changed significantly between refreshes, often refreshed annually, oftentimes configurable in a meaningful way (1080p non-glossy vs. 1080p privacy screen vs. 4k glossy vs. 4k touch screen), sometimes just available in certain geographical locations and they exhibit more intra-generation manufacturing differences. The chances of finding other folks with your exact same permutation (and same day of the week it was manufactured) of these options are much smaller, so you stand less of a chance of getting something recognized as a fundamental manufacturing issue which should be covered for free by the vendor. Plus, even if you can get repair/replacement for free, you still fear that your specific model has a flaw, so you might only get lucky after having it replaced 2-3 times.

I've seen it happen with Dell and Lenovo where folks sent back brand-new units repeatedly because the first one had overheating issues with the SSD, the second one had really noisy capacitors and the third one had a display cable that wasn't seated correctly. At least with Apple I know that if I'm getting screwed, I'm in the same boat with everyone else ;)


That last paragraph is indicative of terrible quality and quality control.

Fortunately that hasn't been my experience with the recent Dell, Asus, Lenovo and HP laptops I've purchased. Each has been without any issues at all.

But my point here is that... it sounds like you're arguing against consumer choice. You can have your Model T in any color as long as it is black. And this is Apple's model. You can have your product in any configuration as long as it's the one configuration Apple offers. Apple tried this, actually, for a long time. The iPhone started out with extremely limited configurations and only more recently branched out beyond two. In a way, I agree with the confusion, because now I know I can't just say "MacBook": there is at least one MacBook without a suffix, at least one Air, and the MacBook Pro has a myriad of configurations - different sizes, with or without touch bar, etc.

Why did Apple start offering more options? Because that's what consumers want. Dell exists exactly because they were the first big PC manufacturer to accept configuration orders, and then (relatively) quickly manufacture and deliver those custom configurations to users. Consumers want this. As Apple's share of the market grows, they will have to meet the consumers where they are - or their market share will be limited by the limitations they place upon themselves.

Now, I agree that, for example, the variety of models between different geographic locations is - if nothing else - annoying. Especially when nicer options aren't offered in your location! But I don't agree with the offered example of getting bad replacements. Maybe buying one laptop a year isn't enough to experience these issues.


Actually, from what I've heard from Linux distro maintainers (mainly from Fedora), it seems like Apple does quite substantial hardware revisions internally without telling anyone: users report that their MacBook 20XX does not work with a Linux distro while other users report that, on paper, the same device of the same model year works fine. IIRC the estimate was about 4 hardware revisions per year.

This really complicates the already limited attempts to run Linux distros on the Mac hardware as users simply can't reliably tell if their hardware will be compatible beforehand.

On the macOS side they likely paper over the differences, quirks and bugs in firmware and drivers, so the user does not notice anything and they can change the hardware underneath as needed.


Consumer choice only means anything if the consumers can choose something they know about.

If the choice is between "you can know about it" xor "you can buy it", there's no real choice.


Apple has its fair share of issues, and it's often a pain to diagnose them remotely without the user looking up the specific model code, which isn't all that easy to identify. They often don't change any visual appearance and certainly don't distinguish between different models in their marketing. It's easy for me to look up what common issues Lenovo has had with a specific T series model, but if I'm buying a used MacBook, it's hard to know what I should be searching for until the seller has told me the exact model number. And even still, there's variance between machines of the same model because sometimes different panels and SSDs are used for the same model number.


> without the user looking up the specific model code, which isn't all that easy to identify

My 7-year-old MacBook has "A1465" written in perfectly legible text on the bottom. "About This Mac" has the serial number two clicks away, which is convertible online to exact specifications.


My old laptop from 2006 had nc8430 written close to the screen hinges. The new one from 2014 has ZBook at the bottom left of the keyboard. Maybe it's one of the differences between $300 laptops and $1000+ ones.


Except what you said is not at all true. Dell, hp, and Lenovo have a whole bunch of different models for different laptops. They're also very customizable, so the model may be slightly different if you want different hardware. But you're arguing for less consumer choice. Dell, hp, and Lenovo do not swap models around to try to fool consumers. Quite the contrary; you can search a very old model number and find exactly what you're looking for on their site.


They all cater to a different need. I donated a few laptops to a school in India. These were some of the cheapest models Lenovo had to offer. Lenovo, Dell etc. cater to a much wider range of the market that Apple does not even bother to target. You are basically comparing a Tesla with a Camry and upset that the Camry is not as good.

On the other hand, the Lenovo X1 Carbon is a pretty solid high end laptop, along with the LG Gram. The former specifically is far more customizable, repairable, upgradable and also around 30% cheaper while being more powerful.


You think that’s bad, just wait till Apple starts selling cars.


And scooters.

and revamping the US highway system with assisted driving beacons/lines etc - because they're the only ones with enough money to do so.


That raises a very interesting point: the highway system. Who owns it? Who can “attach” to it? What you said made me think about line access and Comcast using its mass to keep out Google Fiber etc. I really, REALLY do not want that battle to take place over roads.


Apple had a lot of problems with overheating. The i9 version of the MacBook especially; it throttles under slight load.


And you have said literally nothing against the deeper point the parent is making.


The geo-locking of model numbers is one of the vilest practices I've seen. I don't see it going away for any reason, it'll only be possible to combat by intl. legislation... but how do you even legislate that?


I don't see it going away as there are valid reasons for doing so for small but important market differences.

If you're making some device (for example, washing machine) which has a power cord and a knob for some mode selection with writing on it, then for the exact same internals you need different models where the power cords are different (USA, UK, Germany and Japan each require different plugs) and the writing on the device is printed in different language and with customizations such as Celsius vs Fahrenheit. You can't sell the exact same laptop model in every market because the keyboard layouts are different. Etc.


Just append the region identifier to the model number. Internally they clearly need to track the products separately, and you don’t want people getting the wrong product accidentally.


> then for the exact same internals you need different models where the power cords are different

The IEC 60320 connectors were specified for exactly that reason. Honestly, I don't get why these were not made mandatory for all kinds of appliances. There are even locking variants available if vibration is of concern.


> The IEC 60320 connectors were specified for exactly that reason. Honestly, I don't get why these were not made mandatory for all kinds of appliances. There are even locking variants available if vibration is of concern.

I'm not sure what you mean by the second sentence but you can't use most appliances made for Europe in America and vice versa. Most electronic appliances depend on the input voltage and supplying 240V can easily cause a fire. That is true for almost all electronic appliances (water heater, fan, washing machine, etc.) but not true for most "computer related devices" such as a monitor, PSU, or charger. Since those devices already operate on a much lower DC voltage, they often have transformers (not sure if that's the right word) that can scale down the current from either 120 or 240. [0]

That being said, a mandatory IEC connector (and its variants) would help a lot to cut down unnecessary e-waste. Instead of throwing away a device because the cable is damaged, you can easily order a replacement that is around $2 and high quality, instead of relying on third-party cords from a non-reputable brand that might have bad wiring. The reason they are not mandatory, though, is that most companies like to have their own connectors so that you either overpay for it or just buy a new device.

[0]: You should still always read the specs on the input current for the device though. It is dangerous to rely on the fact that similar devices can operate at 120V/240V because yours might not. You can usually see the specs on the website/packaging or usually near the input plug.


> I'm not sure what you mean by the second sentence but you can't use most appliances made for Europe in America and vice versa. Most electronic appliances depend on the input voltage and supplying 240V can easily cause a fire.

Here in America "electronic appliances" would imply the tech/gadget category like TVs or computers where "electrical appliances" would be the big household equipment. Just to clarify in case that confuses anyone else like it did me, it kind of reverses the meaning of what you're trying to say.

Anyways, at least with relatively modern gear you can generally assume that anything with batteries or USB ports runs off a switch-mode power supply, and all but the cheapest of those will happily accept pretty much anything resembling residential power.

Anything with a large motor or any kind of resistive element (lighting, heating) on the other hand is almost certainly built for a specific variety of electrical service and will likely require modification to accept anything else without releasing the magic smoke.

The stuff in between those categories, well, RTFLabel. Outside of audio and ham radio gear I'd imagine most DC stuff runs on switch mode power supplies these days.


I wonder if there are regulatory reasons that prevent IEC connectors being used in e.g. washing machines. I guess getting a water ingress protection IP rating might be harder if you have an IEC connector. The lack of an IP rating might prevent you from then installing the appliance in, say, a bathroom (depends presumably on country-specific regulation). This in turn might limit your sales.

Even if that's the case, the appliances should be easy to repair for a competent person and if necessary allow the cable be replaced.


My cheap hair drier has a switch to select the input voltage (you need to turn the dial with a screwdriver). For many devices it shouldn't be too hard to make it possible to use them both with 110V and 230V, even more so for already complex and expensive machines like a washing machine.

The biggest problem might be the amount of power a device can draw. Half the voltage gives you half the power, which is the reason why e.g. kettles are much less useful in the US.


For resistive loads (like the heating element in a kettle), half the voltage gives you a quarter of the power. (Electric kettles work just fine on 110/120; they just haven't been a thing in the US. They've been ubiquitous in Canada, although they've been pushed aside somewhat by drip coffee makers. You just need a lower-resistance heating element than would be practical with 220/240.)
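
A rough back-of-the-envelope sketch of that resistive-load arithmetic (the 2400 W / 1500 W element sizes below are just illustrative, not from the comment above):

    # P = V^2 / R for a purely resistive heating element
    def power_watts(volts, ohms):
        return volts ** 2 / ohms

    r = 240 ** 2 / 2400            # an element sized for 2400 W at 240 V -> 24 ohms
    print(power_watts(240, r))     # 2400.0 W
    print(power_watts(120, r))     # 600.0 W -- half the voltage, a quarter of the power

    print(120 ** 2 / 1500)         # 9.6 ohms -- the lower-resistance element a
                                   # 1500 W, 120 V kettle needs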


Electric kettles in the US typically are 1000W-1500W, while in Europe any kettle is 2000W-2800W. This is simply because houses are typically wired with outlets for 10A-16A everywhere, regardless of grid voltage.

That of course makes electric kettles much less useful in countries with a 110V grid. It also keeps stovetop kettles relevant in those countries, since stoves don't suffer the same power limitations.
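
Those wattage figures follow straight from P = V x I and typical outlet ratings (a quick sketch; the exact amp ratings and the practice of keeping plug-in appliances a bit below the full circuit rating vary by country and are assumptions here):

    def max_watts(volts, amps):
        return volts * amps       # P = V * I at the wall outlet

    print(max_watts(120, 15))     # 1800 W on a US 15 A circuit
    print(max_watts(120, 12))     # 1440 W -- roughly where 1000-1500 W US kettles land
    print(max_watts(230, 13))     # 2990 W on a UK 13 A plug
    print(max_watts(230, 10))     # 2300 W on a typical continental 10 A outlet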


Houses in the US are typically wired with 240v split phase, so nothing is stopping a mad lad from installing a 240v outlet in their kitchen, and running a kettle from it.


This just seems so crazy, to think of 120V. I haven't seen a house here with 120V in my entire lifetime. It sounds like a relic from the past, like the kind of thing people used back when they rode horses.


Hmm, I'm having trouble understanding what you mean. 120V is the standard household plug in North America. If you're in, say Europe, I don't imagine you'd see a house with 120V as IIRC, 220V is the standard, and you generally have to connect to your local grid. I think the parent's post about 120V was about international devices that may have to work in North America, for instance.


> you need different models where the power cords are different

That should only affect the power brick, and hence the overall SKU, not the notebook itself.

> the writing on the device is printed in different language and with customizations such as Celsius vs Fahrenheit.

What writing is there on notebooks except for keyboard labeling and the product name? The certification label on the back already shows all labels for all countries.

> You can't sell the exact same laptop model in every market because the keyboard layouts are different.

Keyboard layouts are somewhat orthogonal to regions. I'm German, but use US-layout keyboards.


Japan uses the same power plugs as the USA.


Legislation can mandate the conspicuous publication of a clear indication of the difference between models.

Now, companies can of course lie about this, in theory, but that's a bit like car manufacturers lying w.r.t. emission tests - possible, but you tend to get caught (cf. the recent Volkswagen case) so it's probably not worth it.


Here’s my lazy take: the best governing body to start petitioning about this is probably the EU. If I remember correctly, they already have some of the most consumer-friendly laws on the planet, e.g. w.r.t. planned obsolescence.


Even Apple sells different iPhone versions in different countries. It’s mostly frequency bands, but Chinese iPhones have a second SIM slot instead of an eSIM as far as I remember.


Unfortunately, regional market segmentation is something that international trade agreements have encouraged, not prevented.


Meh. It's easy for a reviewer to research online to see comparable models.


It also makes it possible for big electronics stores to give price guarantees, as the webshop undercutting them has monitors with a different product number.


It becomes even more unawesome when the same product identifier actually has different parts in it.

When looking for a nice display a couple of years ago I clearly remember reading about some that were promising yet getting confoundingly variable reports, until someone tore their hardware apart and revealed that the internals were different.


Tip: try to find out what panel the monitor you are buying has. Then look the panel up in a database like panelook.com. This way you can get the specifications without any marketing bullshit.

It also works the other way round. Find a panel that is good enough for your eyes, then see if there's a mass marketed display with that panel. If you are adventurous, you can grab "DIY" or "assembled" monitors with the panels on Chinese e-Commerce sites.


Isn't there a chance that the assembled Chinese monitors actually use second grade panels that the big makers wouldn't accept?

I remember getting a 27-inch 1440p display from a Chinese manufacturer for really cheap back in high school. It should've been the exact same panel as was in Apple's iMacs. However, there were some quality issues with it long term and it's definitely suffering from burn-in that I don't think the iMacs suffer from.


FWIW Planar makes 27" monitors that use the same 5K and 2K panels used in the iMacs, down to the bonded glass surface. The Planar IX2790 and PXL2790MW respectively. I have the PXL2790MW and if you look closely you can see the glass peephole for the nonexistent iSight Camera. Not sure if it's B grade panels that Apple rejected but it's flawless, maybe I just got lucky.


It is hit or miss. I own two; one is flawless (apart from the retention, which is apparently the norm in LG displays). The other I have exchanged twice and still have issues with many, many dead pixels. So in my case 3/4 were bad apples.


I have a couple of these from 2014-15, and they are very, very nice (and as a plus, they matched the dpi of some macbook models at the time, at least). One surprise: they were very heavy compared to other, similarly-specced monitors.


The only issue I had was coil whine coming from a choke on the power supply inverter board. I resolved this by cracking open the monitor and encasing the choke in two part epoxy.


That was the case with noname Korean/Chinese monitors a decade ago that used high quality IPS panels found in Apple and other professional displays - they used rejected panels, which had various issues (mostly dead pixels afaik).

https://techreport.com/review/23291/those-27-inch-ips-displa...

But overall, they were a great purchase quality/performance/cost wise.


Yep! Still using the "Auria" monitor that I purchased at Microcenter back around 2012. Cost maybe $300-350 for a 2560x1440 IPS monitor at a time when you were easily looking at $500-800+ for a similar panel from a name brand.

Now, if you were a professional, that quality control and warranty (not to mention better ergonomics, etc.) were easily worth the added cost, but for just "some dude who liked playing video games and doing some photo/video editing", it was a great bang for the buck.

I still use this as my main monitor and haven't noticed any dead pixels (if there are any, they're so hard to see that they may as well not be there). It's not the best monitor out there and you can probably get a better 2560x1440 display for less now, but at the time it was a big improvement over the cheap 1920x1080 display that quickly got demoted to secondary (and has now been loaned indefinitely to a teacher friend who needed a second monitor to plug into her laptop for online classes).


Heyo another Auria user here! I actually recently upgraded to a Dell 4k screen, but that Auria served me great for several years and is my secondary setup screen. Got it used for $150, amazing value there!


That brings back memories :). I ended up buying one of these and apart from some weird quirks (only wanted to work over DVI and not with an HDMI-DVI dongle), the image quality was great and so cheap (for the time).


I'm reading this on an IPS panel I bought a decade ago from South Korea. Works great, but with a bit of light bleed in the top left hand corner. I paid extra to have one without dead pixels.


Bought mine on Amazon for $400 six years ago. It stopped showing a picture but I get a white flicker at the base of the screen every few seconds.

I could (and probably should) investigate fixing it but it was easier to buy a 2160p Philips for $240. Only issue with the Philips is it doesn't have a VESA mount and it would be difficult to make some sort of jury-rigging work.

I run them attached to a Mac Mini and use the DisplayPort on the monitor. At one point I believe HDMI (or maybe just the Mac) wouldn't do 1440p. I'm copying stuff from an Intel Mac mini to an M1 and I'm able to toggle back and forth using HDMI for the Intel just fine.


Yeah, a cheap 1440p as a student was certainly a great thing when they were still rather expensive, even though the base was wobbly as all hell and it later developed some issues.


Yes, so I often ask if the seller can provide a "perfect" display, that is, without any artifacts on the display. This adds 100-200 CNY to the price.

There's a Chinese panel manufacturer called BOE that makes products competitive with some of the lower-end Samsung / LG panels.

I got one 15.6" 2160p external display with a BOE panel that offers 100% sRGB coverage. I can see a huge difference compared to my Dell Latitude laptop display.

Now if anyone can find a source of 55" 4K OLED panels, that would be the one ultimate display. Combine it with a VBO driver board and it becomes better than any smart TVs.


>I can see a huge difference compared to my Dell Latitude laptop display.

And outside of a few occupations that might actually require pixel-perfect colour, what does this matter? Is this like the audiophile world, where people argue about seemingly subjective things that no one else cares about?

The customer interprets colours differently than you, the customer sees colours differently than you, and the customer is using a monitor that almost assuredly displays the colours differently than yours. And the world continues to turn.


I don't think the audiophile comparison really makes sense here (and I like to mock audiophiles more than most) simply because display technology still has a long way to go before it reaches the level of audio when it comes to "bang for your buck".

CD quality audio is less than 1 megabit per second per channel, uncompressed. HDR (10 bits per component) 4K60fps 4:2:2 video is around 10Gbit per second of data.

Of course data bandwidth is only a small part of the problem of correctly reproducing an analog signal, but it gives you the orders of magnitude we're dealing with.
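
Roughly where those two numbers come from (a sketch; 10-bit 4:2:2 averages about 20 bits per pixel):

    cd_audio = 44_100 * 16               # bit/s per channel, uncompressed CD audio
    video = 3840 * 2160 * 60 * 20        # bit/s, uncompressed 4K60 10-bit 4:2:2

    print(cd_audio / 1e6)                # ~0.71 Mbit/s
    print(video / 1e9)                   # ~9.95 Gbit/s -- about four orders of magnitude more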

I currently use a cheap ASUS 4K display. It's more than good enough for coding, but I wouldn't trust it for any sort of graphical work. The viewing angle is pretty bad, so depending on what part of the screen I'm looking at I see colors differently, and some gradients become more or less visible depending on which part of the screen they're on. Contrast is pretty bad too, making even some video games display poorly: depending on the location and time of day, the contrast always seems too high or too low.

You can buy a good sub-$100 pair of earphones and a sub-$50 DAC and they'll be good enough to do 99% of any audiophile work you could ever want to do reliably. If you want to do serious graphics work without having to constantly adjust for your display you'll have to go for something a lot more expensive than an entry-level monitor.


These differences are very clearly noticeable. I upgraded many years ago from a 72% sRGB to a 99% sRGB Dell IPS and everything looked much better. I just got the LG 27GN950 which is 95% DCI P3... I was mainly getting it for the 4k/144 with the P3 as a nice bonus (I already had 4k/60 on the Dell). Looking at the Dell, I was thinking that P3 might be nice to have but it wouldn't really matter much aside from photo editing - the colors on the Dell already looked great.

I just unboxed the new monitor 2 days ago. The richer color was immediately noticeable, and when I looked at some random photos I took with my phone recently I was blown away by just how red and green and yellow/blue things were. Like a completely new realm of color.

It's one of those things that you can't appreciate until you experience it (same going from the original 72% to 99% sRGB).

The Dell was $450 for 4k, 2.5 years ago. The new LG was $800, but you can find 60fps P3 4k monitors for around $500 these days iirc. If you're on Hacker News you probably use your computer a lot. Unless you're running low on cash, upgrading to a great monitor is worth it.


Seconded. I have 2 LG 27GN950-B's on my desk, and love the 27" 4K HDR @ 144Hz experience (at least on Catalina. Big Sur has completely broken DSC and will only do HDR @ 60, non-HDR @ 95).

I love them for my photo editing.


I can't get 144hz at 4k. Any specific cable you're using?



Monitor image quality is quite a bit more objective than what audiophiles look for in high end audio equipment. sRGB defines a specific physical color that ought to be displayed for each RGB sequence. If you can get a very accurate display for <$1000 just by doing a bit of research, why wouldn't you?


A similarity to the audiophile world is that better quality doesn't always improve quality of life for an ordinary consumer (as opposed to a designer or similar). Sometimes I think I'd be happier if I were satisfied with $100 headphones or a cheap laptop display.


> And outside of a few occupations that might actually require pixel-perfect colour, what does this matter? Is this like the audiophile world, where people argue about seemingly subjective things that no one else cares about?

I'm a color blind person and even I can see a color difference between cheap displays that I have at work and an old EIZO one that I bought years ago at home.

I can more accurately differentiate between different colors/shades on my EIZO panel.


I enjoy having a high quality display for all kinds of reasons. Better comfort while programming, accurate colour representation while looking at photos, having a good sense of what things might look like for others (accurate colour means you might be the middle ground of your users experiences, inaccurate colour means you can’t be sure at all), and otherwise, if I’m going to spend a lot on something I’ll own for half a decade I would prefer to get something accurate. The price difference isn’t big enough to justify saving a little bit only to have a poor colour experience.


> accurate colour means you might be the middle ground of your users experiences, inaccurate colour means you can’t be sure at all

Having a setup with multiple cheap monitors is imho really underrated for design and development. Moving something between screens and seeing clear contrast disappear, or see pleasing color choices turn ugly can be eye opening.


Agreed! Back when I was in music school, they brought in Tony Bongiovi[0], a well-known record producer at the time. He talked about how the ultimate test of any recording was to copy it to a cassette, take it out to the engineer's Camaro with one broken speaker and see how it sounded there. If it sounded good there, it would sound great anywhere else.

[0]https://en.wikipedia.org/wiki/Tony_Bongiovi


Agree with your general stance. My current monitors are from 2007 and have endured many hours of use and a capacitor replacement.

I am considering my next purchase on the basis of at least 5 years of service and that is a long time to be looking at something "not quite right".


> There's a Chinese panel manufacturer called BOE

...which bought Korean manufacturer Hydis (was originally part of SK Hynix), so you'll often see panels marked "BOE-Hydis".


Actually iMacs and their display counterparts in the LG ultrafine series are known to suffer from burn-in.

Google iMac or LG ultrafine “image retention” or “ghosting.” I have no idea what percent of displays are affected, but there’s enough threads about it on Reddit and macrumors to make me think it’s pretty common.


For monitors it is more than that. For the pretty expensive 144Hz/1440p/G-Sync category that I researched a couple of years ago there were three options: Acer, Asus and ViewSonic (and an unavailable AOC). It turned out that Asus, despite being a “better, much more money” brand, did a worse job of mounting the panel, so it had statistically worse backlight bleeding at one edge.


What's wrong with Asus installing alarms in their monitors? I learned the hard way, being woken up in the middle of the night by a loud siren whose source I couldn't locate, as I wouldn't have expected it to come from a frikin monitor! There is no way to turn it off apart from physically powering the monitor off, and it happens totally at random.


Oh, that’s amazing. I got curious and found this comment on youtube, sharing in case it may help or diagnose:

Battle Angel: "Sorry, not sure why I didn't share previously. So, I believe this is caused by using a non-HDMI cable with the audio out turned on. Either turn the audio in the display off completely in the settings, or use a new HDMI cable. The alarm is a result of the display trying to send an audio signal through a DisplayPort cable. Are those of you getting this alarm using DisplayPort cables? They do not pair audio with video, as HDMI does. I hope this fixes your problem."

https://m.youtube.com/watch?v=1i2dB8mGuKM


This is so great! I've been looking for a solution to no avail but I have not seen that one. Yes I use DP cable. I'll try that. Thank you so much!


It took me a while to find something matching those specs too. I got a couple of these when the price dropped below $300 and they're great. I had to watch a few PC-building deal subreddits to catch the deal.

https://www.amazon.com/LG-27GL83A-B-Ultragear-Compatible-Mon...


Wow I'll never shop for monitors the same again.


Yeah, if you look into it, you'll find most monitors using the same panels from LG, AUO, Samsung or ChiMei, with some outliers.

When it comes to assembled monitors, the highest failure rate is in the power supply. The components used and the cooling/ventilation play a big part in that.


Isn't it the case that monitors or laptops under the same model sometimes use different panels?


Yes. Considering most people only care about the resolution, manufacturers sometimes substitute a lower-cost panel that is inferior in, say, gamut or response time.


This sounds compelling. I'd love to get my hands on the LG Ultrafine 5k panel in a cheaper case and just bring my own thunderbolt dock.

What sites are you finding these "assembled" monitors on?


...and if you're really adventurous, you can buy just the bare panel, get the backlight inverter and "scaler board" elsewhere, and build your own custom monitor. The "3663" seems to be a common model of scaler.


Have you done this / do you have pointers to someone who has?


Yep, that's what I did. The panel is the actual product. The housing is just stuff around the panel.


Does anything else have that LG 5K panel? :D


> The lazy conglomerates who sell these peripherals often don’t actually produce the parts in them

> Combine this with the fact that most home users don’t care about good quality or even know what it is, and you have the current situation.

It sounds apt. But... There is an absolutely thriving market for keyboards and mice.

For both, conglomerates like logitech and microsoft are selling both what you describe as 'crap', as well as higher end stuff that tries to care about quality. Possibly not in the way you think is most important, but certainly a Logitech MX Master3 keyboard retailing at >$100 is not a cheap piece of crap. The letters aren't inked on, for example, thus ensuring they don't rub off particularly easily. Not a feature that is advertised or is likely to show up in a review. The kind of quality move that doesn't make sense if the market is just 'assholes selling crap to idiots'.

Keyboards are even more interesting; a lively indie market for custom-built usually mechanical keyboards, supported by parts manufacturers where price isn't particularly important.

I agree, though - the webcam market is quite a mess. So, what's the explanation for that? Why do keyboards and mice not fall under your 'assholes selling crap to idiots' rule?


Keyboards are MUCH easier to produce than LCD displays or cameras.

The problem is that there's only so many companies that truly design and manufacture photographic sensors, lenses and LCD panels. And all the downstream "brands" that assemble this technology into cheap plastic cases to sell to consumers can only do so much to differentiate. Add neon lights for gamers. Make it look dull for business users. Etc. They also sell TVs and have thousands of other SKUs, so they really don't care about any individual product.

Apple is both incentivized (due to their ongoing customer relationships) and able to break out of this mold because they:

a) Produce a tighter number of SKUs

and

b) Do enough volume to control and change what the original equipment manufacturers are producing


I think it’s more of a “the market won’t pay for quality” problem. People won’t pay thousands of euros for a good monitor, so manufacturers have to slap together the parts available in bulk in order to reach price points people will pay. The LG 5K is a good example, because it is clearly compromised to reach a somewhat reasonable price point. From what I can tell the monitor market mostly exists to cater to the generic business monitor and pc gamer markets anyway, as those are the only parts still selling in volume.

Although I have to admit that I was equally frustrated when I wanted a good retina screen with 200-ish dpi to pair up with the mac mini I wanted to buy, only to conclude getting the 5K iMac instead was the most sensible option.


I don’t think that’s it.

Apple is the most valuable company on earth right now, entirely due to their thesis that people will pay for quality hardware.

The “creator” market is much more profitable than the gamer market where kids only have as much as mom will allow them to spend (vs The tech workers, coders, designers, youtubers, etc that need high quality displays to make a living).

It’s why Apple is able to get insane 50% margins in many products. It’s crazy to me that the big Asian manufacturers don’t see the market opportunity in catering to this crowd.

In their minds you’re either an office drone using excel or a gamer who wants neon lights. Both of which are market segments with terrible margins.


> The “creator” market is much more profitable than the gamer market where kids only have as much as mom will allow them to spend (vs The tech workers, coders, designers, youtubers, etc that need high quality displays to make a living).

This is a common misunderstanding of the gaming market due to stereotyping. The biggest age category in gaming is 18-34 by far. They also generally slant strongly toward people with both more than average disposable income and a higher likelihood of spending it on gaming and related electronic toys. This makes gaming a 100 billion dollar market at the moment, and it is still growing rapidly.

> Both of which are market segments with terrible margins.

Not even close to accurate. Gaming related hardware is generally quite high margin. There's a reason ASUS et al. use their gaming imprints as the place to introduce new high end parts. It's also a highly concentrated and networked market, making it very efficient to advertise to.


Fully agreed. It's impressive how "gaming" brands upsell their products with RGB.


There are tons of expensive monitors available that cater to the professional market. The whole premise is not based in reality. Apple simply has the highest mindshare among average products.


There are literally no good monitors with the proper resolution for macOS outside of the fragile LG UltraFine line.

It doesn’t matter how much money you throw at the problem, nobody is making 27” 5k displays right now.


What's extremely frustrating is that Apple makes an excellent 27" 5K monitor in a nice housing for $1800. The only problem is that it comes with an iMac...

So clearly, Apple could sell a monitor for about $1600, that would be perfectly compatible with Mac Pros, mac mini and as a secondary monitor for all the various MacBooks.


iMac is due for a refresh soon, and I'm hoping that this means an M1 iMac that can serve as a Thunderbolt Display


I genuinely don’t understand why they stopped doing that...


Maybe it's related to the reason why you can't use the 5k iMac as an external monitor. I can't remember where I found the sources, but the problem was that they essentially needed two video cards to drive the thing, and making sure that both sides of the output looked identical was tricky and not something that an external source would be able to do.


IIRC it was an issue with the video connection. Nothing at the time could provide enough bandwidth to support 5K, so Apple had to cobble something together. With USB-C and TB3, that's no longer an issue.
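
A quick sketch of the bandwidth math behind that (the DisplayPort 1.2 payload figure after 8b/10b encoding is my assumption, and blanking overhead is ignored):

    five_k_60 = 5120 * 2880 * 60 * 24    # bit/s of raw pixel data at 8 bits per channel
    print(five_k_60 / 1e9)               # ~21.2 Gbit/s

    dp_1_2 = 4 * 5.4e9 * 0.8             # 4 lanes x 5.4 Gbit/s, minus 8b/10b overhead
    print(dp_1_2 / 1e9)                  # ~17.3 Gbit/s -- short of 5K60, hence the
                                         # dual-stream workarounds until DP 1.3 / TB3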


What does "proper resolution for MacOS" mean in this case? There are tons of 27" 4k that work fine in macOS in Retina mode and matches the medium tier iMac (their low end today is still 21.5" 1080p). Unless you declare everything below 5k subpar, I don't see where you're coming from.


This article from 2016 explains the problem: https://bjango.com/articles/macexternaldisplays/

Basically, the ideal PPI of mac displays is a multiple of 110 PPI. So, for retina quality you need a display of roughly 220 PPI, which is what you get from 5K at 27 inch. A 27 inch 4K display is around 160 PPI. If you use that in 2x mode, things will appear too large. If you set it to scaled mode to make things appear the proper size, there are display artifacts (like shimmering when scrolling). In fairness, it's not super obvious unless you know what to look for. But if you're already spending money on a high end screen, why should you have to compromise?
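
The PPI figures in that article are easy to sanity-check yourself (a minimal sketch of the arithmetic):

    from math import hypot

    def ppi(width_px, height_px, diagonal_inches):
        # pixels along the diagonal divided by the diagonal length in inches
        return hypot(width_px, height_px) / diagonal_inches

    print(ppi(5120, 2880, 27))   # ~218 PPI -> a clean 2x of the ~110 PPI target
    print(ppi(3840, 2160, 27))   # ~163 PPI -> stuck between 1x and 2x
    print(ppi(2560, 1440, 27))   # ~109 PPI -> fine at 1x, but not Retina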


That seems like pretty outdated information, given that the out-of-the-box default Retina scaling on MacBooks has been non-2x fractional since 2016 (1440x900 for the 13-inch's 2560×1600, 1680×1050 for the 15-inch's 2880x1800).


I don't have the exact numbers in front of me at the moment, but a 27" 4K monitor will not match the pixel pitch of every other Mac -- screen elements appear larger when both are set at the same scaled resolution.


Yeah, to match the dot pitch apple is designing for, the 4K monitor would have to be more like 22 inches instead of 27. We know this since the 4K iMac is a 21.5 inch screen.


I never thought I'd want a 16:10 monitor until I had to deal with all of my apps getting resized every time I unplug my MBP


I think "gamer with neon lights" is certainly a segment with amazing margins, especially compared to standard office equipment. Most "gaming" mice/keyboards/chairs/computers/whatever are just decorated and brightly coloured versions of other products with insane markups. Alienware PC's are a great example here - The parts in one of those PC's can cost around half the cost of what the company actually sells it for. Other peripherals have similar price increases once they're branded as a device for gaming rather than office use as they know consumers are willing to pay more. I realize that there are lots of kids who want "gaming" gear (that their parents will pay for) but the PC gaming market is certainly geared towards mid 20's/30's who have the scratch to be able to afford this stuff. Not that these people are stupid or misinformed for doing so, some simply appreciate the aesthetic (even if it's something you or me might not particularly like).


I don't know very many young PC gamers. Is the market for good gaming hardware really that small?


Another thing I find really annoying is when I browse a website and first have to choose a product line when I don't even know what the difference between the product lines are.


Right? Like "is this for home use, office use, or gaming?"

I guess I understand where they're coming from, when most potential customers would likely glaze over and click away if presented with a long list of specs and product numbers.

Still, it's always nice when they at least have a "show all products" link that takes me to exactly that. I want a full list that can be narrowed down with filters.


Why do you think the LG monitors outside of the Apple collaboration are not good?

Recently, I bought two 27GN950, which is a 27" 4k@144Hz gaming monitor with good colors. So far the worst part is the fan and in the long run the absence of HDMI 2.1 might be disappointing, but overall I have the impression of a good product.

Yes, it doesn't have the same PPI as smartphones, but I am not sure if we are going to see that happen ever.


I think it's in part related to fractional scaling, which hasn't been sorted everywhere. Text at 4k on a 27 inch is too small to run at 100%, and too much of a waste to run at 200% (equivalent to 1920x1080). So you're running 150% or 175%, and that can be an issue if you're running something that doesn't like fractional scaling.

27 inch is perfect for 1440p 100% or 200% (i.e., 5k), but it seems like no one other than Apple has that figured out.


Yes, fractional scaling is an issue, but I don't think it is as much of an issue as it was a few years ago. In fact, I think the only application that doesn't scale for me is steam currently. Everything else seems to be handled by setting the correct DPI in the xorg.conf and the scaling factor of KDE (150%).

But as this is clearly a software issue, I wouldn't blame the hardware for it ;-)


Since Windows these days handles such scaling just fine, this doesn't seem to be a big deal for consumers.


The monitor has a fan? That would be a huge negative to me


Like Apple's monitor that also has a fan?


But Apple's fans are made from recycled SR-71 Blackbirds to ensure ultimate noise suppression and each fan blade is assembled by a Swiss watchmaker to ensure quality.

Just kidding, it comes from the same Chinese factory as every other fan, but Apple's fans (pun intended) like to believe in magic to justify the price tag.


Yes, I wonder too why it is necessary, especially because my PC is completely fanless :-/

However, over the day it is barely audible and it only comes to mind when I am sitting in front of the PC late at night (+ without headphones).


I think the fan is for the G-Sync module!


Hmm... There might be some truth in that.

I've noticed over decades that high quality stuff comes out of the checks-and-balances of experts specifying and purchasing stuff.

Examples I remember are Sun monitors based on Sony Trinitron tubes, and Sun/SGI hard drives that were always checked - and sometimes returned by the container - so were actually enterprise grade, not consumer grade. Lots and lots of OEM stuff like that.


Early in my career I learned an important lesson: there is no point buying displays from brands other than NEC or EIZO, preferably their upper tier products. The exceptions to this rule were the Apple Cinema Display and some Dell models. EIZO FlexScans are reliable and rarely have any issues. https://www.eizoglobal.com/products/flexscan/index.html


Unfortunately EIZO doesn't produce a single display with the ideal resolution for MacOS.

27" 5k or 22" 4k (4069 x 2304, like the first LG ultrafine was) are the unicorns I am seeking.

Unfortunately the LG ultrafine suffers from image retention/ghosting. So there's ultimately no great displays for Mac outside of the wildly expensive Pro Display XDR.


Why are those "the ideal resolution for mac os", exactly?


For reference, the 16” MacBook Pro has a 3072‑by‑1920 display.

This means the 27” 4K monitors that are the industry standard now for some reason at 3,840 x 2,160 are almost twice the size, yet have barely more resolution than the MacBook true retina screen.

MacOS can do scaling to adjust for this, but it uses the least amount of resources in native or pixel doubled mode. Any display that is between the resolutions I mentioned (like 27 4K) requires fractional scaling.

This is more resource intensive, and doesn’t look as good as pure retina.

Here’s a better explanation: https://bjango.com/articles/macexternaldisplays/


Sorry, but this makes no sense at all. For 20 years now I have worked only with Apple-based desktops/laptops. And I am a pixel peeper. Scaling is not a problem even on a MacBook Pro from 2013. If you want to rationalise a purchase of a new Apple Pro Display XDR there are more factual reasons for this. Don't get me wrong - the new displays are very competitive for mid-market grading, but most professionals are using separate proofing displays for testing.

I am talking about a good quality display at a bearable price, not a display for $6,299.00. Example: https://www.amazon.de/EV3285-BK-Monitor-DisplayPort-Reaktion...


I upgraded from a MacBook Pro with a good quality LG 4K 27” display using non-integer scaling to a 5k 27” iMac with 2x scaling. Both provide the same visible screen area and give you the same size icons and text. But the iMac with integer scaling is a better, sharper picture. The difference isn’t huge but it’s noticeable.


That seems a bit drastic but I agree with the sentiment and would also include laptops. Maybe I'm wrong since I didn't check all hardware, but Apple still seems to be the only one with rigid quality controls, making sure parallel parts are parallel and mechanical parts withstand a reasonable number of movements. (At least as long as they are not testing a new "innovation" like butterfly keyboards... ;)) That said, I buy stuff used if I cannot buy the high quality version. It falls apart anyway; this way there's no reason to be upset and it's better for the environment.


If you want to go bargain basement yes. If not, get an XPS or a Thinkpad (of the 'pro' series) and you'll have great hardware. Of course there will always be someone who finds something to complain about, but overall, these are fantastic machines.

Thing is, people complain about there not being high quality gear, but when someone then makes it, they balk at the price. Yes people, a great laptop will cost you $3000.


I have had success with focusing on the Lenovo/Thinkpad T range. The X range seems also to be ok.


My organization has been slowly rolling out the use of Cisco Desk Pros which are hardware endpoints that connect to Webex but are in a practical sense essentially monitors with very nice camera modules and microphone arrays built into the bezel. Laptops connected to the monitor with USB C can use the camera/mic. These cost like $4000 though.

I find the video flattering (camera angle/focal length?)


You certainly can buy good high end webcams. But their market profile is clearly still B2B and telepresence, not prosumer or individual professionals yet.

Cisco had some impressive cameras in their older Telepresence products: https://www.cisco.com/c/en/us/support/collaboration-endpoint...

But the Cisco TelePresence Precision 60 Camera CTS-P60-K9 is Ethernet-only and needs prohibitively expensive hardware to work with.

https://www.cisco.com/c/en/us/products/collateral/collaborat...

Next up is probably the Logitech PTZ Pro 2 or Rally

https://www.logitech.com/en-roeu/product/rally-ultra-hd-ptz-...


Dell sells good monitors.


Yeah, no. I have a 25" UltraSharp from them and it has a "fun" bug where it advertises to the system that it refreshes at 59.95 Hz but in fact refreshes at exactly 60 Hz, which leads to the monitor (!!!) freezing for a frame every 20 seconds or so. It's absolutely infuriating in games and movies, and I only found out how to fix it by modifying Windows drivers and forcing it to refresh at a solid 60 Hz despite what the monitor advertises. But of course you can't do that with something like a PS4 connected to it. Would never buy another Dell, thanks.
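
For what it's worth, the roughly 20-second period falls straight out of that 0.05 Hz mismatch (a quick sketch):

    advertised = 59.95                 # Hz, what the monitor reports to the system
    actual = 60.0                      # Hz, what it really refreshes at

    drift = abs(actual - advertised)   # frames of drift accumulated per second
    print(1 / drift)                   # 20.0 -> one skipped/duplicated frame every ~20 s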


I fell into this trap. Now I own a very bad Dell monitor. :(


Dell sells some good monitors.

Unfortunately, you must select carefully any monitor that you purchase, and the cheapest models are unlikely to be good choices.

I am using 2 good Dell 4k monitors. One is 1 year old (U2720Q), but the other (UP2414Q) is more than 5 years old and it works as well as on the first day.


I don't buy a lot of computers, so this can change at any time without me noticing, but: I think there's basically two Dells. There's the Dell that sells the cheapest equipment you can buy. This Dell sucks as much as anyone else. Don't expect miracles. Then there's the Dell that sells upscale gear. This is usually pretty good, or at least has the ability to be pretty good. I have appreciated the ready access to service manuals and such, too.

I say this because it's unwise to hear that Dell has pretty good gear, then go to their site and buy the cheap stuff. It isn't necessarily any worse cheap stuff than anybody else, but it's not what people mean when they say Dell can have pretty good gear.


Purchased U2720Q after reading through a number of rave reviews and can definitely vouch for it. Excellent 4K image quality and works well with MacBook. It's only the monitor though, so no webcam or audio speakers.


I wish my Dell monitor didn't have speakers. macOS doesn't provide any way to hide or disable connected speakers.


Me too but with Samsung. I bought a cheapish 34-inch curved monitor that had great reviews, but it has almost as much light coming through gaps in the rear housing as the screen. Text looks like crap on its "not quite retina" resolution, especially when it's next to a retina macbook pro, although videos look very nice. I really do wish Apple would make a monitor more reasonably priced than the Pro Display XDR.


Sadly the implementation of USB-C on models like the U3419W isn’t to standard and causes known issues when used with Apple laptops.


I’m guessing this is why the cursor seems to visibly ghost when hooked up to my MBP.


I've not seen that, the bug I'm referring to is to do with putting the mac to sleep when it's connected to the screen.


Oh, I think I’ve had that: it’s a total crapshoot as to whether it wakes? I thought it was a USB power problem for the longest time.


You can check the refresh rate as follows. Open System Preferences, then click on Displays. There's a radio button, titled "Scaled". Option-click that radio button, and you'll see a pull-down menu, titled "Refresh Rate". It should be 60 Hz or higher.


Sounds like for some reason the monitor is running at 30Hz. You can confirm that using an app like EasyRes.


System Preferences can also show this, check my other comment in this thread.


Dell has very nice adjustable stands, but the panels are a mixed bag. E.g. the P2416D (1440p) is fine, but the P2415Q (4K) has quite bad ghosting. So annoying I had to disable browser smooth scrolling.


Owner of P2415Q, can't say that ghosting on mine is extraordinary.

You have to choose between ghosting and proper colours; higher refresh rate IPS panels are better in this metric but still suck compared to TNs.


I was surprised to find out the UltraSharp series includes monitors with 6-bit IPS panels. Rather noticeable, even in desktop use. Before that I often told people to "just buy an UltraSharp of a size you like".


Dell used to have good offerings, but all they seem to push now is the same 27” not-quite-4K 3,840 x 2,160 panels everybody else does. Now even the 22" LG UltraFine that used to be 4096 x 2304 is bigger at 24" and a worse 3,840 x 2,160. The only good option for Mac is the 27" UltraFine 5k.

27" 4k is a bad size & resolution for the current computer market. Windows scaling looks like crap, and macOS has to do more resource intensive 1.5x scaling (as opposed to native or pixel doubling mode) to look okay on these.

M1 might make this a moot point going forward, but the fact is at 27 inches, 5k is the only resolution that will look as good as the screen on your laptop while actually giving you more real estate.


> not-quite-4K 3,840 x 2,160

But...that is 4K. It's what 4K is defined as, exactly 2x 1080p resolution in each dimension.

> Windows scaling looks like crap

I don't understand. 2x each dimension (so 1 pixel in the old resolution is 4 in the new) is, like, the easiest possible scenario when it comes to scaling in software.


While DCI 4K is a standard with 4,096 pixels of width, you’re correct that the HD standard (and therefore what is relevant to the discussion here) has always been UHD 4K and 3840 pixels wide.

DCI is relevant for movie industry professionals only, as these are the dimensions used for projection devices and (potentially) their content.


That's UHD. 4K is defined as 4096px wide. But I guess the mass adoption of the term 4K in consumer electronics changed the definition.


That's DCI 4K.

Two groups have competing definitions. One isn't inherently correct.

I say this as someone who was "that guy" when it came to HD Radio: "It's not High Definition, it's Hybrid Digital!" even though that's exactly the confusion they were trying to encourage.

Arguing that this is misleading is a fool's errand, and only plays into things if you assume that the primary purpose of a "4K" screen somehow is inherently "to play back cinematic content", which... it's not.


4k on 27 inch requires you to do 175%; 200% is too big.


To explain further:

200% (2 times in each direction) scaling on 4K is the equivalent of 1080p. A 1080p 27 inch monitor has huuuge pixels for the normal viewing distance of a desktop monitor; 1080p is common on 23-24 inch displays. Therefore you are forced to use fractional scaling, which is less than perfect.
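
Put in numbers (a small sketch of the "looks like" workspace you get at integer scaling):

    def looks_like(width_px, height_px, scale):
        # integer HiDPI scaling: each UI point is scale x scale physical pixels
        return width_px // scale, height_px // scale

    print(looks_like(3840, 2160, 2))   # (1920, 1080) -- huge UI elements on a 27" panel
    print(looks_like(5120, 2880, 2))   # (2560, 1440) -- the workspace people expect at 27"
    # 1920x1080 is only ~56% of the area of 2560x1440, hence fractional scaling on 27" 4K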


>But...that is 4K. It's what 4K is defined as, exactly 2x 1080p resolution in each dimension.

That's irrelevant though, except if we're talking about consuming movies fullscreen.

For a monitor I don't want 4K, I want invisible pixels at viewing distances, so hi-dpi.

I would also prefer no scaling for assets that are bitmap in nature. This ideally means pixel doubling (less cpu/gpu demanding and less fuzzy than fractional scaling).

This, for 27" and more, means higher resolution that 4K. I don't want to restrict myself to pixel-doubled 1920x1080 on my 27" or 32" monitor.

You do get nice DPI, but needlessly large buttons and other assets (compared to something closer to 5K).


I just completely don't understand your point. There's no misleading advertising here - the resolution is exactly as promised, at the size promised... what's the problem? If the resolution isn't high enough for you... then buy one where it is? There are 5K monitors out there, maybe even 8K? Or just get a 4K one but in a smaller size?

I'm so confused by your comment.


>I just completely don't understand your point. There's no misleading advertising here - the resolution is exactly as promised, at the size promised... what's the problem?

That would be relevant if my problem was false promises or misleading advertising.

But my problem is not

(a) "Monitors say they are 4K and they are not"

but:

(b) "Most monitors out there are BS-4K, but for the best quality/viewing comfort at their 27" and above diagonal they should rather be 5K, but most manufactures like Dell aren't bothered to produce at such a resolution and the few that do have prices to the skies".

>There are 5K monitors out there, maybe even 8K? Or just get a 4K one but in a smaller size?

Perhaps you've skipped through the thread?

My comment responds to (and agrees with) the sub-thread started by a parent commenter writing:

"Dell used to have good offerings, but all they seem to push now is the same 27” not-quite-4K 3,840 x 2,160 panels everybody else does.".


For me it's hard to believe that 4K on 27" is not enough. I use a 1440p 27" 144Hz display as my daily driver and barely see any pixels (usually with badly hinted fonts, and even then not pixels but uneven letterforms), because I sit around one meter away from it; sitting closer makes me turn my head around too much, except when watching movies.


>For me it's hard to believe that 4K on 27" is not enough. I use a 1440p 27" 144Hz display as my daily driver and barely see any pixels

It's not just about "not seeing any pixels", and "barely see any pixels" is not the same as enjoying hi-res typography and fine detail.

A 27-inch 1440p monitor is about 108 ppi. That's hardly better than what we used in the 90s and 00s, dpi-wise. Sure, if you're not used to hi-dpi it looks OK. But try using a 5K/27-inch monitor for a while and then go back to 1440p/27-inch to see the difference you're missing.

Now, 4K hi-dpi (pixel doubled) on 27" is 1920x1080.

This makes the pixels fine and the detail great, but everything is too large and you lose screen space, since 1920x1080 is roughly 44% less area than 1440p (which, I presume, you don't use pixel-doubled).

The solution is either 5K/27" (which gives you back 1440p-like screen space and UI control sizes PLUS hi-dpi), or using a non-doubled, fractionally scaled resolution to compensate (which is not optimal, looks fuzzier, and wastes CPU).


I can't agree with the comparison to the 90s by DPI.

What matters for perception is angular resolution, not DPI. A 27" display covers more of your visual field than a 17" from the 90s, so you can and should sit further away from it. Once the angle subtended by a pixel is smaller than the angular resolution of your eye, reducing pixel size only adds to the resolution of shades you can show the user in that area (closer to a bpp increase than a dpi increase, because you can't see the pixels anymore, but you can still perceive irregularities of brightness on edges).
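A rough sketch of the geometry (the distances are just illustrative assumptions):

    import math

    def pixel_arcmin(ppi, distance_in):
        # angle subtended by one pixel, in arcminutes
        pitch = 1.0 / ppi
        return math.degrees(2 * math.atan(pitch / (2 * distance_in))) * 60

    # ~20/20 vision resolves roughly 1 arcminute
    print(round(pixel_arcmin(108, 31.5), 2))  # 27" 1440p at ~80 cm: ~1.0 arcmin, near the limit
    print(round(pixel_arcmin(108, 20.0), 2))  # same panel at ~50 cm: ~1.6 arcmin, pixels become visible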


Yeah, same - 27" 1440p as a daily monitor for work and I have no issues with it. I have had a 27" 4K monitor for a while but it was just too small at 100% scaling, and at 150% scaling some things looked naff. I prefer 1440p at that size.


>I have had a 27" 4K monitor for a while but it was just too small at 100% scaling, and at 150% scaling some things looked naff. I prefer 1440p at that size.

That's what we're saying too. A 27" 4K monitor is too small at 100% scaling, while the working area is too small at 50% scaling (pixel-doubling hi-dpi mode).

That's why the idea is to have a 5K at 50% scaling (so everything is pixel-doubled on each axis, one logical pixel becomes 4 physical pixels, and the detail you see doubles in each direction).


> There are 5K monitors out there, maybe even 8K? Or just get a 4K one but in a smaller size?

The whole point of this thread is people complaining that, outside of LG's fragile Apple collab, there aren't any 5k options widely available.

Go on amazon and search for 5k 27. There's the Apple collab LG UltraFine, and then nothing.

Ditto for 22" 4k, which would provide the same DPI as your laptop screen for that given size.


Wasn't the problem that 5K displays (or maybe it's just this specific one?) are notoriously difficult to make work on Windows? Last time I looked into getting one I found out that it just wouldn't work without getting a Thunderbolt card for my AMD-based system, or a DP 1.4-compatible GPU.

On the other hand, HDMI 2.1 can now support 8K@60hz, so maybe this is not an issue anymore.


> That's irrelevant

Not to the upthread specific claim that the resolution was “not quite 4K”, which is what the comment you are responding to addressed.

On the bigger issue, I don't really see the complaint. I have pretty good vision (corrected—to 20/15 or so—uncorrected is crap but I'm not coding without glasses/contacts) and honestly my 34” ultrawide at 3440x1440 is excellent for coding, and pretty much any other use. Now, would I prefer whatever resolution a 5K 16:9 would be when extended to 21:9? Or better a 4320p at the same aspect ratio? Sure, more pixels are always better. But does the sub-4K display look like crap or force bad sizes for controls? No.


>Sure, more pixels are always better. But does the sub-4K display look like crap or force bad sizes for controls? No.

Sure, I can work with a 3440x1440 34". Heck, I've worked with CGA monitors back in the day, and black and white (!) Sun SPARCstation monitors.

But, as you said, it's about looking better. "Doesn't look like crap" is a pretty low bar, no? For 2020, and after 10 years of hi-dpi phones and laptops, I expected better from monitor companies...


I have an LG 28" 4K and while it definitely isn't as nice as my iMac 27" 5K, it works well enough for coding (I'm primarily concerned about text rendering without visible pixels).


> For a monitor I don't want 4K, I want insivible pixels at viewing distances, so hi-dpi.

That's cool I guess, I was just objecting to calling the 4K "Not quite 4K".


>Windows scaling looks like crap

Huh? I'm using LG's 27 inch 4K and scaling looks good. It can bug out and force you to relaunch an app, but that's not something you encounter often.


I'm using a Dell P2715Q (also 27 inch 4k); it looks fine. But... scaling? The point of having a gigantic 27 inch monitor is that you don't need to scale it. The only problem I do have with the monitor is that it makes me disable scaling on my 15" laptop screen, since there are annoying interactions when you have one screen with scaling active and one without.


> The point of having a gigantic 27 inch monitor is that you don't need to scale it

The point of using a High-DPI display is that you can use scaling without losing the screen real-estate. With 5K @ 27" you can get what looks like 1440p in physical UI element size, but with an increase in clarity, readability, and quality.


27 inch is a small monitor. But I agree, usable real estate comes first, then clarity (by way of scaling). So you want a genuinely large monitor (at least 40 inches) at 8K+.


In my experience, the High-DPI support of Windows 10 is excellent. I am using a 27" 3840x2160 screen set to 150% next to an old 24" 1920x1200 screen at 100%. Pretty much all modern applications seamlessly adapt to the pixel density of the screen they are currently running on without any interpolation.


Take a look at Eizo (pun intended).


* eizo/eizou means "footage" or "image" in Japanese


Didn't know about this company but looking at their history here[1], they used to sell under the brand name Nanao in the US. Nanao made great monitors with consistently high reviews.

[1] https://www.eizo.com/company/information/history/


Recent Siemens MRI scanners come with Eizo displays. I had not heard of them previously, but they do seem pretty nice.


I'd choose an Eizo over LG, definitely.


I think LG makes their own panels but Eizo doesn't? I think it's probably better buying a digital signage display from LG, Samsung or the likes.


I used to have a two-monitor setup with a Dell (? not sure) and an Eizo, both using the exact same panel. I started with one monitor and then got the Eizo. The difference in picture quality and eye comfort was absolutely jaw-dropping. The Dell looked and felt like complete junk in comparison.

Make what you will of this, but it's not just a panel per se, it's also how it's integrated and used in the whole product.


There are a small handful of panel manufacturers, that's true.

Companies like Eizo typically have agreements where they will take the cream-of-the-crop panels though.


It seems like some company there (besides Apple) should seize the opportunity to differentiate themselves on quality, and deliver supply-chain-controlled "boutique" hardware, which I'm certain many would shell out for.


There are already companies who do this, such as Eizo.

They cost multiple thousands (or tens of thousands for reference monitors with built-in calibration and Dolby Vision certification).


I'm thinking the OP was imagining something in the middle ground. Maybe +10–40% for a guarantee of quality. The case I can think of is Anker for cables, although in that case they don't really charge a premium either.


Honestly, in that price range, Dell is probably "good enough"

Grab something like the i1Display Studio to calibrate and it'll be golden for anyone who doesn't need a hardware LUT or built-in calibration.


There’s no money in it and the XDR is a vanity project. Too expensive for the person who isn’t grading Marvel movies but not quite hitting the actual specs needed to grade a Marvel movie.

Meaning the only people buying it are top-tier YouTubers.


Let me chime in: I just bought a T14s. It could be a dream machine, but the shite 1080p panel (Windows recommends a hilariously crappy 1.5 scaling. Kidding me?), the pesky trackpad and the awfully glitchy Windows 10 (yeah, probably the drivers, but the platform enables this horror) destroy the value of this otherwise pretty solid device.


> Windows recommends a hilariously crappy 1.5 scaling.

Windows has multiple GPU-accelerated vector graphics GUI frameworks. Well-written Windows apps look fine with non-integer scaling.

> awfully glitchy Windows 10

Here's what you should do with new computers.

https://www.microsoft.com/en-us/software-download/windows10 Make an installation USB drive, boot from that drive, remove all partitions from the laptop's SSD, perform clean install. You don't need product keys to reinstall Windows as long as the SKU matches (i.e. if you have Win10 home edition, reinstall the same edition).

Don't just blindly click through the wizard; read the messages and you'll get a better UX (you don't want Cortana, personalized ads, geolocation, etc.).

Connect to internet, run windows update.

Open device manager. If some devices are left in "unknown device" state, you might need to manually find their drivers. Make sure to only install drivers and not user-mode utilities.


What's wrong with the trackpad? I was considering a T15...


It glitches, together with the keyboard. It’s like events start piling up as the UI loop locks up (for up to 1 sec.), then suddenly they rush through, the pointer wanders around and keystrokes come through at a constant rate. Other times they lag just barely enough to feel it.

Awful, but I remember similar issues on Dell and an HP. It’s an issue with the device driver or some “value add” driver control software. :(


This is a really well done hot take. It’s true, unfortunately. Everyone is in a race to the bottom.


Went looking for a monitor recently & was so sad to see that there are fewer than a dozen monitors with full-array local dimming, & most of those have 16 zones or fewer.

I ended up going with a budget option, no local dimming. It's frustrating how behind, how stagnant, computer displays are. I don't want to sit in front of a 48 inch OLED TV (too big, not high enough dpi), but I feel like I'm throwing money at bad products trying to buy a computer monitor. At least there are some fair budget options (Pixio PX275h 95Hz 4k, $250, doing ok).


I spent 6 months learning about monitors before buying one for graphic design and the tl;dr version is: you buy from NEC or EIZO. The panel is one thing, but what sets them apart from the others is the electronics inside that drive the panel. And QC: the commodity brands tend to be very hit or miss.


Ahh, the LG monitor from the Apple collaboration. It’s so great. So different from other monitors. Stable, reliable, solid. A great product. You can literally see how Apple forced their product tenets on LG.

Sadly, it’s not available anymore. I have one in my office but needed another one for home. Ended up buying an LG with the funny name of something 27UKi6716263 that looked similar on amazon. It’s so different... what a shame

EDIT: it’s not available in Germany


I thought it was the exact same as the LG 24UD58-B except with Thunderbolt/USB-C connectors instead of DisplayPort and HDMI? That (the non-Thunderbolt model) is the monitor I have; it's the smallest (24") 4K monitor I could find - I wanted high DPI, and I got it.


Unfortunately at 3840 x 2160, it's not ideal since in pixel doubled mode (retina), you're only getting the equivalent of a 1080p display.

The 22" LG UltraFine used to have a 4096 x 2304 resolution. So in pixel-doubling mode you actually got more screen real estate than the newer 24" 4K models (which are only 3840 × 2160)!

The problem is better described here: https://www.theverge.com/2019/7/2/20678597/lg-ultrafine-4k-2...
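To put numbers on it (a quick sketch):

    # pixel-doubled ("Retina") workspace is simply the native resolution halved per axis
    old_22in_ultrafine = (4096 // 2, 2304 // 2)   # (2048, 1152)
    new_24in_4k        = (3840 // 2, 2160 // 2)   # (1920, 1080)
    print(old_22in_ultrafine, new_24in_4k)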


Hmmm. I actually quite like 2160p as a simple upgrade over 1080p; usually I just solve the screen real estate problem by buying more monitors. You can get the 24UD58-B for about $200 used on eBay, so this is not a large cost.


I don't think Apple have dropped it, it's still listed in the UK and here: https://www.apple.com/shop/product/HMUB2LL/A/lg-ultrafine-5k...

I've got one and really like it although it doesn't seem to support HDCP.


Isn’t this the monitor where the first production run wasn’t shielded correctly and interfered with wifi?


Eh, do consumers really want them? Like, not pros who stream for a living etc., but people doing conferencing or the occasional capture?

They will hit local, ISP, or teleconferencing bandwidth caps well before sending all the bits a webcam captures, with subsequent recompression to crap quality.

Why would they bother then?


>The lazy conglomerates who sell these peripherals often don’t actually produce the parts in them.

You think that complete vertical integration will improve product quality?


Apple


Apple is quality in terms of UX and marketing.

The qualities I value however are freedom, security, and privacy.

In that lens, the quality of Apple is very poor.


I’m sorry, but every time I hear variations on “Apple’s greatest quality is marketing”, I know to discard that opinion immediately.


Samsung Odyssey G7 and Samsung Odyssey G9 would like to have a word with you.


If you are looking here to improve your webcam because you started to do remote work, don't.

First try to improve lighting and audio.

Any camera will look bad in bad lighting. It is always better to improve your lighting before you invest in a new camera. Use high-CRI lightbulbs and ensure an even source color temperature in your room (i.e. all lightbulbs should emit the same color temperature).

Audio is also very important. It is processed by different parts of the brain and we don't pay as much attention to it, but the quality will influence the other person subconsciously. Also, audio is how the information actually gets across.


Here's a trick for getting beautiful studio-like lighting for less than 50€:

- buy two large sheets of white styrofoam (1m x 0.5m or bigger)

- buy two cheap bright floodlights (the brighter the better)

- get some mounting materials (eg. wood and screws, or cardboard and duct tape)

- Place the two styrofoam plates right and left behind your laptop.

- point the floodlights at the styrofoam, carefully positioning them so they are not in the picture

Enjoy beautiful bright soft light. Optionally add a 3rd floodlight that shines on you from the side/from behind for highlights, and optionally another one to illuminate the background as needed.

Now you'll look amazing even with the cheapest, crappiest webcam.


I am using two Ikea Skurup lamps (https://www.ikea.com/pl/pl/p/skurup-lampa-stolowa-kinkiet-go...) directly above and in front of my monitors, just outside the webcam's field of view.

Soft lighting is a result of geometry. You can use a very large source far away from you, or a smaller one very close.

You also need to pay attention to how the light is scattered and make sure the direct light is not shining on your face. I asked my wife to make half a dozen covers for them from white fabric. They have a spot in the middle that blocks direct light from the bulb. I stack a number of covers on each lamp depending on how much I want to attenuate the lighting.

I also use a very small, cheap 5-watt high-CRI 6400K lightbulb in each. I use the same kind of bulb from the same manufacturer for the overhead light and for my desk lamp. I have separate desk lamps for 6400K light and 4000K light (which is what I like to use when I work).

The total cost is something on the order of 150 PLN or 40 USD plus a favor to my wife.


4x 5W LED isn't really a lot of light, especially if you are diffusing it. If your webcam can work with that, it's not that bad :)

Edit: I just read your comment again and realized that you are putting the light really close to your face. I guess that's a way to get away with low power lights.


My whole office only has a single 5W LED lamp! I thought that was enough.


I recently installed 5x20W LED lights in my windowless office kitchen for a ficus I got from a friend.

And it's so much nicer now! It's amazing what light can do to a room.

I think most homes should have an order of magnitude more light.


I just measured, the light is about 45cm from my face. Additionally, the lamp reflects most of the light in my direction.

I am not a model and am not used to having such a bright light source so close to my face. I had to attenuate it because otherwise it is just too uncomfortable, but also to match the light level on my face with that of the background.


This is funny advice. Are we still talking about working from home here? Who wants to stare at huge styrofoam panels all day?

A much better alternative: if you have a window, set up your desk so you face towards it. It’s an automatic softbox and you get to look out the window.


If you just want decent video quality for a zoom meeting where everyone shows up in their pyjamas then I agree -- setting up lighting like I described is wasted effort.

But for example, if you are a teacher, or an instructor, or if you have a video meeting with important customers, you may want to put some more effort into presenting yourself. And the best bang for the buck is improving lighting.

Sitting in front of a window is a nice solution for an overcast day, but it doesn't work after sunset (currently around 4PM where I live) or on a sunny day with hard light. Unlike styrofoam boards, it's also hard to control the light because you can't move the window.

I'm not suggesting everyone should do what I said. I was just sharing a possible solution if you want to make good looking video.


> Now you'll look amazing even with the cheapest, crappiest webcam.

I just bought the cheapest, crappiest webcam - a Trust Exis Webcam[1] - and I can assure you that nothing could ever look amazing viewed through that 480p blurry POS no matter what you do.

If you want cheap and crappy, I strongly recommend going for something slightly less cheap and crappy than this one.

[1] https://www.trust.com/en/product/17003-exis-webcam-black-sil...


Why not white bed sheets? I'm sure you could pick up a pack at a mega mart for a few quid. Easier to store, more environmentally friendly and less messy than styrofoam.


Styrofoam is probably easier to mount/angle appropriately for the use and may reflect more light - but the material doesn’t matter a great deal.

If you just want soft light, bounce it off anything you have on hand.


Works fine until it falls off a makeshift stand halfway through the meeting, but I guess that’s an issue with yours truly being a bit lazy/not talented at DIY.


Buying "pro" entry-level photographic lightning equipment is about the same cost and much simpler to set up.


You don't need "pro" equipment for lighting. That's one area when you can get very creative. Most people can take lamps they already have and put some bit of white material on it, which is what I am doing for a very good effect.


If you already have something then great; but if you're buying, it's easier to just get a lighting set and bulbs with a specific color temp than do a lot of trial and error with bed sheets and whatnot.


To be honest I have no idea what photo and video lights cost now.

When I looked into it a couple of years ago a minimum set of two strobes, mounts and a reflector would have cost more than 300€.


I bought a set 7 years ago for $80; it's cheap not just in price, it's basic too, but it works.


Can you recommend a good brand? I tried buying something from Amazon UK but contrary to reviews it turned out to be awful.


Good brands are expensive but I have had some luck with Neewer, a cheap Chinese brand. Stands are usually flimsy, but if you can, you should buy wall-mounting arms for your walls or ceiling; that also preserves floor space.


Godox is very good price-performance-wise. I am not sure under what brand they sell their video lights in the US. I know that their strobes are sold as Flashpoint.


+1 for improving audio quality. As a non-native English speaker, although I listen to daily podcasts without subs just fine, understanding an English-speaking (possibly non-native, too!) person over a poor-quality audio connection is a different story.


People who do podcasts also take care with their pronunciation, and that may be part of the effect.

But yes, in general, I have the same problem. It doesn't matter how well I see you if I can't understand what you are saying.


I wrote an article about this a few months back (1) and post it when it's directly relevant. Main takeaways: audio is more important than video; get that right first.

The easiest thing is using a cheap headset microphone. Advanced noise cancellation algorithms with a great microphone placed far away can't compete with the physics of having a mic close to your mouth.

Since publishing, I also found Webcam Settings, an app for Mac that lets you adjust specific settings on UVC webcams. I use it to correct the horrible auto white balance on my C920 (naturally looks way too blue).

I just wish Apple would make a new version of their iSight camera. Something with a similar physical shape, with a cropped DSLR sensor and a fast lens. Basically the same quality as a mirrorless with a fixed f/2 lens, but in a single small package.

1 - https://jonpurdy.com/2020/03/how-to-improve-your-zoomskype-t...


I have a Logitech C920, bought about a month before the pandemic, and it's great... except sometimes it completely forgets how to focus.


Try installing Logitech G Hub; it should help.


This is the best advice. A high-end mirrorless camera will help a bit with its higher dynamic range, but simply optimizing your lighting will do wonders for any camera setup. All of the people showing beautiful mirrorless webcam shots have already optimized their lighting, so start there first.

Also, don’t be the person with the overly complicated webcam setup who ends up delaying every other meeting while they fiddle with cables, tripods, camera batteries, overheating cameras, cameras going into standby and so on. Having crisp images doesn’t matter if you're always late or frazzled because you had to tend to your perfect webcam setup.


Another cheap tip:

On your side monitor(s), set a couple of desktops with plain white, pink, and beige backgrounds. When you're on video, switch to the blank backgrounds to use as a soft fill light.


It seems I should have included a couple of shots - one from my laptop webcam, and the other from my phone. What counts as "bad lighting" changes very much based on the camera's capabilities (lens, sensor, post-processing).


I disagree somewhat. Coming from a photography background, I'd say we should distinguish quantity and quality.

A lack of quantity of light you can make up for with higher gain (ISO), and higher available dynamic range also helps, as you mentioned.

Quality of light is determined entirely outside the camera, and better gear does not help much if at all. For photography, the quality matters countless times more than quantity, usually. I would argue the same is true for video/webcam.

This is what people don't get, and so hobbyists end up with $5k in gear, taking terrible photos. It's similar to when videographers don't put effort into their sound.

This makes it a question of how and what to light, not how much. The latter would be a matter of adjusting gain in-camera, in post-processing, or simply turning the existing lighting up. But the former part is the important, and difficult, one.


I agree that we should distinguish quantity and quality.

If we don't have enough light in a scene, we need to raise the ISO for a given camera and that can easily make the scene look bad. For another camera with a bigger, less noisy sensor and a brighter lens, there can be enough light and it can look very good. I can imagine that there are cases where increasing the amount of light will only improve the picture for a cheap camera.

If the light is plentiful but bad, a cheap camera can produce abysmal results, with some parts of the image completely washed out white and other parts very dark. A good sensor with high dynamic range will produce much better results. Of course, improving the light will improve both.


I'd agree. I recently picked up a Logitech Brio (employer was paying) and I'm quite happy with it. The main advantage to me is that it can produce a decent picture without me having to faff around with lighting. Of course if I did sort out the lighting, it would be better - but (to me) I'd rather just swap out the camera than assemble more clutter around me.


Obviously if you directly compare good camera vs bad camera the good camera is going to win.

But if you compare a good camera in bad lighting and a bad camera in good lighting, the bad camera will almost always win unless it is really, really shitty.

It is also usually much cheaper to improve lighting. If your face is in shadow while the rest of the room is brightly lit, there is only so much the camera can do, regardless of how much you invest in it.


There's even a big difference between models from the same manufacturer. Apple, for example, puts much better cameras in their iPhones and iPads than in their MacBooks.


Yes and no.

A better camera will be able to better compensate for bad lighting, but that doesn’t make the lighting good, and good lighting will improve the results of both. That’s why pro image makers (photographers and camera crews) spend a lot of time getting the lighting as good as possible.


To be sure, I'm not saying it is worthless to buy a better camera. I'm just saying you may want to improve your lighting first.

Most webcams, including the ones in laptops, can provide a decent quality image if there is an ample amount of good light on the subject.


exactly, audio is the most important. Nobody needs to see your unshaven sleepy quarantine face in 4K. But if they can't hear you, or there is a lot of noise, the meeting is ruined


Actually, nobody will ever see your face in 4K because none of the platforms used for this will ever transmit video at that resolution. Most cap at 1080p with very heavy compression.

Also understand that real-time compression is different from offline compression. Compressing well requires a huge amount of CPU. Real-time compression results in higher bandwidth for the same image quality, or more typically lower quality for the same bandwidth.


Never say never, I'm sure it won't be long before 4k's as commonplace as 1080p (with heavy compression).


Those are great tips. I'll add a free one: just put your desk in front of a window and use that natural light


I kinda agree and disagree at the same time.

It takes a lot of work to beat natural light coming from your window. If you have a restricted budget and can choose when you record, this is a very good option that I always recommend to people I talk to.

On the other hand, when you do remote work you don't choose the time. Also, you will probably have other considerations for your desk placement. In the end it is much more important that you feel comfortable working at your desk than that you have good webcam lighting for your coworkers.

So, if you like to sit facing the window then sure, go for it. But if that would make you uncomfortable then I don't think it is worth it.


Also, coming from California and living on the Pacific, I have sort of a "too much light" problem. I've got sunrise on one side of the house and sunset on the other, and skylights in every room. I had an optimized setup in the living room, but since I moved my office to the spare bedroom I am totally blasted out by the sunrise over the mountains and shadows come across my face as the sun moves.

I'm experimenting with moving my desk but the trial and error is a pita.


Hello from Vermont. How does that help for meetings after, say, 3:30 this time of the year?


I am from Warsaw, Poland and, believe it or not, Warsaw is about 7 degrees north of the northernmost part of Vermont. Vermont spans 42 to 45 degrees while Warsaw is at 52 degrees.

Did you know that most people in Canada live at the latitude of Croatia or south of it?


I lived in .be for 3 years. I believe it :-)

I think my favorite tidbit is that Rome and Boston are at the same latitude.


Also, New York city and Naples, or Washington DC and Cairo.


I have my desk in front of a window, facing the window. I keep the blind about halfway down though, because the glare makes me squint at the monitor uncomfortably.

The other option would be to not face the window, but then you have bright daylight shining onto your monitors instead.


Also, a bright window behind you plays badly with some webcams' automatic exposure adjustment (including the Logitech ones recommended here). It should just let the window be overexposed/blown out, but it prefers to adjust the brightness so that the window is reasonably lit, with the effect of making the foreground look like I'm in a dark room.


Sod's law means that the window is often behind the desk, making video pics a silhouette, and the light that does come from the monitor can give some ghastly hues to your skin tone!


--== CRI ==--

As a service to the readers who aren't familiar with the term:

"The Color Rendering Index (CRI) measures the ability of a light source to bring out the true colors of an object. Without a high CRI light source, objects can appear faded, dull or inaccurate. "


Everyone will be happy if you up your audio quality. Cracks, pops and other distortions in headphones are not fun. No one really minds visual quality unless you are presenting something on camera.


Not just subconsciously. If people have a shitty microphone I have more work to do. I need to listen more carefully.


It doesn’t matter, you can have the best lighting and audio ever and still look like a garbled piece of stuttering pixel shit on an end user’s machine because their connection can’t handle the streaming. Unless you’re a cam girl don’t even bother with a sophisticated webcam setup.


I've also investigated this quite deeply. As I'm sure most of us know, the main problems with 'webcams', either the built-in ones or USB ones that you perch on top of your monitor, are the lens size, and the CCD/CMOS image sensor size. Manufacturers spouting specs like '4k' are deliberately misleading the public if the sensor and/or the lens aren't also upgraded.

The only cost-viable method to get 'broadcast quality' imagery for streaming/recording right now is to buy a second-hand DSLR or mirrorless camera that has 'clean' HDMI out and works without the camera auto-shutting down after X amount of time. There are a few sites out there that list the preferable models[1]. I've got a couple of old Canon DSLRs (that don't do clean HDMI) and a load of lenses, so I've been watching eBay for a newer model Canon DSLR that I can afford. The lowest cost Canon DSLR body I've seen with unrestricted clean HDMI or that can take the Magic Lantern[2] firmware is about £150.

However... you also need an HDMI-to-USB dongle. This converts the camera's HDMI output into a standard USB webcam input. I've already got an Elgato CamLink[3] (bought for a different reason a couple of years ago), but you can get cheap China knockoffs for about £15; I don't know how good they are though. My CamLink cost WAY more than that, so I have my doubts about the knockoff quality.

Finally, you need good audio capture that, importantly, is in sync with the picture. The HDMI-to-USB conversion adds a tiny delay to the image, which can put your audio out of sync if you are using a standard USB microphone. Good software like OBS[4] can correct for this though.

---

[1] https://www.elgato.com/en/gaming/cam-link/camera-check

[2] https://magiclantern.fm/

[3] https://www.elgato.com/en/gaming/cam-link-4k

[4] https://obsproject.com/


Finally made an account, just to reply to this. With newer Canon DSLR bodies you can use the Canon Webcam Utility[1] to have the video feed show up directly as a webcam, with just a USB cable.

I've been using the EOS R as my webcam with the RF 35mm F1.8 lens and it's working pretty great. I just hook it up with the USB-C cable and it shows up as a webcam.

The only annoyance that I haven't been able to get around yet is switching batteries. There's a Kickstarter project[2] for a battery that is hot-swappable, but among other features it also lets you run off wall power while plugged in, which I'm eager to try out.

I haven't tested the audio quality of the camera, so I can't say much about that. I've always just used my headset. I would expect it to be somewhere between meh and ok-ish. Of course you could invest in a proper mic plugged into the Camera[3].

I have it set up on a tripod behind my desk. One thing I have been considering is getting some kind of a monitor arm style setup for the Camera, but so far I haven't found such a product.

The setup of course is quite expensive depending on the Camera, but I already had the camera and tripod since I do photography as a hobby, so I was pleasantly surprised when Canon came out with the webcam utility.

Is it worth it? Probably not if you just want something that works all the time. I mostly use the camera to take photos, so I have to re-mount it every time I come back from shooting, and keep the batteries charged, etc. but people do notice and it's fun to see their reactions and getting accused of being a YouTuber every once in a while.

[1] https://www.usa.canon.com/internet/portal/us/home/support/se...

[2] https://www.kickstarter.com/projects/x-tra/the-camera-batter...

[3] https://www.makeuseof.com/tag/best-shotgun-mic/


I know a couple people who are doing the DSLR or mirrorless to HDMI->USB thing. The results are indeed nice. But, to be honest, I have a Logitech 920 webcam and some decent lighting and that handles even making video recordings pretty well. "Can't buy" seems like a stretch.

And for anything I'm streaming I'm frankly more likely to have issues because my Internet upload sputters than anything to do with the camera. And with the Logitech webcam attached to my monitor, the whole thing "just works." If I had a newer model of Canon, I might have tried it but as it is I'd need to buy a converter.


If you're using it powered all the time, have you checked out the existing plug in "fake battery" options? Here's one example: https://www.amazon.com/Glorich-Replacement-Adapter-Cameras-F...


What resolution does it produce? My Sony α6100 yields only 1024×576, which is not great, though still better than any laptop webcam at 1920×1080, or mostly even a phone or tablet front-facing camera at 1920×1080.

(The Sony α6100 is also quite happy to run on USB power indefinitely, no fancy battery arrangement needed. I’ve used it thus for multi-hour webcam sessions, and plugged into a wall charger for multi-hour recordings, where its battery would otherwise be depleted after about 100 minutes.)

Another thing to be aware of when using fancy cameras like this is the latency: you’ll get added latency of 100–400ms, which is easily into the disconcerting zone if audio and video are out of sync by that much, so you may need to do things like add a corresponding delay on the audio, if that’s connected to the computer directly (which will give much lower latency). OBS Studio can do this. I don’t yet have an HDMI capture card, so I’m not certain about it, but the impression I’ve received is that latency will be much lower with a decent capture card than the USB/PTP approach, though still probably higher than your webcam.


I'm not actually sure what resolution gets passed through the USB cable. I would assume it's either native resolution or at least 1080. But to be fair I'm mostly recording in 720p 50fps, since the 4k is cropped with the EOS R and I felt more fps is better for webcam footage (the other modes are capped at 30fps (25?) or have a crop). Besides, people don't need to see my face in more detail.

I haven't seen any noticeable delay in the footage so far, but now that you mention it, I'll definitely keep an eye on that.

The EOS R can draw some power from USB, but it doesn't seem to be nearly enough, or it doesn't work while recording or something.


Latency through a cheap Chinese capture card is like 12-17 ms for me.


To mount a camera on my desk, I'm about to purchase this "Neewer Tabletop Light Stand", and a mini ball head with standard 1/4 inch screw for angle adjustment.

I can't vouch for it yet, but hope it arrives and does the job. There's other similar products in other sizes by other brands.

[1] https://www.aliexpress.com/item/1005001657723138.html


You can buy a dummy battery with an AC adapter.

Sony also updated their software in August and now supports using their cameras as webcams without a capture card.


Fuji does something similar with their cameras. I can use their webcam software and get clean video output when I connect my X-T3 via USB to my PC or laptop. No need to purchase extra unnecessary dongles.


I was quite excited to try this when Fujifilm first announced it, but then immediately disappointed when I found out it does not support the X-T30.


Yeah, I ended up trading in my old X-T1 and picked up a used X-T3 at a great price, which helped. It's frustrating that not all models are supported though.


Have you tried using the max aperture, and letting others enjoy the bokeh? ;-)


Great comment. Another few things to add to this if you are going for more broadcast quality rather than just Zoom calls:

- The lens you use is crucial. A nice and fast prime portrait lens (20-50mm f/1.4 or something) will make a huge difference with indoor light and give you nice bokeh/blurred background.

- Good quality lighting is essential. Elgato has a good, but again expensive, key light you can use as your primary source of light, but you might need side lights and back lights too.


I use a similar setup with a Panasonic GH5 I already had, but I bought one of those €15 Chinese HDMI capture USB UVC cards [1]. These work surprisingly well, and I've had zero issues using them for ~4 hours back-to-back Zoom meetings on my MBP. They've also received some good reviews on YouTube [2].

For audio, I use my Airpods. Great setup for meetings. Latency is good enough IMO.

--

[1]: https://www.amazon.nl/gp/product/B088ZTK56F

[2]: https://www.youtube.com/watch?v=daS5RHVAl2U


Panasonic released (beta) webcam software this year. It supports the GH5 and works well. Give it a try, it's free.

https://www.panasonic.com/global/consumer/lumix/lumix_webcam...

It's available for Windows and macOS. The Mac version seems to have more issues with various apps than the Windows version at the moment.


> you can get cheap china-knockoffs for about £12, I don't know how good they are.

They are so-so. Well, they are much, much better than any webcam, of course, but 1080p looks almost the same as 720p on them. So as a budget thingie it'll work, but Elgato's adapter is much better, of course.


It's converting a digital signal (HDMI) to another digital signal. Why should there be any quality loss even if it's a cheap chinese model?


Well, because they don’t have enough processing power I guess? IDK, it’s hot all the time and image quality is lower than what I can get from a camera SD card.


>So as a budget thingie it'll work, but Elgato's adapter is much better, of course.

I saw a video where an Elgato CamLink was opened up; it's just a Chinese card repackaged with the Elgato logo, 99% similar internally to the cheap knockoffs.

Can't find it again to link it to...


https://wiki.apertus.org/index.php/Elgato_CAM_LINK_4K

8-layer PCB with custom FPGA hardware, doesn't look like a cheap Chinese knockoff to me.


You're right. I watched a YouTube video on an expensive brand-name item vs several Chinese knockoffs and they had the same internals -- but it probably wasn't the CamLink then.

Can't find the video atm, or remember what product it showed, but it was posted in the last 2-3 months and it concerned a similarly in-demand item -- and the "repackaging" brand was respected by creators...


I hooked up a GoPro this way. Their new drivers support using them as webcams on macOS and Windows.

Beer Webcam I’ve ever had.


Is this related to newer firmware or something? Because I have a GoPro and the USB stream quality is very low compared to the HD recording capabilities of these devices. If you browse the web for this, you will find (again, as in the threads here) the recommendation to use an HDMI-to-USB converter to capture the high-quality HDMI stream a GoPro can emit.


> Beer Webcam I’ve ever had.

Sounds like you have had quite a couple...


these are trying times, ive had a couple web cans myself


Canon has webcam software for their DSLRs, although I couldn't get 1080p output out of my 750D. Quality was great, though.

You can also just use your phone's main camera, the quality is up there with rather expensive webcams (even for midrange phones these days), add some better lighting and you're set.


I have a $7 Chinese knockoff HDMI-USB dongle; it works perfectly well, even in Linux. Quality is quite good.


Just to add my own experience, I have a friend who had the Elgato Cam Link and it died. He decided to give a $15 Amazon one a try and he said it works "just as good" as the Elgato one. So I then bought my own $15 one (having never used Elgato before) and I can also say that it does the job.

I don't know how to further qualify it, you expect it to convert HDMI into USB and it does it. The resulting quality is amazing. So it performs its function.


The cheap USB-HDMI dongles are perfectly viable for use as a webcam. Quality-wise you won't notice any difference through a highly compressed Zoom/Teams/Skype video. They can output 1080p30 or 720p60, which again mostly doesn't matter in webcam situations.


I got the CamLink 4K, for use with a 4K camera (Sony AX33). Unfortunately, the camera's HDMI out is only 1080p. So yeah, while it works, it doesn't feel like much of an upgrade over my aging Logitech C910.


> The only cost-viable methods to get 'broadcast quality' imagery for streaming/recording right now is to buy a second hand DLSR or mirrorless camera

Erm, no. Camo Studio with the iPhone XS I am now using gives a DSLR-quality picture, better than any webcam on the market. And the iPhone's BOM cost for the camera sensor and ISP is probably in the tens of dollars. I want to buy this from Kickstarter, yesterday :).


Any idea if the EOS M is supported for this? Last I checked Magic Lantern had no webcam type functionality but that was ages ago. If it does that's a pretty great webcam for like £100

RE those Chinese knockoff dongles, I got one for £8 off eBay, use it for Pi Zero stuff occasionally, and the lag isn't bad at all tbh. I'd guess somewhere between 250ms and 500ms, but it maybe gets worse at larger resolutions.


I have an EOS M and it works OK. I had to install a different version of Magic Lantern to get it to not auto shut off, and generally do a lot of poking around in the menus. You can get battery-replacement AC adapters quite cheaply and my cheap HDMI->USB works fairly well. Lag is not a problem, but it introduces black bars on the left and right, and in Google Meet my feed is horizontally squashed. In other video apps the picture is fine though.


> Manufacturers spouting specs like '4k' are deliberately misleading the public if the sensor and/or the lens aren't also upgraded.

I vaguely remember reading about a video card that had more RAM than it was able to address.


You may be referring to the GeForce GTX 970. It had reduced bandwidth to parts of its RAM.


Maybe the RX480? Many of the 4GB models sold were actually 8GB models with firmware locking them down.

https://www.guru3d.com/news-story/amd-radeon-rx-480-4gb-to-8...


> Many of the 4GB models sold were actually 8GB models with firmware locking them down.

Definitely not that; the concept was that you had a large amount of RAM you could advertise on the box, but the card was unable to use the RAM. You're describing the opposite, the card has a large amount of RAM that is advertised as a smaller amount.


I have a similar setup but don't you find the latency to be bad? At least it's not good enough for video calls IMO. It's fine for streaming to an audience or recording.


With a random Chinese knock-off and a GoPro 3, I had a latency of 10-11 frames when doing a 60 fps stream.

It was usable for video calls, but I had problems with reliability (old battery in the GoPro, and even though it was permanently connected to a power source, it charged only when I turned it on -- which was usually when I wanted to have a call).
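(For reference, a quick back-of-the-envelope conversion, assuming a steady 60 fps:)

    # convert a frame count of delay into milliseconds at 60 fps
    for f in (10, 11):
        print(f, "frames at 60 fps is about", round(f / 60 * 1000), "ms")   # ~167 and ~183 ms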


I tried to use the Elgato CamLink with my MacBook and Sony a6400. But every time I use it, the fans ramp up to a noisy level.

Turns out the processing is still being done by the MacBook.


I have the Elgato 4K and the Sony a6400 and my computer (NUC8 running Debian) does not overheat.


Perhaps the NUC handles thermals a lot better than the laptop. I consistently get around 4-5k RPM when connecting with Elgato Camlink.


Rumour has it that the imitation 'Camlink' products overheat and cut out. Probably solvable with the addition of a heatsink if true.


Well, I'm curious whether your $2k DSLR gives you a better image than a $150 webcam on Zoom, Facebook, Skype, etc.?


Have you considered machine vision cameras in your investigation?


The real reason is buried in the article:

You can’t buy a good webcam because the number of people willing to pay a lot of money for a high end webcam is very small.

At the lower end, people are satisfied with their built in laptop parts or a cheap webcam that sits on top of their monitor.

At the high end, people go down the rabbit hole of buying a do-everything mirrorless camera that they can use for so much more than just a webcam.

A high-end webcam would have to be cheap enough that the first group doesn’t mind spending a bit more, but not so expensive that the enthusiast target audience just decides to buy a full-featured mirrorless camera instead.

Granted, there is a lot of room for improvement in that budgetary middle ground, but how many people actually care? Common webcams actually perform decently when given proper lighting conditions. We’re not streaming high-bitrate 1080p H.265 on our 5-person Zoom calls. After compression and denoising the extra sharpness and low noise of a high end camera doesn’t add much benefit.

Enthusiasts are a difficult group to market to because they have extremely high expectations. They’d also rather spend weeks scouring the Internet for the perfect deal on a used mirrorless camera than to spend a dollar more than necessary to buy a high-end webcam.


I'm just not sure I buy this...

The number of people willing to pay significant money for a high end webcam was very small. Now my parents and their friends are talking about better lighting and camera angles.

It seems like this industry is unusually ripe for a "better webcam" that "just works"


I worked as a freelance DOP and worked with cameras + lenses that cost more than a decent car.

And while there is certainly room for improvement with typical webcams: the problem is in many cases not the camera, but the conditions under which it operates. Low light, shooting against the light, smeared laptop lenses, weird angles (not eye level), weird perspective framing, bad combinations of light and framerates, mixed light temperatures, low-CRI lighting, etc. If you take a 40k€ Arri and a 20k€ Zeiss Prime and do all of the above, the result will still look more or less crappy.

Making a good-looking image that feels natural is work, and while the camera is an important cog in the machine, it alone won't do wonders. The whole physical space around the subject needs to be arranged the right way, light fixtures that can cost a ton as well need to be set up, a whole truckload of grip placed, etc.

IMO we will get photorealistic realtime avatars with cinematic lighting before we get cameras that create better pictures on their own. Or the crappy webcam pictures will be filtered in ways that make them look acceptable, etc.

Making a good picture involves realising how somebody looks and how they want to be seen, and getting them closer to that goal; it is not something where one size fits all.


> And while there is certainly room for improvement with typical webcams: the problem is in many cases not the camera, but the conditions under which it operates. Low light, shooting against the light, smeared laptop lenses, weird angles (not eye level), weird perspective framing, bad combination of light and framerates, mixed light temperatures, low CRI lighting, etc. If you take an 40k€ Arri and a 20k€ Zeiss Prime and do all of the above the result will still look more or less crappy.

All this is true, yet the difference between a bad laptop webcam and a high-end phone front camera is huge.


The problem is laptop lids are a lot thinner than smartphones and just don't have the depth to contain a decent optic. Apple have tried to mitigate this a bit in the new M1 machines using some computational photography to improve the image.

The Surface machines from Microsoft actually stand out in this regard. Because the brains of the machine are in the same section as the screen and camera, they are a lot thicker and can put in pretty decent camera modules.


But why can't I buy an external webcam, where that isn't an issue, with the same quality as a smartphone camera?


The market for dedicated external webcams is a little sparse because most people don't want to have more peripherals.

There's probably a good opportunity for home office external displays to incorporate smartphone cameras, studio mikes, and maybe even lighting elements to help people look good while working from home.


Since COVID the webcam market has exploded. Logitech had to significantly ramp up production due to increase in demand back in April. It’s not just remote work, but also aspiring content creators.


> Since COVID the webcam market has exploded.

companies that are crushed by demand for existing products, dealing with covid-related supply chain issues, and facing long-term uncertainty about demand aren't going to have a lot of luck getting a new high-end product out in <9 months.

among other reasons, getting these sorts of things made usually requires some travel by the engineers to the manufacturing/assembly facility to sort out problems.


Sounds like a good opportunity for a Chinese company to do it, as they’re fully open.


and yet, they will not.


Wouldn’t a smartphone work for you then?

It might need a mount to be properly positioned, but that would be the only IRL hurdle. On the software front I don’t know how good the current options are, but fixing bugs should be doable.


I tried going this route a few weeks ago. There are a few pretty big problems with current phone-based solutions:

- You need to fiddle with your phone (which is probably mounted to a monitor?) any time you want to turn on your webcam

- There's a perceptible lag on the final video, not ideal

- Phone needs an external power source, charging-over-USB usually isn't enough to power a phone with an always-on camera

- The phone will get hot, as it's not designed to run camera constantly

- For Android, your best option is to stream video over USB, which means enabling ADB and developer settings, which is inherently insecure. It also makes the charging problem above trickier, as phones usually only have the one port.

Honestly the best phone-based solution right now is to join the meeting twice - once on your personal phone for sending video, and once on your computer for sending/receiving audio/screenshares.


It probably would, and there are apps to do exactly this. Mounts are cheap enough to be reasonable, though there aren't many specialized for this yet.

Personally I don't think they'll even take half of this market - it means you can't fiddle with your phone while on a call.


My laptop lid is as thick as my smartphone. I think there is enough space; the camera technology from smartphones just has not made its way into laptop lids yet.


What if the smartphone camera cost $20, and the laptop camera cost $5 to manufacture? Smartphones can maybe justify the extra $15; laptop makers cannot. If a laptop maker can't raise the price by $15, it literally eats into their profits.


Seriously? With laptops costing $1000+ I think they can afford to put in the $15 smartphone part from $300 smartphones in the higher end ones.


If they can't justify the price increase and it doesn't make them a clear winner against the competition, the manufacturer will cut some corners to stay competitive. At scale, an expense increase of $15 per device is quite significant.


Generally agree, but even Apple MacBooks are stuck with a 720p webcam. Why? Apple products should be able to absorb the additional cost of such parts.


They are able to, but being able to shave a couple of cents on it gives them a massive return while most consumers don't complain, and their competitors aren't offering anything better.


I believe they make a 3-5% margin for a $30-$50 profit on the hardware in that example. They literally cannot afford that extra $15 unless they raise the price by $15, and then they have to worry about whether a value-conscious buyer buys a slightly cheaper alternative.

Keeping in mind this is for a feature that most people actually don't care about.


Seriously.

Take that $15 and apply it to 8 other things. The memory. The speaker. The keyboard. Your $500 laptop is now $600. Your competitor is still $500. You lose.


We're not talking about a budget $500 laptop. My laptop cost me over $1500 and its camera sucks. I did not shop based on price; I shopped based on specs and features.


$10 saved is $10 saved. That's how you increase margins, right?


That is why I said high end laptop...


Or, you know, just sell one as a USB peripheral.


If you have a decent-thickness laptop lid and still have a crappy built-in webcam, that’s just straight-up nickel-and-diming by the manufacturer.


I mentioned this before, but why not have a camera bump on the outside of laptops? Almost all modern smartphones have a camera bump.

I Skype with family a couple of times a week. I'd gladly pay another $100-200 for a better integrated webcam.


People walk around taking pictures and video with their phone. No one walks around with their laptop open taking pictures or filming the kids on the tobogganing hill.

Now with everyone working remotely I could see a quality demand happening, but nowhere near the interest in phone cams.


Good point. Speaking of phone cameras...

I've had success using my iPhone for video capture via OBS. The setup was a bit fiddly, but still only took an hour-ish to try this approach by downloading and installing OBS, figuring out I needed a virtualcam plugin, finding it and fiddling w config, creating a dirt-simple "scene", and enabling it for use in Zoom. This is on an iPhone X (iOS 14.2), and Catalina (macOS 10.15.7).


A lot of work has been done on phone cameras to replace standalone cameras, but isn't half of the development investment there on the software side -- in addition to the hardware of the camera? (e.g. echoing GP's "crappy webcam pictures are filtered in ways that make them look acceptable etc" statement)


Freelance videographer and cam-op here. Your Arri experience is skewing your perspective. Any recent highly regarded DSLR will be significantly better in low light, intelligently apply both sharpening and noise reduction, and do so with autofocus in a package that weighs less than your Zeiss Prime. I AC for an Alexa Mini LF shooter, and it's interesting to see just how far divergent evolution has gone in this regard. I wouldn't want to shoot a movie on a DSLR - but they (and of course cell phone cameras & smart processing) are significantly better for many applications than the best cinema cameras in the world. Speaking to the original article - you'll absolutely smash standard laptop or webcam quality, even in god-awful lighting conditions, on say a GH5 or A7S III.


I'm a colorist and have to clean up footage from everything from cellphones (a lot recently with the pandemic), gopros, and dslrs to alexas, phantoms, 35mm, and even IMAX footage. Trust me when I say that I've seen it all.

Short answer is you're wrong; even the latest DSLRs are miles and miles from those high-end solutions in nearly all scenarios (discounting ultra-ISO lightless shooting that no one does for anything beyond Vimeo demo reels). DSLR footage looks similar to the untrained eye but falls apart in even moderately difficult situations, even when shot with off-camera recorders costing thousands to get cleaner video, which invalidates your point anyway.

The same is true for the mid-tier solutions like the Blackmagic Pocket or Ursa. I can get the footage to cut seamlessly with an Alexa, but it's much more work to make it look good, and it has deal-breaker technical issues in many more pressing scenarios.


If you could elaborate on that you'd definitely have an interested audience - of at least one


Two


I'm going to have to argue my point here... We're talking about different things. If you're talking about footage flexibility and colour fidelity - absolutely you'll get much better results in good lighting on an Arri or DSMC2 RED....

For literally anything else you might be concerned about - i.e. what most people will care about when shooting - the image out of a DSLR is infinitely better. I'm not addressing which is better to grade, since obviously DSLR footage is both compressed and stored at much lower bit depth and file size (these are all good things in a home context). I'm talking about sharpness, autofocus, and low-light performance (including colour accuracy in extreme low light). These are inarguably better on a DSLR. For heaven's sake, you can get reasonable footage out of an A7SIII at ISO 12,800.

You're looking through the wrong end of the telescope here - i.e.: accuracy and manipulability in a situation with a post budget and crew. I'm speaking to appropriateness for real time streaming and home recording with zero post. You literally can't get HDMI out of an ARRI. It's completely inappropriate for this purpose.

So to make a crude example - even if you could set up the Arri for this purpose - streaming in Fuji Eterna or Sony S-Cinetone from home will look infinitely better than unprocessed Rec.709 from the Arri, for 99% of purposes.

We're talking about completely different things. You're like an F1 driver mocking the suspension and ABS on a family saloon. No one should ever drive your F1 on the road, whatever its obvious benefits for the specific situations it's perfect in. For one thing it costs multiple times the price of the best possible family car. For another it requires a trained crew. And so on...


There are non-trivial workflow issues, but The Possession of Hannah Grace was shot on a Sony A7S II, for example. Things have definitely moved in the last couple of years.


Not OP, but sitting the presenter in front of a north-facing window, positioning the laptop camera at eye level, and giving them a well-adjusted $70 mic provides an improvement over any badly positioned camera in a mixed-color-temperature, silhouetted lighting situation.


Why north facing window? I have a choice of southwest or northeast facing windows to put my office- which should I pick?


The one that does not get direct sunlight. North guarantees a diffused and somewhat constant illumination.


I think "AC" here is a verbification of "assistant camera" [1], a role in professional (movie) photography.

[1] https://en.m.wikipedia.org/wiki/Focus_puller#:~:text=A%20foc....


> Making a good picture involves realising how somebody looks and how they want to be seen and get them closer to that goal, it is not something where one size fits all.

Incidentally the same is true for audio: You can get what easily passes as "studio level" audio quality out of a 20$ microphone and free processing plugins these days – as long as someone or something is making the right operational choices during recording and processing.

Making good choices in absence of right answers is the hard thing about creative work.


All of those things may be true, but when I zoom using my webcam the picture is far worse than when I zoom using my pixel4. Same goes for audio.

I really, really wish I could get the calibre of A/V hardware in my high end phone as a separate device to plug into my computer. I did buy a nice microphone so I'm halfway there, but the webcam problem remains.


There are numerous Android/iOS apps which make your phone either a webcam or a network camera.

Some examples are EpocCam, Camo, NDI Camera, etc.


Yes, this is good advice thank you.

The real problem I'm trying to solve is being able to use my phone while running a zoom, though!


Do you have a retired phone in a drawer somewhere?

I recently started taking Zoom calls in my office on my desktop, and didn’t have a webcam at all. I found EpocCam and used it on my iPhone X, but quickly pulled out my wife’s retired Galaxy S7 to use for this purpose.

At this point you can get a used S7 on eBay for ~$60. That’s cheaper than a webcam with similar image quality, and you can use the phone for other projects if that’s your thing.


I got out my pixel1 for exactly this reason. It began to boot, then hung, and now will not boot at all.

I might try to pick up a spare phone for cheap somewhere as you suggest.


Making a good looking image that feels natural is work, and while the camera is an important cog in the machine, it alone won't do wonders.

I agree with your observations. It's photography 101: you need good light and clean gear.

I was one of 16 people on a videoconference yesterday. We all have the same MacBook Pros, so all the same cameras. Some people looked awesome. Some people looked awful.


How do you know some of your coworkers haven't upgraded their setups with personal equipment? Or optimized their lighting setups?


I have worked as DOP in the past and couldn't agree more. What people need is a guide on how to manipulate the scene in front of the camera using what resources they have, plus a list of cheap or free things they can go get to help.

A better camera would be way down the list for me.

Top of the list would be a light panel and some rudimentary adjustable grip for it. Lighting changes, so overpowering ambient light a bit with a strong diffuse light source is going to make the camera's life easier.


I don't agree. The camera in my laptop is garbage. I have significantly better-quality Zoom calls if I use the front-facing camera on my phone. But that's cumbersome to use, so I generally put up with the bad quality from my laptop.

A better-quality camera for my laptop would turn my Zoom quality from "garbage" to "satisfactory". Yes, if I wanted to go from there to "movie quality" I'd have to invest in better lighting and other things, but that's not important enough to me to go to that bother. A better-quality camera (built into my laptop; I don't want to have yet another peripheral with yet another cable) would be something I'd happily pay for.


I have been bemused and disappointed that with all the high-end talent in Hollywood, there has been such a long-term tolerance for the trope of crappy sound and badly-lit faces that we continue to see on remote feeds from the "homes of the stars." I have come to believe that directors think the audience wants to see the talent are "roughing it".


Actors aren't camera people, or sound or lighting techs. People who do that stuff make good money doing it because it's actually quite difficult to do well.


True, but if a studio "sends out" an actor to promote a new million-dollar movie, they surely can send some decent equipment and someone to set it up at that star's home? (Or at least have a remote session.) Seems like a small price to pay. But maybe it's true what other comments mentioned and people want or prefer that amateur look.


Sure, if they want to risk people's health so that an actor can look better during an interview, they can definitely send a makeup specialist and a lighting/sound crew and a camera guy to set everything up in the actor's home, and then send them back to take everything down after the interview is over.

But since nobody cares that actors aren't wearing makeup at home right now, they could just do the safe thing and not do any of that, and the actor can do the interview the way they normally appear in real life. Indeed, it may actually be worse PR to have the actor have a professional studio setup in their home, because then people may ask why the studio risked the health of so many people for something so unnecessary.

TV stations did have crews set up home studios for anchors and weatherpeeps, complete with remote links to the studio's video in-feed, but the difference is that anchors and weatherpeeps will be on the air almost every day, for hours at a time.



Interesting point - it’s a very similar situation for audio. Most people are fine with very low quality headphones or speakers. You can get a major improvement with desktop powered monitors that cost $50-$200. You can spend much more for higher quality speakers, but they’ll be limited by the acoustic environment. You really have to put more effort into wall treatments, seating position, and speaker placement before further improvements in the electronic signal chain will be noticeable.


I was amazed, looking on Amazon, that you can get ring lights for 20 bucks dedicated to phones, webcams, etc. Makes a huge difference.

For me though it's the audio. I hate listening to people on speaker so much. Get a headset or at least a good mic, it isn't expensive.


That doesn't ring quite true with me. I mean, I'm sure what you're saying is a big component of the problem, but... I can take a pretty nice video of my cat playing with a toy, just using my phone, without doing any work on lighting or anything. Ditto for taking a front-facing-camera video of myself and then playing it back.

However, even with some minimal effort on lighting (turning off lights behind me, keeping an overhead light and a light somewhat in front of me), my Zoom calls still look like shit (as do most of my co-workers', who are mostly using Macs).

Now, Google has put a lot of work into my phone's camera, work that I doubt Dell has put into the little built-in webcam on my laptop. But I doubt the camera in my phone is a significant part of the phone's BOM cost (though maybe I'm wrong about this). Would my laptop cost that much more if some more attention was paid to its integrated webcam?

And then Zoom is compressing the hell out of the video and is likely destroying quality in several places in the pipeline.

I agree that lighting is important, but I don't think it explains all that much of the issue we're talking about here. The camera hardware sucks, and the software pipeline is optimized for reliable delivery, not quality.


Well, I'm not quite sure I buy that they will look crappy in low light (terrible CRI is an issue, but white balance does work), because they have larger apertures!

"It's the lens stupid"... is often true for bad images. More pixels (or even larger film) won't gain you that much, but more light will help with focus, contrast, saturation, noise etc.

Really, you could also say "it's the lighting, stupid"... but better lighting is a bit harder to set up than a larger lens. The cost can be moderate to make it really good. On the other hand, making it just suck less is usually pretty cheap. Thus the ring light you see suggested everywhere.

What amazes me is how good our eyes are, so that we don't notice terrible lighting.


There are laptops with truly terrible cameras, in particular Dell, even in their most expensive laptops. These aren't my pictures, but I've tried different lighting myself and the pictures still have the wrong colour, are low resolution, and are extremely noisy: https://www.dell.com/community/XPS/XPS-15-9500-camera-webcam...


If you're really going for the best, you need a lot.

But having a camera that takes in more light, everything else equal, can make a big difference all by itself.


What is DOP?


I assume Director of Photography.


Yeah I think they’d sell instantly.

I looked all over for one and it doesn’t exist. The mirrorless camera solution sucks because there’s not an obvious winner there either (and you usually need a capture card/camlink to convert the output from the camera).

The room-scale ones like the Logitech MeetUp are overpriced and generally awful; the PTZ versions from them are even more expensive, use proprietary connectors for the mics, and require a host of unnecessary external boxes.

The Facebook portal TV is a great product for most people, but I just want something like that I can plug into a PC.

There’s a vacuum in this market. The current options are either low quality, overpriced enterprise stuff, or dealing with the hassle of a mirrorless cam setup.


>and you usually need a capture card/camlink to convert the output from the camera

Early on in the pandemic the major mirrorless camera providers (I believe at least Sony[1], Fujifilm[2], Canon[3], and Nikon[4]) all released software that lets you use any of their relatively recent cameras as webcams with the regular USB transfer/charging cable. A separate capture card is no longer a necessity.

[1] https://imagingedge.sony.net/en-us/ie-webcam.html

[2] https://fujifilm-x.com/global/products/software/x-webcam/

[3] https://www.usa.canon.com/internet/portal/us/home/support/se...

[4] https://downloadcenter.nikonimglib.com/en/products/548/Webca...


As I understand it, these are all awful in their own way, typically with quality problems and extra latency, as well as the extra complexity of the software.

The obvious solution is a camera mode in which it presents itself over USB as a webcam… As far as I know no "real" camera at any price does this.

I solve the problem with a Camlink, but it really should not be necessary.


I use a camlink 4k/nikon d610 and a Movo UM700. Great color and low-light performance, with natural bokeh, far superior to software background blur solutions.

Now, I'm not sure it's a great idea to actually present myself in such fine detail ;). But it was fun to set up.


My Sony a6500 has been perfectly fine with my macbook since Sony released the new drivers. I haven't noticed any latency and love getting compliments from zoom peeps on the non-AI blur I get with my 16mm f1.4 lens.


What are the max supported resolution and framerate in this mode?


> The obvious solution is a camera mode in which it presents itself over USB as a webcam… As far as I know no "real" camera at any price does this.

There are! Fujifilm X-A7, Fujifilm X-T200, and Sigma fp are three mirrorless cameras that support UVC. And if you classify action cameras as "real" cameras too, the Insta360 ONE R also supports this.

Note that you may have to update the firmware and put the camera in webcam mode to get this to work.


From what I had read at the time (may be better now) that software was bad and the low quality output from it negated most of the benefits of going with the mirrorless setup in the first place.


That's possible, but as others in the comments have mentioned it may be more people with bad lighting/positioning who just decided it was the camera/USB software that was faulty. Still, maybe it does suck. I never tried any of these programs myself because all of my cameras are just slightly too old to work with them.


The people I learned this from were professional YouTube people who have high quality lighting - it was the software.


The best solution I've found is using an app like ManyCam on both your phone and desktop to transmit the A/V stream to the computer and emulate a webcam. It has a ton of benefits, like being able to adjust the picture, filters, switching webcams, being able to play videos for people, and screen sharing.


Let's say you have a Mac. What hardware/software do you need other than the tripod to get a good mirrorless camera setup? I notice a lot of pro YouTubers use setups like this, but I'm not a photography guy so I don't know much about this.


You need something like the elgato camlink 4k and a camera that supports clean hdmi out.

That’s it - the software options today are bad.


I personally wouldn't want to start in this market, because by the time anything could reasonably ship, the COVID vaccine will have been distributed enough and most people won't care that much anymore.


I don't think WFH and remote work are going away though, even if COVID does.


I think the demand for it will taper off quite a bit though, and there are still many traditional places that aren't tech that are itching to get people back into offices.


Yeah, in 2019 this article made sense. If there was ever a year to launch a high-end webcam, it was 2020. A ton of professionals with disposable income and very few ways to spend it, sitting in Webex and Zoom calls all day. People are dropping tons of cash for ring lights and Lume Cubes.

These are already serving the Youtuber market, the same way that podcasters enjoy Blue Yetis. Which makes me wonder, what cameras are Youtubers using?


When I looked into it they’re mostly using mirrorless cameras like the Panasonic Lumix GH5 or the Sony a7siii with something like the elgato camlink 4k to get clean output.


The Sony A7SIII sensor is just about as good as it gets outside of the cinema rental market. But certainly any post-2010 APS-C or bigger Sony should do to reliably differentiate from consumer webcams; all you need is a bit more light. Maybe get a prime lens on top and that's it, if you're willing to deal with an HDMI capture setup. The specs of some 4k drone gimbals are also impressive, also using mostly Sony sensors; those might be the next best thing to motorized gimbal setups with optical zoom lenses, like those used in telepresence systems:

https://news.ycombinator.com/item?id=25511682


Yeah I think you're right.

I got the Logitech MeetUp for my living room at the start of Covid and it's mostly an overpriced disappointment. I also had to get the external mic because the built-in one doesn't have the range advertised (they actually advertise two different numbers).

I wanted something I could just plug into my TV PC to use for video chat. The Rally setup has the PTZ camera I wanted, but requires two external boxes which (as far as I can tell) serve no necessary purpose other than bad design.

In hindsight the PTZ Rally camera with an unrelated third-party mic would probably have been the best bet. I would have thought Elgato would come out with some high-quality webcam eventually.


> Which makes me wonder, what cameras are Youtubers using?

iPads are taking over. You can get pretty amazing image quality with a new iPad Air or Pro. You can even get rigs with gimbals, lighting, and external mics for almost any budget.


How fast can the hardware market typically respond to a sudden and potentially short term increase in demand like this?


Depends. Inventing new products probably 1y out. But ramping up production on an existing product to a new category should have happened already.


Right now they're talking about better lighting and camera angles. Will they still be interested in six months? In a year? There's a lot of uncertainty for any prospective manufacturer, they could spend a significant amount of money and time developing a product only to launch it into a world that never wants to do another zoom meeting again.


The analysis in the article is very good until the conclusion, which ignores all the data.

It establishes that there is no product in the market: there is A) "I do not care about anything, so I will buy a webcam", B) "I care a little, so I will spend $1000 instead of $300 on a phone with a good integrated camera", and C) "I will spend $10k on a DSLR and use it as a webcam".

Then it measures the market size for A and ignores B and C, which are the actual market they will sell to!


A modern iSight would be great, especially if it could replace webcam and GoPro


Did somebody say iSight 2?


iSight was such a beautiful-looking product!

It's rather mind-blowing that back then its 640×480-pixel VGA resolution was considered good.


It wasn't just good, it was almost broadcast quality :) This was still the era of CIF 352x288/QVGA 320x240 webcams with interpolated 640x480 modes. The "high-end" Logitech QuickCam Pro 3000 produced something like this: https://secure40.securewebsession.com/mikeshardware.site.apl...


It's still good enough. 1080p or 4k is totally unnecessary for a Zoom meeting.


and unfortunately Amazon and Facebook's spy-on-you-all-the-time FaceTime call-style communication products are going to dominate, especially with that demographic.


I started looking for a solution to "my webcam is a PoS" problem a few weeks ago. I realized that a good sensor etc. will cost money, so the hardware should be useful outside meetings. I thought I would spend about $500 on a basic DSLR. After some contemplation, I decided that this was an opportunity to take up a new hobby, and now I'm about to buy a Sony A6000 plus some prime lenses. Just an anecdote.


Sure, so who's going to start a webcam company and design a cheap good webcam in less than a year for a market that will be dead again in less than a year?


Might be dead again.

With (1) a virus that's going to keep mutating and require regular vaccine updates, (2) the current vaccines with two separated shots and unpleasant side effects, and (3) people ditching their masks even more now that vaccines are in the picture, I'm guessing there's enough uncertainty that people and companies with the money to buy down risk will still be interested for quite some time.

Though it'll still be competing with better lighting setups and recommendations for people to just videoconference from their cell phones. I wouldn't want to risk my capital on it, even though I'm pretty frustrated with my webcams' low quality.


The rear-facing camera on a recent iPhone (or comparable high-end Android phone) is already going to exceed the quality ceiling for what streaming 2-way video will support in resolution and bitrate, given acceptable lighting and background.

If the front-facing camera on their laptop or tablet doesn't cut it, they should use the rear-facing camera on their phone, and put the rest of their effort into lighting and setup.


I think the real problem that needs solving is eye-contact. That's the camera most people are going to buy, because the lack of eye contact is such an acute problem in video chats.


The problems are:

- most work applications don’t benefit from the quality

- most people use smartphones and tablets as daily drivers

- photo/video enthusiasts already own cameras that can be put into service.


A webcam that "just works" seems like a herculean task if the goal is to have native compatibility with most devices without installing a package - Chromebooks, Apple, *nix, etc. but given how decent cameras are standard on mid-tier phones, you're spot on


Most webcams I've tried do "just work" with *nix since they use USB UVC. And most Linux distributions come with, or can easily install, qv4l2 (the Qt V4L2 test utility), which can show you a live feed and let you adjust all the webcam's parameters that are exposed over UVC - for my current Logitech that's the common stuff like resolution, FPS, brightness/contrast/saturation, but also hardware-specific things like exposure (or autoexposure), pan/tilt/zoom, and focus / autofocus.
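For anyone who prefers a script to the GUI, the same controls can be poked from Python by shelling out to v4l2-ctl (the CLI companion to qv4l2, also in v4l-utils). A minimal sketch - the device node and the control names are assumptions, since they vary per camera and kernel:

    import subprocess

    DEVICE = "/dev/video0"  # assumption: adjust to your webcam's node

    def v4l2(*args):
        # Run v4l2-ctl against the chosen device and return its output
        return subprocess.run(["v4l2-ctl", "-d", DEVICE, *args],
                              capture_output=True, text=True, check=True).stdout

    print(v4l2("--list-ctrls"))                # dump every control the camera exposes over UVC
    v4l2("--set-ctrl=brightness=128")          # typical controls, if your camera exposes them
    v4l2("--set-ctrl=focus_automatic_continuous=0")  # control name differs on older kernels (focus_auto)
    v4l2("--set-ctrl=focus_absolute=30")

(`v4l2-ctl --list-devices` will show which /dev/video* node belongs to which camera.)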

I'm pretty sure the situation is the same on ChromeOS and Macs. And I've seen that webcams these days also tend to be pretty plug-and-play on Windows.


A camera off a $50 budget phone will probably beat every $100 webcam on the market. So why can't I get that same camera module, minus the phone parts but plus a USB interface, for $100?


When John Gruber did his "zoom" video edition of his podcast the Talk Show (back around WWDC), the video wasn't actually zoom. They used zoom for live interaction, but everybody had an iPhone mounted above their laptop screens for the actual recorded video and audio which was composited in post-production.


I still can’t believe that Apple doesn’t make this super easy.


If they made this a feature they would be admitting that their laptop webcams aren't very good.


It's funny that Zoom disappears in the final product. The audio feed is recorded and re-synced on each source machine with different software, and it sounds like the same is done with the video feed.


You can: https://www.e-consystems.com/13mp-autofocus-usb-camera.asp

I bought one of these for use with my lab microscope, since the alternatives from Thorlabs were 10x more expensive for no reason. It worked fine for a year, then it stopped working. Maybe I spilled something on it? Who knows ¯\_(ツ)_/¯


Thorlabs cameras, and scientific cameras in general, have very different response functions. Useful if you want to make quantitative measurements, accurately represent a B&W scene, etc. I use a BlackFly (FLIR) camera on my scope, but have been looking to do something similar to you, since the camera API is not V4L2, but rather a custom framegrabber SDK.


I just wanted something to capture transmitted-light images on a real long-term setup. Turns out the best method is to wheel a simple tissue culture microscope to a warm room and put a cell phone on the eyepiece :) Made some cool movies for sure!


Currently I'm playing with this: https://github.com/bionanoimaging/UC2-GIT which does a lot of that (mainly for educational purposes; I think most scientists should focus on getting grant money to buy professional equipment).


That's a candidate for sure; my efforts slightly predate the 3D-printed microscope revolution, and I had to MacGyver my way to prove I could image cell cultures long-term without spending $200-300k (since I was chasing wild-goose hypotheses), hence this solution.

Are you playing with it in a garage or in an academic / industrial setting?

I'm not yet ready to start but my goal is to try and set up an actually productive garage lab - maybe use some neighbouring univ core facility on occasion but not to set up a lab there. Not a fan of academia in general and hope to not contribute money/ideology towards perpetuating that ponzi scheme!


Garage. The goal is to build a prototype which could then be scaled to a warehouse-sized robotic biological experimental system. But, I have also worked with such things in commercial settings.

I'd say that for most real scientists, it makes more sense to raise the funding to buy a professional scope because a lot of the dumbness is engineered out so scientists can just sit down and be productive.


Yeah, I have one of these; the picture is nice with good lighting (I have a black-and-white one, with an antique 16mm film camera lens). Going to write an OBS plugin for it when I have time.


I'd be curious about this. I think what would make sense is to write a V4L2 driver for it, then any app that uses V4L2 could use it (on Linux). Or, is it far easier to write an OBS plugin than a driver?


Yeah, that's another option. I haven't found one, but I do have userspace code that works.
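If that userspace code can hand you raw frames, one middle ground short of a kernel driver (or an OBS plugin) is to push them into a v4l2loopback device through ffmpeg, so any V4L2-aware app can read them. A rough sketch - the frame size, pixel format, and device node are all assumptions to adjust:

    import subprocess

    WIDTH, HEIGHT, FPS = 1280, 720, 30
    LOOPBACK = "/dev/video1"  # created beforehand with `modprobe v4l2loopback`

    # ffmpeg reads raw RGB frames on stdin and writes them to the loopback device
    ffmpeg = subprocess.Popen(
        ["ffmpeg",
         "-f", "rawvideo", "-pixel_format", "rgb24",
         "-video_size", f"{WIDTH}x{HEIGHT}", "-framerate", str(FPS),
         "-i", "-",
         "-f", "v4l2", "-pix_fmt", "yuv420p", LOOPBACK],
        stdin=subprocess.PIPE,
    )

    def push_frame(rgb_bytes: bytes):
        """Write one WIDTH*HEIGHT*3-byte RGB frame produced by your capture SDK."""
        ffmpeg.stdin.write(rgb_bytes)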


Thorlabs don't make their own cameras by the way, they're white-label IDS Imaging products (at least they used to be).

Most of these [machine vision] cameras will communicate using genicam, so you can use libraries like Aravis and maybe there's a way of getting that to talk to V4L.


$250 is deep into mirrorless camera territory though.


If you buy new, $250 isn't close to getting you anything decent which also works for streaming, as far as I know.

Plus for many cameras you need to capture their video output, i.e. USB is not enough (though that is currently changing). So you need a capture chip with low latency, or else things are bad for "live" usage. This is often another 60-130€ for something decent if bought new.

Plus they don't have a monitor clip, so you need to buy an additional stand.

Sure, you can buy used parts, but then you also need to compare that with used-part prices for webcams, which normally (i.e. non-Covid19) are also lower.

I mean, a Sony a5100 with a usable lens currently sells new for ~US$700, but you can probably get it new for that price with capture card and stand if you buy cleverly and we ignore Covid19 for fairness.

(And even used, you are still far above 250€ for any recent, decently usable mirrorless camera with a lens, in my experience.)




Or GoPro, which you don't need a USB interface for. Works all the way back to hero 4 apparently.

https://gopro.com/en/us/news/how-to-use-gopro-for-webcam


A GoPro performs horribly in bad light though. And it costs several hundred dollars as well.

If you already own one, then it might be an option. But buying a GoPro to get a better webcam doesn't really make sense.


Neither of those things is true. You might have to mess with it a little bit in truly dark rooms, but if you can read comfortably you'll be just fine. And far better than any webcam.


Most likely because most of the difference is in the Qualcomm DSP attached to your phone's SoC, not the actual camera module. The mobile phone images contain a lot of postprocessing which the webcams don't do.


Include the DSP then. You have twice the budget of the phone to produce the USB camera.


But you produce in lower quantities so your development cost makes up a larger slice of the pie, as well as other fixed costs deeper into your supply chain.


Phones can already be used as a webcam, e.g. the DroidCam app. No need for special hardware.


DroidCam connected via USB on my Android smartphone makes a great 1080p webcam. The latency is minimal.

It has great linux support and works great with OBS (and its chroma key filter)


Seconded. I wasn't satisfied leaving DroidCam streaming over wifi 24x7, password or no password, so I have a shell alias to unlock the phone and open DroidCam over ADB USB debugging.
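Roughly this, for anyone who wants to replicate it (written here as a Python sketch rather than a shell alias; the DroidCam package name and the keyevent-based unlock are assumptions, and a phone with a PIN or pattern lock needs a different unlock step):

    import subprocess

    PACKAGE = "com.dev47apps.droidcam"  # assumption: check with `adb shell pm list packages`

    def adb(*args):
        subprocess.run(["adb", *args], check=True)

    adb("shell", "input", "keyevent", "KEYCODE_WAKEUP")  # wake the screen
    adb("shell", "input", "keyevent", "KEYCODE_MENU")    # dismiss a swipe-only lockscreen
    adb("shell", "monkey", "-p", PACKAGE,
        "-c", "android.intent.category.LAUNCHER", "1")   # launch the app by package name
    adb("forward", "tcp:4747", "tcp:4747")               # DroidCam's default port, so the client connects over USB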


It depends on your phone.

I tried it with my phone and the latency made it unusable for video-conference usage. Given that this is my main usage, reasonably low latency is much more important to me than super good image quality. So when I had to buy one recently, I ended up with a ~70€ not-super-good but not-bad webcam. I think normally I might have bought something like a C920, but the price-performance ratio was just madness when I bought (sigh, Covid19 madness ;=) )


You can get an IP camera app and just use gstreamer to pipe it to v4l2loopback. Perfectly usable webcam available on /dev/video1.
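Something along these lines - a minimal sketch, assuming the v4l2loopback module has already created /dev/video1 and the phone app serves MJPEG at the URL below (both are things to adjust for your own setup):

    import subprocess

    PHONE_URL = "http://192.168.1.50:8080/video"  # hypothetical; IP Webcam-style apps expose an MJPEG endpoint
    LOOPBACK = "/dev/video1"                      # from `modprobe v4l2loopback`

    # Pull the MJPEG stream over HTTP, decode it, and feed it to the loopback device.
    subprocess.run([
        "gst-launch-1.0",
        "souphttpsrc", f"location={PHONE_URL}", "is-live=true", "!",
        "multipartdemux", "!",
        "jpegdec", "!",
        "videoconvert", "!",
        "v4l2sink", f"device={LOOPBACK}",
    ], check=True)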


It would be a hobbyist endeavor -- and look like one -- but a Raspberry Pi plus camera module would hit the mark.



What's one such phone?

I've never seen a <$300 Moto phone match a $100 Logitech webcam.


Can't say I did a ton of research to find such a phone, but based on my experience in the past it's certainly doable.

https://www.amazon.com/BLU-Studio-GSM-Unlocked-Smartphone/dp...

There is one for $70 on amazon that might be a contender. I'm sure if I searched aliexpress or something I could find something better for even cheaper.


Pixel 2 is quite cheap and will be far superior.


Webcams have been a race-to-the-bottom product for many years now, just like optical drives. All of the quality manufacturers were driven out of the market ages ago, and all that's left now is the absolute cheapest garbage sold to people who just need to check a box.

You would think there would be a secondary market for all of the fancy phone cameras now being made, but sticking one in a box with a USB connector hanging out the bottom (or Bluetooth) seems like too much of an ask when you can just put an absolute garbage 720p sensor in the same box and save a few bucks.


> We’re not streaming high-bitrate 1080p H.265 on our 5-person Zoom calls. After compression and denoising the extra sharpness and low noise of a high end camera doesn’t add much benefit.

This was my experience exactly after testing different setups. I still use my Lumix GX8 for photography and home videos, but it wasn't even close to worth the trouble of maintaining the setup compared to a ~$100 Logitech option that I leave on top of the monitor with good lighting and plug in via USB hub as needed.


This is plainly wrong. I use Sony a5100 (cheapest camera you can hook up as webcam) and in zoom meetings the difference between my image and others (even though some tried to use their phones) is just night and day.

Disclaimer: I am that brother referred in the article. :)


I have to imagine that everyone who says this only tried a cheap lens with a teeny aperture. The narrow depth of field of a good camera+lens combo looks vastly different from a consumer webcam, and survives compression (easily, in fact, since the low-frequency data in the background gets quantized less brutally than the high-frequency data of your face). In fact, I think that’s the primary visual cue for the “pro” look. Your GX8 is MFT, so you'd need a pretty fast lens (at least f/2, ideally f/1.4) to appreciate this.


Right on the money.

I’m not sure I get people’s use cases here. Ok, if you’re creating/streaming video, I get that you need a good webcam.

But for your teams’ standups, refinements etc...is video quality really an issue?

Usually I’m in a call with 5-10 others. One will be sharing a screen or we will be collaborating on a whiteboard. Other people’s webcam feeds are stamp-sized somewhere in the periphery of my vision. Most people want to hide their surroundings and will use a lame backdrop or a blur filter.

So we’re talking about spending hundreds of dollars to make your 100x100 pixel face that no one is looking at anyway a bit sharper?


It matters less for internal meetings (maybe improving audio quality could help meetings run more smoothly, but that's just a microphone issue).

If any of your employees talk to external clients, I can absolutely see value in getting them set up with a better webcam and internet connection, and I don't think you'd want every sales rep and partner engineer spending a day futzing around with an enthusiast-grade setup.


Erm, I bought an Avaya HC020 (about $250) back at the beginning of the pandemic. It's quite nice - my images are always better than everyone's, short of pro streamers. You can flip the image, color balance, etc. all in the webcam with a remote control (this is why I got it - "flip the image" was causing my laptop fans to screech at max RPM with a different webcam).

I've never used it, but if you really need microphone arrays, then the Avaya CU360 is probably for you. Bonus: it's standalone Android, so you can install all the streaming apps on the device instead of on your computer.

And, why not microphone arrays? Because echo cancelling is a nightmare technology that requires real R&D. Somebody always gets the setup wrong on conference calls. The new macs haven't been out long enough for me to trust that Apple actually solved the problem any better.


I think you're underestimating personal vanity. Plastic surgery and teeth whitening have SPIKED since the pandemic started, because people are looking at their own faces on Zoom all day in tiny boxes next to their coworkers. People will drop endless cash to make themselves look better on camera.

I frequent the reddit beauty forums and hobbyist non-engineers who have zoom meetings for work are dropping $$ on Diva Lights and video filtering apps etc.

Just today driving to the carwash I passed three strip mall aesthetic places advertising fillers and botox.

I posted a tutorial on how to set up a green screen and diva light, and within a week I had three coworkers order them. I joked with DVE Store that I should start getting a commission.


Pro tip: If you want a good webcam, don't search for a webcam, search for USB machine vision cameras and find a good lens to go with it. The Sony Exmor or Starvis series of sensors are great.

Or use a Pi HQ camera which is an IMX477 (an excellent sensor at its price point) and turn it into a USB webcam with a Pi Zero:

https://github.com/geerlingguy/pi-webcam#readme

https://www.youtube.com/watch?v=8fcbP7lEdzY


Finally someone suggesting machine vision cameras! These are awesome devices for many applications and they have become way more accessible in the past few years, both in terms of price and by moving to USB3 instead of GigE interface.

They are still targeting industrial and medical applications in their marketing, but it's such an untapped potential. Many manufacturers, huge choice of models.


You can turn your old phone into a webcam. An iPhone X has camera quality that competes with a cheap DSLR in a webcam situation.

Main thing real cameras are better at is bokeh, but you won’t get that in a webcam. And software can passably fake it these days.


How?


On iOS there's an app called Camo. There are probably others.


Camo (https://reincubate.com/camo/) has been absolutely fantastic. I wanted to use my iPhone Xs when on calls and not have to fiddle with plugging it in every time I needed video, so I ended up using an old iPhone 6 and it still destroys every webcam. $40/year isn't the cheapest, but it works so much better than EpocCam (which I had been using before).


I’d second this. Camo is fantastic and with an armature mounted above my monitor I can get a much more flattering angle with a great picture even on an old 6+ front facing camera.


Camo is fantastic.


“Droid”Cam does a good job of turning my iPhone Xs into a USB webcam for Ubuntu.


iphone + lightning-digital-av-adapter + airmix solo[1] + camlink-4k = webcam

[1]: app that gives nice clean hdmi output, and has decent color/brightness controls


This doesn't make any sense. Mirrorless cameras cost a couple thousand dollars (I have one), and the last thing I am going to wear down/abuse this equipment with is jumping into daily meetings, lol. At the low end, there are decent webcams, like the Logitech HD 1080p, but it costs over $100 and honestly, it's a couple of years behind the current state of the art. There don't seem to be webcams of around $100 to $200 that are state of the art. If you look at mobile phones, you see the difference. Those mobile cameras are like 1/5th or 1/10th of the size a webcam could be, so there is LOTS of room for improvement, even in this price segment.

I am pretty sure we could easily get 4k recording with decent, artificial depth-of-field (LIDAR & all) under decent lighting conditions (which is why this can be done cheap) at a price point of 100 to 200$. Just nobody seems to be doing it.

And yeah, we definitely do stream 1080p over meetings, and most people are always like "Oh wow, what kind of webcam are you using?". That's for my years-old Logitech... The bitrates are pretty low, even for 4k. You can definitely have a couple of those over almost any current internet connection without issues.

My main gripe is depth-of-field. Add some LIDAR and I am happy. Apple webcam anyone?


I actually did run my Fuji camera as a webcam for a Q&A at work. 35mm f/1.4 with a ring light left me looking downright pretty if I do say so myself.

https://imgur.com/7CxnRLK


I use my Fuji X-T3 with a Viltrox 23mm f1.4 regularly as a webcam at work. During the first week there was always someone in meetings asking how I got that 'cinematic' look.


That's quite a difference, indeed.


The real limitation: Fuji’s software for Mac is not amazing. It works, but it’s not entirely reliable.


The Logitech Brio is probably the most "state of the art" webcam you can find (and it's a few years old). Decent quality 4k (or 1080 at buttery FPS) with all manner of gubbins to correct lighting - amount, flicker, hue, etc. Oh, and it does have depth sensing - with a separate lens that'll scan your face and log you into Windows (or out when you walk off).


> with separate lens that'll scan your face and log you into windows

And hence the problem. Now you have a camera that is dependent on custom drivers deeply entwined with your OS, which are a security risk. It also isn't likely to have usable drivers five years down the line.


It works like a normal webcam if you don't need face recognition login.


Mirrorless cameras cost a couple thousand dollars

No, they don't: https://www.adorama.com/ifjxt200sk.html

And that's a new, relatively expensive model.


"On Backorder Order now, your card will not be charged until it is ready to ship."

Generally, this means "a long time". I've been waiting 6 months for a recommended underwater camera, still nada.


there are £400~£600 mirrorless cameras that would be overkill for most people... to notice any difference you'd need pro-level lighting, and a fantastic internet connection - and then you can justify the high-end 4-digit $$$ camera

I can see the argument that the market for mid-range webcams must be pretty small, as the ~$100 range is ok for most people


I could buy a refurb or used Olympus E-M5 mark ii on ebay for $250, then pick up a lens and be in business.


What meeting software are you using that streams 1080p?


> You can’t buy a good webcam because the number of people willing to pay a lot of money for a high end webcam is very small.

And the subset that is not easily fooled by proclaimed premium versions of the same cheap junk is even smaller.


I don't know, a $400 Apple Cam™ might do the trick.


iCam. :)


And then there's folks at the mid end, who buy an $80 Razer Kiyo purely for the decent video with built-in ring light so you're never backlit, with a separate $99 microphone and $150 audio interface. Buying all the bits separately gets you what this author wants, just not in a single tidy package.


The price for audio stuff seems truly out-of-this-world compared to everything else.

I get microphones have a lot of design/testing that might justify it. But all the interfacing gear seems ridiculous.


Wait till you realize that they actually want to pay more for it, because the idea that it cost so much makes it sound better to them.


If you have niche requirements then expect to pay some premium for the gear because the development costs are amortized into the unit price. It’s that way for AV gear as well as many other niches.

If the market won’t bear the price then people or companies able or willing to bear the development risk will shrink, or in this case maybe never develop.


I'm not sure I follow: you consider $100 expensive for a microphone that sounds like you're standing next to the person instead of listening to them over the internet?

As for audio interfaces: if you don't want one, don't buy one; get the same mic as a USB version instead of XLR, and done.


I have this exact setup except I just bought the Samson Q2U microphone ($100ish with stand) which has USB output (in addition to XLR). I doubt I'd be able to tell the difference.


Also, as someone that recently upgraded their wifi router and runs a ping script to make sure my connection is always good, upgraded my mic for better sound quality, etc., only to have Zoom video calls with people on an iPhone with shit reception: fixing your end only does so much.


If you're a game streamer or other type of streamer and are willing to fork over the cash, it's probably easier to buy an $800~$1k Sony Handycam and USB-C capture device. The majority of game streamers don't really need that though, although they might fork it over because they already have HDMI capture cards with dual inputs.

But for the most part, most streamers don't care about their little picture in the bottom 1/4 to 1/8 of the screen. Good lighting in the filming room is much more important than the camera anyway.


In the middle it is everyone who has a phone that can take decent video.


I think there is a market for consumers who want a better webcam though. Especially in a post-covid world with more remote workers. I know several people who are unhappy with their laptop webcams for web conferences, but discover that external webcams aren't much better. And these people don't want an expensive mirrorless camera, they just want a webcam with comparable quality to a smartphone camera.


tbh I'm surprised that GoPro hasn't been exploding into this category. Granted, their cameras do a heck of a lot more than just webcams (... which is good for longer-term use), but they're small, cheaper than a mirrorless, wider angle than a normal mirrorless camera lens, and already have a stable market and existing users.


> A high end webcam would have to be cheap enough that the first group doesn’t mind spending a bit more, but not so expensive that the enthusiast target audience just decides to buy a full-featured mirror less camera instead.

This makes it sound like the two requirements are in contradiction but in both you're asking for the price to be lower.


> the number of people willing to pay a lot of money for a high end webcam is very small.

More kids today want to be “youtubers” than astronauts and almost all YouTube work is done off tower PCs.

There might not be enough of a market for all brands to make one, but I definitely think someone's missing a trick by not being the GoPro of streaming.


To make your list even more dire, it seems hard to compete with something like the Logitech BRIO at $200.


Just upgraded my webcam to a very cheap action cam (60 euros). Huge upgrade compared to the laptop camera, but still pretty bad with low lighting. Using a good action cam like the insta360 might be a good fit for someone looking for a better webcam.


I realized the same situation exists with mailboxes. At Home Depot/Amazon you can get a $20 mailbox or a $40 mailbox, which are both shitty. Or you can get a $300-900 mailbox which is very high quality.

I finally found a decent one on Wayfair for $60.


Wait can you really use a Canon R5 as a webcam? Could be a business expense.


> We’re not streaming high-bitrate 1080p H.265 on our 5-person Zoom calls.

Right, even on my higher end webcam, on a connection with 200mbps up, Zoom shows my video (set to HD), as 720p15.


And I'd add that there is a middle ground of higher-end webcams that people can buy, which in my experience (as someone who would only do the camera thing if I had no other option) works fine even for recorded video.


This nails the issue with creating any kind of hardware. I’ve worked for hardware developers for most of my career, and have seen these types of challenges play out.

Frankly, hardware problems are hard. The laws of physics are non-negotiable, and doing what we assume to be simple tasks at scale can be crazy difficult. Solving a problem in a lab is not the same as making it at scale. That’s why you keep reading these stories about major lab discoveries that never seem to actually materialize in product.

We software developers get used to being able to write some code, or license a library, and the problem is solved. Scale isn’t really an issue for us. If we can do it on our laptop, then we can push out our product to millions, almost overnight.

Hardware is quite a different world. We need to tool up factories, train assemblers, set up suppliers and transportation networks, negotiate dozens of contracts; even for simple projects, resolve regulatory issues (which can be quite intimidating, depending on the industry and market), establish distribution channels and create packaging. Some of these are reflected in the software world, but at a much easier-to-manage level.

Also, with software, failure usually doesn’t cost as much as it does with hardware. It’s a lot easier to pick ourselves up, dust off our lapels, and try again. Iteration is relatively easy.

One of my favorite scenes: https://m.youtube.com/watch?v=YlVDGmjz7eM


This article sort of glosses over the actual cost of designing the hardware. You should be able to get 500 units made working through a factory. Though minimum qty for parts can be as high as 5000, usually factories can find most components in smaller qty for 2-5x the price, especially since they already have relationships with suppliers. At a $25-$35 bill of materials for something retailing at $250-$300, you're only looking at maybe $15k capital costs for the electronics and maybe another $15k for the molds (less if you can use one of the cheaper processes). $30k is pretty cheap as a capital expense if you're just talking about testing a market.

The bigger cost is going to be all the engineering that goes into a product like this. It would be easy to spend $100k in billable hours to bring something as complicated as a webcam to market. You've got component selection and schematic work, board layout, industrial design work, firmware. If you're lucky you can slap together the equivalent of an Arduino with an attached camera for initial prototypes, but that'll only get you so far, and can still take weeks of development just to have a proof of concept. Hardware development is not for the faint of heart.

And if you don't already have expertise in the webcam world, there's a good chance your product isn't going to actually be very good.
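To make that arithmetic concrete, a back-of-the-envelope sketch using only the figures quoted above (illustrative numbers, not a real factory quote):

    UNITS = 500
    BOM = 30               # $ per unit, midpoint of the $25-$35 range above
    MOLDS = 15_000         # enclosure tooling, roughly
    ENGINEERING = 100_000  # plausible billable hours to get something shippable

    electronics = UNITS * BOM        # ~ $15k for boards and components
    capital = electronics + MOLDS    # ~ $30k before any engineering time

    print(f"electronics ~ ${electronics:,}, molds ~ ${MOLDS:,}, capital ~ ${capital:,}")
    print(f"plus roughly ${ENGINEERING:,} of engineering to actually bring it to market")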


$30k is extremely optimistic for tooling costs for someones first hardware project. Maybe if they found a good JDM in China to work with.


> Though minimum qty for parts can be as high as 5000

It can be as high as 100 000. No stock anywhere.

> $30k is pretty cheap from a capital expense if you're just talking about testing a market.

Yep. That's what I thought before contacting SoC vendors.

> It would be easy to spend $100k in billable hours to bring something as complicated as a webcam to market.

I'm not in the USA, engineering hours are much cheaper in my country.


> It can be as high as 100 000. No stock anywhere.

There is always a way to obtain engineering samples/devkits. That's how the products in these 100,000 runs get to be.

> That's what I thought before contacting SoC vendors.

You don't buy from SoC vendors directly unless you're an enormous operation. Farnell, Mouser, DigiKey, Elfa.


> There is always a way to obtain engineering samples/devkits. That's how the products in these 100,000 runs get to be.

Of course there is. But what's the point for me to produce a prototype if I'll have to produce hundreds of thousands of units in production with no way to sell that much?

> You don't buy from SoC vendors directly unless you're an enormous operation. Farnell, Mouser, DigiKey, Elfa.

"Your search returned no results."


There are certainly parts that are unobtainium; that's why you need to have a good relationship with someone who has good relationships with parts distributors. A very important part of component selection is to make sure that you can obtain the parts in the quantity necessary. My general rule is, if I can't source it from Digikey, I probably can't make the product, but I tend to design products that do runs in the hundreds, maybe thousands of units, so I don't have the same access to components as someone doing production runs in the tens of thousands. That said, you may be surprised what your factory can get its hands on. And if you don't have a factory, then that might be an important step earlier than you might think. Sadly, I have no experience in how to build a factory relationship, and I'd imagine it's especially hard to find a factory with low expected volume. Yet another thing that makes hardware more expensive than you might think...


Are there any good guides anywhere you'd recommend for hardware startups - and how to reduce this $30K upfront cost? That is a lot of money.


The math I did was pretty basic. BOM*quantity + mold cost. Those are sort of the basic knobs you can fiddle with. Not all molds are equal. I was just talking with someone last week who was going to 3d print one of the housings for a component he's designing, since the quantities are low, but found a molding technique that's low quality but simple, and he said it was maybe an order of magnitude cheaper than anything I was familiar with, so learning about molding processes can allow you to design products with a cheaper upfront cost there. I think he said this mold he was looking at was like on the order of hundreds of dollars, instead of thousands of dollars that I typically expect.

BOM reduction can be tricky. Lowering your BOM makes more sense the larger your production runs, but when selecting components I tend to sort by price, then find the cheapest component that satisfies my needs. Of course, a more expensive component may allow you to skip other related circuitry, giving a cheaper overall build, so diving into datasheets is important.

Quantity is the other thing you have some control over, but lower-quantity batches have higher per-item costs. Setting up a pick and place for a single board takes the same amount of time as setting one up for a larger run. If your quantities are low enough, eventually setup fees are likely to start being a bigger percentage. Quantity also affects BOM costs. You can easily pay 2x as much for a component at low volumes, so you may not actually save as much as you think you will.

I agree with you that $30k isn't cheap. But if you look at things historically, we've finally reached a price point for hardware where you don't need to be a big business to even think about building consumer-quality electronics. Apple started with kits 40 years ago and took investor money pretty early, if I recall my history correctly. Today I expect it to be easier to bootstrap a hardware company since there's more infrastructure around bootstrapping. I've seen successful products that won't do a production run until they have a certain number of preorders. But hardware is just always going to be fundamentally more expensive than software.
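A tiny sketch of how those knobs interact - per-unit cost as BOM plus amortized fixed costs, with a low-volume component markup. The numbers are made up for illustration:

    def per_unit_cost(quantity, bom=30, mold=15_000, setup=2_000, low_volume_markup=2.0):
        # Components can easily cost ~2x at low volumes; fixed costs get amortized over the run.
        bom_each = bom * (low_volume_markup if quantity < 1000 else 1.0)
        fixed = mold + setup
        return bom_each + fixed / quantity

    for qty in (100, 500, 5000):
        print(f"{qty:>5} units -> ${per_unit_cost(qty):,.2f} per unit")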


Search for successful hardware Kickstarter postmortems (is it still called a post mortem if it succeeded?). I think Bunnie might have done one for the Novena.


> So there is a market gap between so-so webcams for $100-200 and a full-blown setup with a mirrorless camera...

Don't know where you can buy a ready-made one. However, if you don't mind DIY, try our free software project showmewebcam. It uses a Pi, the HQ camera sensor, and some software glue to make a USB webcam [1]. You'll have a wide selection of affordable lenses [2] and cases [3] that people have cooked up for their personal use. It's so much fun experimenting with them for different use cases.

Last time I commented here, there were criticisms about the quality of the lens that the Pi Foundation offers. We have discovered many other decent alternative lenses that help remedy the quality and distortion issues of the stock lens. An example of a good accumulation of knowledge, as we get more users and people paying more attention, is the Commonlands lens guide [4].

The software is very actively developed and we have a pretty supportive developer community. We try our best to follow good software engineering practices so we can maintain this project in the long run. The software is designed to be modular. It is easy to understand, build, and improve upon. I have a lot of fun building it - in fact, I just finished a 5-hour coding session to address comments on the pull requests I started earlier. I hope eventually it's not just another Pi project for fun; the firmware has the potential to make this solution more powerful than the best webcam that money can buy, just like OpenWrt is for routers.

I still have yet to record a decent demo video to demonstrate the power of the Picam, but there are just too many things and too little time to get it done. Oh well...

1. https://github.com/showmewebcam/showmewebcam

2. https://github.com/showmewebcam/showmewebcam/wiki/Lenses

3. https://github.com/showmewebcam/showmewebcam/wiki/Cases

4. https://commonlands.com/blogs/camera-engineering/raspi-video...


I've been using this for a while now and am really happy. Looking over the repo it seems there have been some software improvements, so I will upgrade mine now - saving the settings will be very useful, as it's too dark for me by default. Thank you!

Regarding the lens, I'm using the '6mm 3MP Wide Angle Lens for Raspberry Pi HQ Camera' and I have zero problems with it. I'm hardly moving in the way I use it as a simple webcam, so I have no problem with staying in focus, and I just don't get what a few people are saying about distortion; I must be blind. I looked up one of the recommended alternatives, the 'Fujinon HF9HA-1B 9mm 1.4', and I can only find it pre-owned on eBay, shipped from China for £70 (+ a load of tax I am sure). I will happily stick with my regular lens.


Happy to hear!

>I have no problem with staying in focus and I just don't get what a few people are saying about distortion, I must be blind.

If you point it to a piece of rectangular paper you'll see the distortion. In the commonlands review link above, Max pointed that out with a picture as well.


Perhaps my head and torso are already slightly distorted and the lens is doing me a corrective favour. Either way, still happy!


For a few weeks now I've been doing something similar, using a Pi + v2 camera module instead of the HQ¹. This gives me an affordable (and hackable, to boot) webcam of surprisingly decent quality with 60fps at 720p, which most consumer webcams can't do at all, since apparently no one cares about framerates (I have niche reasons for caring).

¹ It's also over a network with uv4l because the v2 720p 60fps mode isn't supported by the uvc gadget stuff. It's a bit of a shame, but I haven't had problems with the network transfer. The biggest issue with all this is that my Zero will occasionally overheat and the video will start freezing up, and I don't have much of a solution besides "don't put it in any kind of case".


Sony recently added support for webcam-over-usb to a lot of their cameras, apparently. And they are known for video quality and good lenses. So that’s $700-800 for a very good camera (new) that also works as webcam.

https://support.d-imaging.sony.co.jp/app/webcam/en/download/


Sadly, the software is quite terrible. Last I checked, it only did 720p. Even an entry-level Sony at that $700-800 price range outclasses what the software can do by a long shot.

So there's no way to make use of high-end Sony gear for mind-blowing webcam quality, since the software is a huge bottleneck. Very disappointing and a missed opportunity (not to mention Sony was months behind Canikon and others with their software release).


Canon's software has the same limitation in my case; people say it's because of the USB 2.0 port on the camera (750D).


Worse than that, actually—720p is 1280×720, but this gets you 1024×576. (I have an α6100.)


So close, but annoyingly my older ILCE-6000 didn't make the cut. Thanks for the information.


Yes, my 5100 (which is basically a more compact and software-limited 6000) is also not included, but it seems to be a hardware limitation -- e.g. live view over USB is also not supported on these cameras, and live view over wifi is rather crappy.


I know they are getting long in the tooth but I wish Sony would do something with the NEX cameras. Latest firmware release was v1.03 released a year or two after they released the camera. Feels like Sony has completely abandoned them.


Nobody updates their old products nowadays. At least they haven’t abandoned the app support for those cameras and continue to expand the APS-C line.


Same for Nikon and Canon, works reasonably well. None of the software supports audio, so you need a different mic (e.g. built into your laptop, external USB or on headphones).


Webcams suck because until recently video calls sucked.

> but his camera can't output a live stream into HDMI

Buy different camera.

Sony a6000/Sigma 1.4 16mm lens plus Camlink looks like this: https://twitter.com/feliciaday/status/1268379532823683072

https://www.twitch.tv/videos/834329198?t=00h09m50s

Pretty much every time new person streams/zooms with Felicia they ask about the setup to replicate it :)


> until recently video calls sucked.

As far as I can tell your answer is the only one that makes sense of all the comments.

Most grandparents would pay $ to be able to connect well with the grandkids remotely. Most employees would pay $$ to look professional.

But buying a good webcam wouldn't solve their broadband issues.


>> but his camera can't output a live stream into HDMI

> Buy different camera.

He did :)


When this pandemic fuss was just getting started in January, I decided to buy a webcam, just in case I got sick and had to work from home. It was a Logitech C310 HD [0] for like $40 including delivery. When we went fully WFH in April, people in work chats started asking what kind of super nice camera I have.

Our company issues MacBooks to workers. It turned out that the MacBook's internal webcam is so crappy that even a cheap 720p Logitech looks "HD" compared to it. The Logi also has quite a nice mic.

Another thing: I set up my workplace in the bay window, facing the windows, so I got plenty of light in the summer. I noticed that I got the best picture in the morning and around sunset, when light is abundant.

So, the takeaway is: get yourself not a great but just a decent webcam, and figure out the lighting. And for god's sake, do not buy one of those clown-looking mics to hang in front of you; you don't need it at all.

[0] https://www.logitech.com/en-us/products/webcams/c310-hd-webc...


I have a Logitech C310 from 2012 ($32 back then) and it's horrendously terrible. The white balance is almost always way off; people have asked me how I got the sepia-tone filter! I've also had issues with the microphone drivers distorting the sound (Windows and Linux). Maybe they've improved the hardware. I really like the narrow 60-degree FOV of the unit compared to my higher-end C615 (which has much better video quality).


I'm seconding that Logitech has a pretty darn good lineup. I have a C910 and it is quite good, even in fairly low light. I would complain more that webcams don't generally give you fine-grained control over exposure, sharpness and resolution. Logitech did, until they decided to update their software to be more "friendly". The camera is still solid, but it's frustrating to see companies do this to themselves.



That's strange. Maybe local high-demand/low-supply peak? It's still $42 in Russia: https://market.yandex.ru/product--veb-kamera-logitech-hd-web...


Yep, Logitech makes good ones. Recently upgraded C920 to StreamCam, both are awesome. The reason for the upgrade, neighbors borrowed my C920 for remote schooling of their kid due to the pandemic.


A year ago, you could easily buy a good webcam for very little. The Logitech C920 is, to my eyes, superior to the MacBook Pro webcam, and offered at least semi-decent audio quality. Pre-pandemic, it could sometimes be had for £25 (~$35).

Post-pandemic, however, webcam prices seem to have at least tripled (if not quadrupled). As I type, the C920 is going for £140. I can understand this as a supply/demand thing, but I'm surprised that things haven't started to level out yet.


The Logitech C920 is a design from 9 years ago, with a sensor to match. You're missing out on a lot of sensor improvements over the years that give things like better low light performance - handy when using a camera for video indoors. It's also, unfortunately, close to the best webcam out there. Apple really don't seem to have brought the camera improvements from their iPhone range to their MacBooks, apparently their built-in webcams are as bad as everyone else's.


Agreed. Sorry, I misread the original post -- I thought they were using current MBP cameras (rather than mics) as a benchmark. C920 is miles ahead of MacBook cameras, but as you say, nowhere near iPhone level of quality.


C920 is still only mediocre IMO. It's ok if you sit completely still but it doesn't handle motion at all.


So what - everyone else buys a Sony or an m4/3 camera and a capture card. You don't need to handle that much motion in the typical videoconference anyway - and even then, what does "handling motion" mean? I don't have a problem with my C900 (or whatever it is), and the only problem with a very old QuickCam STX seems to be the lack of a driver..


I can't say the same about my experience with (other people's) webcams.

From all the video calls I had this year, people either used

* their notebook's built-in webcam that's complete garbage

* any other webcam, and their image is clear as day, or at least video quality is limited by lighting or bandwidth.


Seconded, to a degree. It seems most people's workspaces are just very dimly lit. Of course, modern hypersensitive sensors from DSLRs are immune to this.

I use a relatively simple and old Logitech (C720 I believe) as webcam, and when I am in calls, I point my desk light straight at my face from just atop the webcam. This alone increases the image quality by an order of magnitude. If I would do more videoconferencing (currently only an hour or so per day, mostly with my own team) I would probably invest in a better light setup and a microphone, before I would think about upgrading my camera. Most small cheapish sensors do absolutely fine, given enough photons to work with.


Oh, lighting is definitely the largest real-life factor.

I definitely cut my colleagues some slack here since many literally had to set up a home office out of nowhere.

Some had to go to the basement (and CFL lights + webcams = blergh), others just had to cram themselves into whatever corner of the house (thus also badly lit).


Yeah, and I certainly don't want to hassle anyone who hasn't set up a perfect video studio for calls in their house. On the other hand, after 9 months I do sometimes feel a bit of frustration for the folks who are still perfectly backlit from a window rather than hanging some fabric over it.


I've had compliments about how good my (work) Surface Book 2's camera is. Surface Pro 4 seems pretty good too.


Yeah a co-worker with a Surface Book (not sure the exact model) has by-far the best video quality I've seen on work calls. In general, I actually find built-in laptop cams to look much better than the typical external webcam. They're still not great, but entry-level external webcams are just atrocious.


I use https://www.elgato.com/epoccam on my iPhone and for a few dollars it is far superior to every webcam I've tried in my life.


Similarly, and multi-platform, I recommend DroidCam (https://www.dev47apps.com/). The interface could use some polish, but it works well connecting Android/iOS phones to Mac, Windows and Linux.


What's your lag with Droidcam?

My experience was a delay of half a second, making it unusable for live conferencing.


I use DroidCamX too. I don't have any noticeable delay, but I have a fairly solid wireless setup (UniFi AP in the same room), and most of the time I just use the USB cable anyway so I can also keep the phone charging.


I've used it wired via USB to Windows, and it was usable for live conferencing.


Similarly, I've used the 'NeuralCam Live' app that also turns an iphone into a webcam. It's free to use, with extra paid options too. The biggest downside is the lack of microphone support, which seems an odd omission.

I think the article is missing the obvious solution: if you can't buy a standalone webcam with as good image/sound quality as a phone, then use/make software that lets you use the phone hardware, there's no need to try to build your own camera.


I recently switched from DroidCam to Filmic Pro (~$20), and it's pretty good.

Filmic Pro supports clean HDMI out from the rear facing cameras on iPhones and some Android phones. You do need a capture card, but if you have a Camlink or the no-name $20 USB capture card that EposVox did a positive review on, you're good to go.

On Android (I have a P20 Pro), I use a USB-C HDMI dongle meant for a Nintendo Switch, but it works with my phone plus it lets me charge the phone while using it.

Filmic also has a "remote" app (~$3, iirc), so if you have an extra mobile device you can control the camera app remotely (since the phone you're using as a camera has its screen pointing away from you).


That's a cool app! Thanks for mentioning it.

I did not realize that Corsair owns Elgato.


Their key light is also a good way to improve lighting.


Worth noting that this won’t work with FaceTime.


Good way to burn your battery.


The trick to great picture quality is to use good lighting and drop the ISO on the video as low as possible. I've been using Reincubate Camo for my iPhone along with Key Light Air Lights. The quality is better than any webcam. Most webcams crank up ISO so they can work in the dark and that is what ruins picture quality.


This cannot be overstated. While I agree with many of the camera recommendations in this thread the most important factor is enough light.

Even the worst webcams go from absolute trash to something usable if you add better lighting to your subject.

And once people start thinking about lighting, they can move away from the window so it doesn't act as a powerful backlight. Face the window instead and the webcam will produce much better results.


I just want something with some depth of field.


Given my limited understanding of photography that is not really possible with small sensor sizes. Phones can achieve that using computational photography, but I don't think it's possible using optics on webcams.


Yes, optically shallow depth of field is only possible with reasonably big sensors and a wide-open lens.

Reportedly, Huawei P40 Pro has an IMX700 sensor with the 1/1.33" diagonal. I didn't run the numbers for it, but it is big enough that with a bright lens it will produce bokeh.


Mobile phone cameras do this with a ToF (time of flight) sensor and a software trick. I have yet to find a webcam that can do this.


I think you're 100% correct (minus some weird lab-grade setup or software-based DoF).


What about a GoPro? They are quite sensitive to good light conditions, but otherwise the lens and resolution are very good - not DSLR grade, but a lot better than a webcam or phone. The fisheye can be disabled. Audio is also decent, maybe not studio-grade indoors, but when moving in the wind outdoors it's a lot better than phones or other non-professional gear. Lots of mounting gadgets and tripods are available. The price is just above 300€ for last year's 8 Black model, or sub-200€ for used or older models. Then you can use it for other fun stuff as well.


The article doesn't really go into a lot of depth about which Logitech cameras they looked at. I've got the Brio (https://www.logitech.com/en-gb/product/brio) and it seems pretty good to me.

If you are looking to use an iPhone camera for streaming/Video conferencing camo (https://reincubate.com/camo/) works well.


> The article doesn't really go into a lot of depth about which Logitech cameras they looked at.

All of them. Brio is better, but still has much worse quality than any top smartphone.


Depends what you're looking for I guess. The brio's been fine for me for video conferencing and even the odd conference talk.


Hard to justify the ($200) price. You can literally buy an entire smartphone with two cameras for the cost of a Brio.

It is pretty clear that there's no competition in the webcam market, and that the offerings have stagnated for years.


The Brio's price isn't purely about camera quality; a decent chunk of it was the integration with Windows Hello. Of course, that doesn't matter if you're not using Windows, but the market it seems designed to cater to is Windows users.


I think the author lost me when they said that a good webcam has to have "decent sound quality" and used the MacBook as an example.

I have literally never seen anyone suggest using your webcam's/laptop's mic except as a last resort.


Recent MacBook Pros (2020 13" M1 and 2019+ 16" Intel based) have incredible microphones for laptop mics [1]. Older 13" Pros and Airs have decent mics too, but not as good as a dedicated USB microphone.

[1] https://www.youtube.com/watch?t=237&v=CmMOJTs7Pu8


The MacBook Pro mic is literally much better than most webcam mics right now.


My point is that for webcam audio you want to use a headset, a USB mic, or a real sound card and an XLR mic.

Using a laptop/webcam mic is the fourth/fifth best option - it's a bit like playing Association Football (Soccer) in League Two.


The premise of this article is nonsense. Get a Logitech Brio and a Blue Yeti boom mic and you'll be so far ahead of the average Zoomer that people will make comments on how good your picture looks and sound sounds. If you want to take it up a notch still, mix it all through OBS and use a stream deck to control screen share transitions. You'll look like a wizard after some practise.


This is what I do (Brio and Blue Yeti USB mic) and I am happy with it. My company locks down laptops so I use no non Windows software but the approved and centrally managed Teams and Zoom. I also have an $80 variable warmth panel light mounted to a desk stand about 8" behind, above and slightly to the side of the web cam. Useful as the lighting changes throughout the day.

I like the mute button directly on the mic.


> So there is a market gap between so-so webcams for $100-200 and a full-blown setup with a mirrorless camera, an external mic and lighting panels that will cost almost a grand or two, if you're so inclined.

Nope. You can get a mirrorless camera set up (with mic and lights) for around $500 to $600. There are plenty of 1" compacts as well in that price range.


Can you mention specific models that would work well in this use case?


Sony's A6000 works perfectly well, but I recommend you also grab an NP-FW50 dummy battery, as the camera will drain its battery over time even when plugged in via USB.

If you want better sound, grab the A6600 which can also sport the XLR-K2M microphone adapter.

Regarding HDMI capture: the Blackmagic DeckLink card series works just fine in a Thunderbolt case (at work I run a 2x A6000 system + external HDMI input).


> Blackmagic DeckLink

And if you wanna save some cash, any $20-25 hdmi to usb capture card will work just fine. It won't give you uncompressed video, but you won't need it for this use case.
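These dongles enumerate as ordinary UVC devices, so on Linux you can sanity-check one before pointing Zoom or OBS at it. A minimal sketch (the device node is an example - yours may differ):

  v4l2-ctl --list-devices
  v4l2-ctl -d /dev/video0 --list-formats-ext   # cheap dongles typically offer MJPEG at 1080p30 plus raw YUYV at lower resolutions
  ffplay -f v4l2 -input_format mjpeg -video_size 1920x1080 /dev/video0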


I've been bitten in the ass lots of times by these things. Random crashes, overheating, dodgy connectors that don't tolerate even the slightest movement... thanks but no thanks; the cheapest DeckLink clocks in at $150. The downside is you'll also need a Thunderbolt PCIe case, but eh. It Just Works (tm).


I am looking for an improvement in convenience over mirrorless, even if it costs the same.

A webcam can be more compact because it does not need half the stuff that's in a mirrorless camera.



If you have the guts to switch to a Raspberry Pi (or follow this guide: https://www.youtube.com/watch?v=8fcbP7lEdzY), the Raspberry Pi HQ camera is really good!


If you have a Canon DSLR camera, you can download their webcam utility to plug it in via USB cable. This is handy because you don't need an HDMI capture card! The only problem I have now is I need an A/C adapter for my camera, as the battery only lasts about 1.5 hours.

I found that using my standard zoom lens is fine, the 50mm portrait lens looks better for pictures but that is overkill for a Zoom meeting! Any DSLR camera will blow a webcam out of the water even in low lighting.


As a note, for older Canon DSLRs that aren't supported you can pick up a cheap capture card on ebay for $15 and an AC adapter for a similar price. Using Magic Lantern you can get clean HDMI out and do any cropping needed in a tool like OBS. Works wonders. So much so I'm told my video is too clear at least once a week.


Absolutely! Also, webcams are barely developing - e.g. the Logitech C270 is 10 years old and is still selling for €30-40, almost the same price as in 2010, I think.

Compare that to phones or cameras from 2010. I really think the webcam industry is a cartel (not to say a mafia).


Owner of a Logitech Brio 4K here. The quality is terrible just like this article mentions and I would have happily spent the money elsewhere if I had options.

I used to have a tiny Canon digital camera in 2003 that had better quality than this.


It's terrible out of the box. With a firmware update and bit of tweaking it's not bad.

Under Linux, you can use guvcview to play with the settings to your liking. Here's an example command line for Zoom purposes:

v4l2-ctl --set-fmt-video=width=1280,height=720,pixelformat=MJPG --set-parm 30 --set-ctrl=contrast=32 --set-ctrl=sharpness=176 --set-ctrl=zoom_absolute=133 --set-ctrl=tilt_absolute=-36000 -d /dev/v4l/by-id/usb-046d_Logitech_BRIO_*-video-index0

For Windows, you can check out the following video:

https://www.youtube.com/watch?v=dwXR27wLhoE

For webcams, the only other decent option is the Avermedia PW513, which only came out a month ago.


I've been trying to find PW513, but it doesn't seem to exist anywhere last time I checked. Is it out for sale already?


PW513 is available on US sites (e.g. amazon.com/bestbuy.com), but not globally yet.

See also my comment here:

https://news.ycombinator.com/item?id=25506629


My Brio is fine if you are not sitting in darkness or expecting pixel-perfect 4K.


Brio is retailing at ~$330 in my country.


I don't think it's fair to put high quality sound as a requirement for a webcam.

A webcam sitting 2-3 feet from you is not going to sound good, especially not with a dinky microphone that has to fit in a device that's 3 inches wide.

Even $2,000 cameras have poor sound quality and pretty much everyone who does semi-serious recording (Youtube, etc.) will use an external microphone.

Honestly, the classic Logitech c920 is pretty good and it's $60-90 depending on current demand. You can record at 1080p at 30 fps and as long as you have a decent light source ($20 worth of LED lights) the picture quality is quite good.


Sound is really important (more important than video, tbh). Fortunately, good microphones are plentiful; even a cheap Boya PVM1000 for like $50 works great (takes batteries, connects directly to a 3.5mm jack).

Or you can go for a Zoom H-series recorder; those work via USB and have great built-in mics, too.


Reading this article actually got me sold on the idea of a phone-holder+software in lieu of a good web cam.

I, like many people, have an old smartphone lying around.

My Samsung S7 has a better camera (front and back) and a better mic than any webcam I've owned.

Getting a nice holder and a reliable app that could spit webcam picture out of that phone would be great.


I have used phones with obs.ninja to stream the camera output to OBS over the local network and then send that to conference applications with OBS's virtual camera plugin.

It sounds janky (and it kinda is) but compared to some of the client/server app pairs I'd messed with earlier, it works more reliably. One plus is that I don't need to install anything on my phone since it just uses WebRTC in the browser to send the cam/mic feeds to OBS.

Of course, even once you get it set up on a tripod and pointed at your face, you will want to turn down screen brightness and keep it plugged in. It works and it works well, but your phone may get warm and drain some battery if you are using it for hours at a time. In the end I consider it a good backup option (used it when I forgot my webcam at another location) but I prefer the ease of a dedicated camera if I have the option.


I'm not sure that the product the author describes trying to make is really any better than the top-of-the-line Logitech units that sell for $200. I have seen them in action, and unless you're pixel-peeping at 4k, their results are at least on par with a mid-range phone. Without investing $$$ in computational photography to clean up the images, you're not going to get flagship smartphone quality photos out of cellphone-sized assemblies, no matter how high quality they are.

A more interesting take on this space would be to try and offer the same quality as the mirrorless camera setup in a smaller and cheaper package. Upscale to a 1-inch or APS-C chip and a wide-aperture lens, and you could get the blurred-out background effect naturally, plus an overall higher quality image. Without the need for a display, and by offloading autofocusing compute to the computer, you'd be able to reduce the per-unit cost substantially below a standalone camera.

Another interesting product for this space would be a teleprompter-style monitor-camera pair that effectively lets you place the camera behind the screen. This allows the user to naturally maintain eye contact with the camera, which can marginally improve connection and trust with the human on the other side of the call. For, say, sales teams that may have to rely more and more on video calls, compared to in-person meetings, this marginal improvement could be worth a lot of money.


I cannot understand why I can't use an iPhone as a webcam for a MacBook without third-party tools (e.g. EpocCam) that require the installation of drivers.

To me this is an obvious thing to do since Apple is so proud of its ecosystem, especially in the time of corona when all the DSLR/mirrorless camera manufacturers are releasing software to do just that.


> you also can buy a great XLR microphone for $100, but then you’ll need an external sound card with phantom power supply.

This is only true if you buy a condenser microphone. A dynamic microphone will work reasonably well with your onboard sound card if you have an XLR-to-1/8" TRS cable.


But with a dynamic mic, you will likely need a preamp - the inbuilt ones on laptops and most devices aren't enough.

You can get battery powered condenser mics, though, and output unbalanced XLR to 3.5mm, works fine.


There are plenty of camera modules available.[1][2] Those are both 4K, USB interface and take a big external lens. That's just from a quick look at Google. You could probably find more options, even without running around Huaqiangbei. You're not limited to phone-sized optics.

[1] http://www.camera-module.com/product/8mpcameramodule/cmt-4k-...

[2] https://www.e-consystems.com/4k-camera-module.asp


Honestly, the Logitech c920 is still the 'good enough' option. It's cheap and produces a decent image.

However, the raw video can sometimes be a bit unflattering.

I would recommend using OBS as an intermediary between the webcam and other applications like Zoom or Skype. OBS has a virtual camera feature so that the output looks like just another webcam.

The reason you would do this is that OBS has powerful video filters you can overlay on top of the video feed, so you can apply color correction and alter the brightness before sending it out. Even small changes can have a large impact on the video image quality.

I found this much more preferable to messing around with Logitech's unwieldy software.


This is what I do for both work-related (and as of this year, recreational) video conferences. At the start of the pandemic, I only had an older webcam which maxed out at 1280x720, didn't have much in the way of onboard hardware processing capability, and looked generally cruddy. Set up OBS initially so I could use the basic color correction and then moved on to playing around with chromakey backgrounds (sure beats the software chroma in Zoom and Webex).

Once C920 came back in stock (at non-scalper prices) I did pick one up and it was a modest improvement. Overall performance is just better across the board. Only real annoyance is how I have to reconfigure the video options every time I start OBS since the cam likes to revert back to auto white balance/focus/exposure. Still, it takes less than a minute to set it back to where I want it.
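If you happen to be running this on Linux, one hedged workaround is to pin the controls with v4l2-ctl after the camera is plugged in. Control names vary by kernel version and the values below are only examples - list what your system actually exposes first:

  v4l2-ctl -d /dev/video0 -l    # list available controls and their current values
  v4l2-ctl -d /dev/video0 --set-ctrl=focus_auto=0 --set-ctrl=focus_absolute=0
  v4l2-ctl -d /dev/video0 --set-ctrl=white_balance_temperature_auto=0 --set-ctrl=white_balance_temperature=4600
  v4l2-ctl -d /dev/video0 --set-ctrl=exposure_auto=1 --set-ctrl=exposure_absolute=250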

For lighting, I have a couple of those clamp-on utility lights that are sold at just about any hardware store. Each one has a sheet of kitchen parchment paper clipped to the housing to act as a basic diffuser. Between whatever desk lamp or sunlight I have coming in, the two clamp lights let me set up a reasonable 3-point lighting and avoid any weird shadows.

For audio, I have an old Tascam USB audio interface with 1/4" and XLR inputs along with headphone monitor out. Hooked up an old Shure vocal mic on a stand and I get quality and a pickup pattern I just couldn't touch with a headset or desktop condenser mic.

A lot of this was only affordable because I had some of the stuff (mic, USB interface) laying around already, but that was sort of the takeaway. Make use of the best stuff you already own, consider what hacky solutions you might use to fill in some gaps, and only then spend more money on a better camera, lights, etc.


A good webcam, in my opinion, is a device that provides a decent sound quality (from the article).

Ehm, sure about that metric? Should a webcam not primarily offer good video? And why not have good audio by means of an external microphone in addition to a decent camera?

If that's the primary metric, the rest of the article is flawed.

IMHO the best setup has a dedicated camera and at least one dedicated microphone (Most people streaming use microphones or headsets, don't they? Same for radio-show/podcast hosts.).

Doesn't mean that a webcam can't have a decent microphone, but if you are going the custom route...


Anyone not using headphones is going to have echoing and feedback anyway. May as well just use a headset.


A high end Logitech with a Zeiss lens and auto focus is superior to any webcam you can buy. It'll also last for a long long time.

Of course a bigger sensor and a bigger lens (DSLR, Mirrorless, etc.) is something different.


They are mediocre at best, owners say. I know some people who have one they use as a secondary camera (for multi-angle setups), but they bought it first, got disappointed, and bought something else - that's the usual path.


I own an old Pro 9000 and it's still respectable. Only its FPS is a little low, but it was the king of the hill when I bought it.

Looking at it now, it seems they've discontinued the ones with Zeiss lenses. The ones with Zeiss lenses were both low-light and clarity monsters for their size.

Their highest model is Brio Pro with 3D sensing it seems. I need to see its performance. However Logi says it has a real glass lens so it shouldn't be a slouch.

Nevertheless, it's not fair to compare a lentil-sized sensor with a full-frame 35mm, last-gen mirrorless.


Logitech Zeiss is a webcam you buy.


You can do wonders with a modern sub-$400 camcorder that can do manual white balance, aperture priority, and provide clean hdmi output. Add a $20 tripod, a capture device, and you’ve got a webcam better than anything sold as such. Camcorders can have some inherent latency in their feeds and that’s an under-examined differentiator in camcorder reviews these days.

As to accessory webcams - there is definitely a market for high end webcams. The Logitech 920s or c or whatever was very highly rated and one of the first to disappear from shelves last spring along with their 4K model. I’m not saying it was objectively great equipment, just using it to illustrate that a webcam with high ratings will sell for a premium no problem. I don’t know about the margins for OEMs but I suspect there’s some room there when a $250 webcam is only marginally better than a $90 version.

I would easily pay $400-$700 for a “usb webcam” if it approached my iPhone camera’s quality and allowed me the control and capability of a cheap camcorder (think what a $250 Vixia comes with - manual white balance, optical zoom, adjustable frame rate, physical powered lens cover, IR remote control, tripod mountable, integrated confidence display - especially if you could leverage the USB connection to make the confidence display function as a teleprompter too). No question there’s a market for such a product. It would sell wildly in today’s market.


You can. I did: a Logitech BRIO 4k. I used it mainly as a document camera to deliver lectures from home. I should now probably avoid using fingerprint reader unlock systems, as you could probably pretty easily get my fingerprints from frames in the video (Mythbusters demonstrated even a printout can work). That will be a good enough webcam for years to come, though I expect my next DSLR upgrade to provide a whole other class of option at some point.

Relying on the mics built into a webcam seems pretty silly, though.


I suddenly needed several reasonably good quality webcams (I had to instrument my home lab for teaching electronics remotely, and wanted multiple views in some cases). NDI and OBS plus my pre-existing pile of old iPhones (a largish extended family's worth) have proven an admirable resource, especially since the screen lets me tell at a glance what the frame is. I even set up a jig to use an iPhone with my optical microscope.


I'm going a little tangential to the original post here, but I think the answer is: you can.

GoPro Hero action cams can be used as a good webcam [1], for instance.

They are definitely NOT cheap though - running $250 - $500 new depending on the models and about $100+ for used models on flea bay.

The latest GoPro models like the Hero8 Black or Hero9 Black, which have USB-C output ports instead of HDMI, actually make it easier to connect to a computer as a premium webcam: not only do those new models not require an HDMI capture device to convert the signal, the USB connection also slowly charges the GoPro battery! All you need to get it working is the GoPro with the latest firmware, the bundled USB-C to USB-A cable, and a desktop app (for macOS the app is about 10MB and super easy to use). Let me tell you, the video quality is so much better! (1080p instead of the MacBook Pro's 720p, and the camera sensor is obviously different too.)

https://gopro.com/en/us/news/how-to-use-gopro-for-webcam


This is a little off-topic, or at least tangential. Why do so many people insist on doing video chats? Regardless of camera quality, you're going to come across as a blocky and stuttery mess to most people anyway.

Audio-only has worked really well for my team for a few years now (my team is all remote, all the time), so it's always jarring when interfacing with the rest of the company and having people pop up with video.


They want to mimic the office as much as possible, that's really all the argument tends to boil down to.

Some will claim it is for charisma --> sorry, no amount of charisma is going to fix a bad cam and bad audio.

Some will claim it is more productive --> I really only need to hear your words or see your screenshare, your face moving in front of my screen is distracting.

Some will claim it is more natural --> Try looking at a whiteboard while everyone stares at you from the front. That's about what happens when you use videochat in important meetings. I don't think that's the way people used to have meetings.

Some will claim it is to read body language --> Don't depend on body language and stop assuming things from audio alone. We're adults. People will tell you when something is up.

The only argument I can't go against is bonding. But if you don't want to bond with your team past business-only, that argument is voided.

And I haven't named the cons yet. One of my pet peeves being people will start calling you over every single thing, make you repeat yourself often, misunderstand or even forget what was said in 10 minutes. You don't have that problem with asynchronous communication, and it also yanks the callee away from their flow.


> Why do so many people insist on doing video chats?

Control. They want to make sure the silent people in the meeting aren't doing something else or not actually present.

It's probably why it's particularly common with managers who run meetings that are neither effectively timeboxed, limited to the essential personnel as required, nor planned well to handle an agenda that requires a meeting rather than an email or exchange of emails to handle effectively.


www.kurokesu.com is quite interesting.

Sells sensors or kits to convert existing Logitech cameras, and filters/lenses you can add to them.

It's niche - but sits between an off the shelf webcam, and lashing a 'proper' camera to a USB socket.


I ended up using my Micro FourThirds mirrorless camera.

At first I used a Panasonic DMC-GH3. There is no official way to stream the image to a PC so I ended up using a HDMI capture box (costs around $50). The image quality with a nice prime lens is fantastic. You get a blurry background and some nice bokeh, something no Logitech camera can give you and it looks great. The camera and the capture box support up to FullHD.

I recently bought a better camera, the Panasonic DC-G9. Panasonic offers a (beta) webcam software for this camera, so you can use it as a 1280x720 webcam via USB 3.0. It also looks fantastic. The resolution is high enough! There is no audio though; for now I use a separate microphone.

If you care about your webcam's image quality, here's my main advice:

- Put the camera at eye height (using a small tripod)! This alone makes a big difference

- Put some light sources behind the camera and not behind the user

- If you want to use the virtual background feature of Zoom etc, use a real greenscreen behind you. They are very cheap and the quality of the virtual background with chroma keying is just so much better!


As for the design constraint that webcams have to be small to fit into laptop lids: I find the lid to be one of the places where I get the worst angle (both for lighting and in general), shooting up from under your chin. I have to sit back a foot or two to make it look right.

Instead, what if there was a detachable part in the base that contained the webcam? Then you could hang it on the top of the lid (the normal place), or set it on a table or higher up at any angle. I guess the next issue becomes powering it and getting the data across, but maybe a single USB cable solves both. I guess this would end up being more like a GoPro in the end, though.

I have had some pretty good webcams in the past (the 1080p logitechs were pretty good), but they were about $100 then. But I think there won't be something that good in a laptop because the cost will just never be worth it. I wouldn't pay for it, I'd rather spend that money on a camera that won't turn useless in a few years.


You are describing a normal external webcam.


Do we need good webcams anymore?

Take a few selfies with your phone -> a reasonable deepfake avatar that commodity 720p sensors can animate. Bonus: massive bandwidth savings. I think there was a 2m paper on such research. Isn't Apple already redirecting eye gaze to mimic staring at the webcam when you're staring at the screen?

Also, get a tripod and use an app that turns the phone into a webcam; that feature should be baked into the OS.


At my workplace, I’ve had effectively zero video calls since the pandemic began. We’re just using audio (occasionally someone will leave their camera on, but it’s uncommon).

Are video calls in other companies mandatory? More common than audio-only calls? Even at my previous job at a small tech startup, where the entire company was remote from day one, video chats were rare.


Same here, working from home since February, had exactly one video call, everything else was audio only.

Company size: ~1.2k employees, ~500-600 of those are developers


Video chats are more personal. We communicate a lot via body language.


Light is the reason good webcams are not built-in.

With still pictures you can expose for longer and/or use multiple low quality sensors to build up enough information for software to construct a higher quality image.

With video you cannot expose for longer, and software building a picture would need to be near real-time.

So problem #1 is light... you need to get a lot more light into the sensor, but the laptop is thin and optics are large (and as glass and light, they cannot be made much smaller).

I have a good set up, but the thing I tell people who want to replicate it: Buy lights before you buy a camera... your built-in cam may be sufficient if you are well lit. Or even cheaper, if you have a desk facing a wall... there's no light source, so turn it around or face a window.

Failing improved lighting fixing it... you need a good sensor and a lens that lets in a lot of light, especially if you are going to be in a poorly lit space.

Sony produced the absolutely perfect form factor for this... https://www.sony.co.uk/electronics/interchangeable-lens-came...

It's basically a sensor in a minimal body that you can mount a standard lens on and then place on a tripod (which opens up any mount / arm thing).

Why then is it not popular?

It came too early, and wasn't quite ready for always streaming video... both the software in this older revision and the hardware have issues (cooling the sensor for constant video is the problem here as it wasn't anticipated to be used to stream for 10+ hours per day and struggles beyond 30 minutes as I understand). Additionally it only has a micro-USB connection, no HDMI or USB-C... and it really needs a single USB-C that can both power and handle the video. Things it doesn't need: battery or flash - but I guess keeping them means it plausibly could be a stills camera if needed.

If that format included the equivalent of the Elgato CamLink HDMI to USB-C device and priced around $300 then I think it would be a wild success. The BOM suggests this is possible.

Sony already have a lot of the pro-streaming market along with Panasonic, but if Sony made a specialist product for home streaming then I would buy it instantly... as it would liberate my a7rii from the job.


To everyone who says 'good webcam must be expensive' and 'you need light first'.

No, and no. I am testing Reincubate Camo with my iPhone XS, and the image (together with computed portrait blur) is absolutely awesome, and several times better than any webcam test I've seen. However, the setup is tedious.

The first Kickstarter to take an upper-midrange smartphone sensor (<$40 BOM) and connect it to an at least middling ISP and USB controller easily wins $100+ of my cash, and will be the best thing to happen to the webcam market since its conception.

I have no idea how can Logitech sell atrocities like Brio and call them premium webcams.

And I am baffled by everyone who insists that a hack made from a DSLR, an Elgato, and a bag of cables to keep it powered is the only possible solution. To me, this feels like the famous Dropbox comment: https://news.ycombinator.com/item?id=9224


Cameras are expensive. The low-end to mid-tier ones that we are used to in our devices are subsidised. Just like they cut the price of meat and bread to below cost to get you into the supermarket.

So it’s not a linear price scale because it is distorted at the lower end. There is no motivation for our benefactors to subsidise cameras beyond a certain quality so once you go beyond that you are paying full whack and the economics fall apart.

This principle applies pretty much right across the smart tech ecosystem.

The nice thing, though, is that the stuff they use in the subsidised products is dirt cheap, so if you've got a little bit of smarts you can build your own gear for about a tenth of the price. You will have to sacrifice some of the doo-dahs and bells and whistles, but if what you're going for is something purely functional, you will be fine. Use the money saved to buy an expensive streaming cam setup. Do the subsidising yourself.


I had the exact same problem when starting the Summon The JSON project. The best comment I read was that it's the most niche project the person had ever heard of. And in reality there is a rather small market for it. Moreover, the price has to be higher than what card games normally cost, because when you don't order millions of decks it is more expensive to manufacture, and the fixed costs are also split across a smaller number of orders. This makes it very hard for any innovation to reach the market. It really depends on whether you can find a way to manufacture in small quantities (which increases the cost) and whether you have a market for it. Fortunately I found an ambitious manufacturer that prints on demand - but that's just a card deck. Hopefully there will be more manufacturers of electronic parts willing to support startups with small order numbers; the lack of them holds the market back.


For me the most immediately noticeable difference between a webcam and a dSLR setup is FOV.

Every webcam uses a lens with a DSLR-equivalent focal length of 12mm to 24mm, which fits the user's upper torso naturally when placed atop a display, but a lens that wide technically isn't suited for portraits. Portraits are usually taken from a distance with narrower-FOV lenses [1], typically 50 to 90mm in 135 equivalents, or 45 to 25 degrees of diagonal FOV.

So if I were going to make a "good" webcam, the first step would be to give it a "longer" lens - trying a cheap clip-on "macro zoom" lens on an existing webcam, etc.

1: https://www.gettyimages.com/detail/news-photo/president-dona...


There is software that lets you use your DSLR or mirrorless digital camera as a webcam (though I haven’t been able to get it to work with certain video conferencing programs):

Cascable Pro Webcam (works with many brands) https://cascable.se/pro-webcam/

Canon EOS Webcam Utility https://www.usa.canon.com/internet/portal/us/home/support/se...

Nikon Webcam Utility https://www.nikonusa.com/en/learn-and-explore/webcam-utility...


You could just use a piece of software like IVCam in conjunction with a very good camera you already own...the one on your phone.

Sure, you'll need to rig up some kind of stand (like maybe a hands-free mount?) but once you do that you'll have a webcam with a pretty decent camera (Since you can use the rear camera)


In the typical videoconference meeting everything would work better if each participant is represented by a nicely-chosen photograph.

The client software can light up the borders to designate which audio streams are active and pop an icon of a hand when someone wants to get attention.

Save the video for chatting with your friends and family.


Webex does this. It’s alright, although there is always a bit of a feeling that non-camming participants have something to hide - be it a messy living-room, a pyjama, or a side-gig as property developer. This said, forcing webcam usage feels impolite here (UK), particularly (I’m told) for women, so it seems like, in moderately large companies, everyone just defaults to audio-only 99% of the time.


This is the mid-range webcam you might be looking for: https://www.amazon.com/AVerMedia-Streamer-Wide-Angle-Webcam-...

Going by some independent YouTube reviews, this currently seems to be the best webcam for streaming you can get.

Though it's focused on the streaming use case and has some design decisions which are quite good for that use case, those decisions make it sub-par (for the price) for some other use cases.

EDIT: And like other people mentioned, sub-optimal lighting is often a major problem, so investing in any more expensive webcam doesn't make that much sense if you don't also invest in improving the lighting situation.


It seems there are some problems with this camera.

I read/saw some positive reviews including at least one which I don't think was buyable.

Which opens interesting questions, like whether the review sample is the same as the normally sold version, or whether I misjudged that person :-(


I'm surprised nobody has mentioned that a huge hurdle to wanting a nicer webcam is that recording at higher quality doesn't matter. Zoom is limited to 1080p, and even then you'll rarely actually get it due to various limitations. Zoom for our (large) company limits streaming to around 640x320.


You will very much see the difference between high quality and low quality optics/sensors, even at fairly low resolution.


Zoom should be the company that builds this product. Also, the camera you tried to build pretty much exists: http://www.marshall-usa.com/cameras/CV503-U3/


It might help if there was a standard webcam physical attachment (same idea as the camera screw for a tripod). Then each monitor could have it, so adding a new webcam would not add a bunch of clutter to your desk, and it would avoid the expense of buying a webcam holder.


I went down this rabbit hole at the start of the pandemic and it's more complex. The thing that makes for a good webcam is a great lens. The sensor and body of the camera are mostly commodity and contribute at most $50-$100. A good lens with shallow depth of field will cost $500-$700. So the best-case current solution would be $700+ for something that will look as good as a high-quality DSLR without the body complexity.

I designed a prototype of this type of camera and I think there's a market but it's difficult to defend. You end up competing with Logitech or Canon, GoPro & Apple enter the market.

It's more likely that cell-phone quality cameras start being added to laptops and the shallow depth of field is done in software.


Why is your lens 3x the price of a pro-am autofocus prime or zoomable DSLR lens?


What about camcorders? Is there any model that also acts as a webcam? They're pretty cheap nowadays.


I'm no expert, but there do seem to be cameras in this space. The Logitech Brio and Aluratek Live Pro 4K in the $200 range, for example. The article doesn't "name names", so I can't tell if these are the models the author didn't like.


It's interesting that he calls out Fuji explicitly for not allowing webcam use, as Fuji did add webcam features for both their X-T3 and X-T4 models, on both Windows and Mac, back in September of 2020: https://fujifilm-x.com/en-us/support/download/software/x-web...

The reason they did not backport it to the X-T1 and X-T2, my guess, is that the T1 and T2 use a much older (circa 2014) CPU/software, whereas the T3/T4 use the same family of ~2018-2020 CPU/software/sensor and were already rolling out a new major firmware update for the X-T3.


This article does seem to skirt around the obvious, that you _can_ buy a decent webcam, if you pay Logitech the money they want. Given how often they've been sold out this year, people are buying them.

Both the C920[1] and the StreamCam[2] work well.

1 - https://www.logitech.com/en-gb/products/webcams/c920-pro-hd-...

2 - https://www.logitech.com/en-gb/products/webcams/streamcam.96...


C920 is literally 8 years old.


I encountered this problem in college when I worked on some custom motion-capture systems. My priority then was not sound, light, or necessarily video quality, but speed. Every prosumer webcam I tested from Best Buy or Amazon had horrible motion blur.

My solution was to buy a bunch of used Playstation Eye cameras, developed by Sony for Kinect-like games on the PS3. Sony had designed it from the ground up for low-latency motion-capture, and it performed remarkably well in this application. The PS3 Eye is still popular today with DIY mocap systems [0].

[0] http://ipisoft.com/support/


My entry-level Canon DSLR (EOS 550D) can stream over USB, although it's not "plug and play", at least not on Linux. To make Chrome detect it as a webcam I need to go through a gphoto2 -> ffmpeg -> v4l2loopback pipeline.
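For reference, a minimal sketch of that pipeline, assuming v4l2loopback is installed and that /dev/video2 is the loopback device it creates (the device number depends on what video devices already exist):

  # create a virtual video device that browsers can see
  sudo modprobe v4l2loopback exclusive_caps=1 card_label="DSLR"
  # pull live view frames from the camera and feed them into the loopback device
  gphoto2 --stdout --capture-movie | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/video2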


USB resolution is poorer to start with, and older cameras cannot run for 30+ minutes without overheating.


You can save yourself some trouble by using a HDMI capture device. I use one with my Sony a7ii with good results.


which means laaaag


I've got the Brio and I'm not impressed by the quality, but then again, it works pretty well out of the box for video calls or general recordings, so I guess it's ok. Not sure if I could have gone much cheaper and still had something that's better than the built-in camera of my laptop.

My first choice would have been a real camera (my mirrorless is a bit too old for 4K and for being usable as a webcam over USB) and a real microphone (this was a good investment), and adding an extra EUR 200+ to such a setup would be nicer than the Brio, but when I travel more I'll be happy to have it.


As pointed out by others, a cheap HDMI to USB converter unlocks many possibilities. I use a Sony RX100 on a tripod plugged into a $20 eBay dongle. Works flawlessly cross platform, no drawbacks except there is no mic.


Overall, this is a pretty good article, and I've always found the supply chain difficulties and sourcing to be a bit of a black art.

My only criticism is: if anyone is going to do this, please just focus on the camera. Forget some neural processing chip or microphones - just get the video right. A 4K webcam that does nothing but video has value in and of itself, but if you don't nail that, all the bells and whistles don't matter.

'camera with an ugly protruding lens as a thickest part of the phone'

Never understood this attitude: give me a decent camera; I don't care how much it protrudes unless it stabs people.


I'm really frustrated with the enormous field of view of most webcams. Many are 90+ degrees. My old 2012 Logitech C310 is 60 degrees. I want to show myself in the view, not my entire room/house. I finally found a cheap AUSDOM AW635 on eBay, which is 60 degrees. Image quality is acceptable for $20, and it's manual focus, which took a moment to dial in but is nice since the focus doesn't jump around during a call.

I'd buy a higher end webcam if they had adjustable FOV (without software, since the software doesn't work across all OS/programs).


At the particular market segment they seem to be aiming for - the enthusiast middle ground - the cost of extra capability is not competing with higher/lower priced cameras.

It is competing with environment control on price. If I can change lighting with a $10 lamp or two and clean up the space behind me then why do I need "low light capability" and shallow depth of field?

When this middle ground is marketed, it should be against the convenience of NOT having to arrange your surroundings "Just so".


Has anyone investigated buying a top tier android phone from like 2015 or so and then just making that a permanent webcam? Surely that would be relatively cheap and high quality.


I've seen that a few times on /r/battlestations recently. Most of the posters say the quality can't be beat but it's a pain to get set up initially and building a good enough mount for it can be difficult as well.


I wonder why it is so hard to find a webcam with a USB 3.0 interface that can do something better than 1080p 30fps YUV. I understand that in video call applications the video stream is compressed to hell anyway, but I am just trying to make my computer able to take photos better than a 20-year-old point and shoot. Now I am using an HDMI capture card with my mirrorless camera. It may seem overkill, but the picture quality definitely blows any "webcam" out of the water.
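If the capture card shows up as a UVC device on Linux, grabbing a still or a short clip from it is straightforward. A rough sketch (the device node and formats are examples - check what your card actually offers first):

  v4l2-ctl -d /dev/video0 --list-formats-ext
  # grab a single JPEG frame from the capture card
  ffmpeg -f v4l2 -input_format mjpeg -video_size 1920x1080 -i /dev/video0 -frames:v 1 still.jpg
  # or record a clip
  ffmpeg -f v4l2 -input_format mjpeg -video_size 1920x1080 -framerate 30 -i /dev/video0 -c:v libx264 clip.mkv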


Buy a Canon M50 and use it as a web cam. You can use this guide: https://www.youtube.com/watch?v=eISFCeuiCik

What we need in high technology is a website to manifest our needs and let companies know about underserved use cases. Yes, it's pathetic that Logitech and Apple don't have compelling products in this field but, sadly, it is what it is.


Regarding microphone arrays, is there a good description somewhere online which explains the mathematics of picking out the sound from one place using a mic array?


I didn’t find any reference but the algorithm is approximately this:

1. Assume there is one point source of sound which dominates everything else

2. Input two frames of samples, s and t, and compute cross correlation: a measure of how similar they are for a given delay between samples

  n(k) := sum(s(i) t(i+k)) / sqrt((s•s)(t•t))
3. Find the k which maximises n(k)

4. Do some beamforming to improve the signal to noise ratio. This sounds complicated but basically you just shift t by k samples and average it with s.

I’m not really sure how this works if k changes over time, but I guess something could be done to change it in a more continuous way.

I also don’t really know how it works with more than two microphones or if you know the geometry of the array.


I'm not sure you can't, but you might be looking in the wrong market space. There's an entire industry of "machine vision" that has very high end video cameras and lenses. And then there's even a step beyond that level, to "scientific" imaging components. You can choose your interface: USB, firewire, etc. Many of these cameras come with API's so you can write your own software for them.


> It's easy to buy some "random" sensor that was around for years and most certainly has "meh" quality by today's standards, but we need a good one, with good dynamic range, HDR capabilities and low-light performance.

So maybe it's a matter of time until these "good" sensors reach the open market, can be bought in smaller quantities and become commoditised. Then, we may see a good upmarket webcam.


The sensors have been around for ages; nothing stops you from using the same sensors that are in phone cameras.

But you have to build a chip to control the damn thing over USB. I could not find any USB 3 webcam except the Logitech Brio.


Does anyone remember iSight? For the time the quality was great, and it was also really nice looking on top of a monitor. It may have been a high-water-mark.


The quality was great for the time: 640×480-pixel VGA resolution

https://en.wikipedia.org/wiki/ISight


Wasn't there an article here on HN like a month ago about how to set up a Raspberry Pi as a USB camera? With everything in it (software on GitHub, BOM, wiring/PCB). I can't find it now.

The price would be around $200 sans good audio - not that it matters anyway, since in a conference I always ask for headsets. Also, the setup was so good that the RPi would be recognized as a USB camera by another RPi directly - plug'n'play.


One I've been using recently is a Kinect v2; the image is a decent 1080p but with OBS I'm able to give a way better green screen effect than the one built into Zoom. Has the added bonuses of being a 3D scanner, and semi-functional full body tracker for VR. Cost £70 including the PC cable, which seems to be the going rate.

A Pi Zero with the high def camera module is probably a decent shout too, although I've not used it.


Hmm...I only have a v1 from the days of playing around with interactive projection stuff. For a while it seemed like the v2 was expensive and/or hard to get with the cable. Also have an early Intel RealSense camera around somewhere but I don't know how well supported it would be. I know OBS has specific Kinect functionality built in so that sounds like it would be interesting to mess with.


100% just got it to mess around with. Cable for V2 seems to be about £20 nowadays (mine was official but most seem to not be, doubt it makes a difference), most stuff online seems to talk about it being multiples of that. The step up in tracking ability and image quality from Kinect V1 is absolutely gigantic. However, you can very easily get a v1 plus adapter for under £20 in total and it probably covers most of what people would want to play with.

Dunno about built in functionality, I used a plugin. MaxMSP has Kinect functionality via a plugin too, which I haven't dabbled in but looks pretty fun. There's a node library for v2 as well I believe.


> A lot of what makes a picture great comes from lighting the scene in a correct way.

IDK why the author thinks a "good" webcam should be able to do 4k, just to show one's face??

But yes, lighting is paramount.

I installed custom lighting in front of my desk and with a cheap Logitech webcam I get very good image quality.

Basic photographic lighting kits are cheap; I paid $80 seven years ago for 2 x (stand + light + bulb + umbrella) and it's still working perfectly.


> IDK why the author thinks a "good" webcam should be able to do 4k, just to show one's face??

I don't think it's very useful either. I just think it will be very hard to sell a webcam for $250-300 that can't do 4k.


The same way there is a protuberance to fit large lenses on the back of phones, there should be one on top of laptop screens. The exterior of the laptop wouldn't change, but we would be able to get a smartphone-like camera in a laptop. Add in features such as an LED strip to improve lighting, market the product to people who use their laptop mostly for video conferencing, and that surely would be a hit.


> Why can't you buy a good webcam?

Why can't you buy a good frozen dinner? Because frozen dinners are designed to be simple and affordable meals, just like how webcams are designed to be simple and affordable image capture devices.

If you're willing to spend more for a good alternative to a webcam, buy a capture card and a camera with support for live video output. You could even use the primary camera on a smartphone.


Tangent: I really wish that Apple would make it easier to use the iPhone camera as a webcam when using a Mac. It would be a killer feature since so many people are in the ecosystem and iPhone cameras completely outperform webcams these days. I tried EpocCam but I wasn't completely satisfied with it. This seems like a missed opportunity for Apple, given their seamless-integration mantra.


There’s an app called EpocCam that lets you turn the iPhone into a webcam. I was disappointed with the results. The colors get washed out easily, especially in strong direct light. It’s nothing like the image quality you get when taking a video (which was what I expected).


Same, I tried it and was disappointed in the results. Nobody is better positioned to wirelessly integrate the iPhone camera with a Mac than Apple is, and it would encourage folks to stay in the Apple ecosystem.


Check out Camo (https://reincubate.com/camo/). I also had problems with EpocCam and switched earlier this year, using an old iPhone 6, and have been really happy. $40/yr isn't super cheap, but it's easier and higher quality than me buying a webcam.


If you're looking for better lighting, I recommend this lamp (or similar design): https://www.amazon.com/PHIVE-Architect-Bright-Drafting-Brigh...

I have it placed directly above my web cam. Since it's a line of LEDs, it doesn't cast hard shadows.


Seems like what we are all after is a good software mod that turns your phone into a good-quality, USB-connected camera. I have tried some, but they create a cumbersome web-streaming service that adds delay and compression overhead.

The sound would not be great, granted, but that can be easily fixed with a separate microphone connected to the PC or the phone itself, a lavalier microphone with Bluetooth etc.


I used Logitech C922. The image quality was really bad. Now using Sony A6000 + Elgato HD60 S plus. It's an expensive setup but works great.


My simple tip is to buy a $15-20 wired lapel/lavalier mic from Amazon and use it instead of whatever crappy audio your laptop or monitor or webcam offers. It picks up your voice so much more clearly and makes a world of difference. You can still use the speaker outputs or whatever you prefer, but the mic takes it to a better level of quality for people listening to you speak.


If you want really good sound quality, you should be using separate audio gear. No on-camera mic is going to perform well due to various cost and physics based limitations. Look to what podcasters use, and buy that for the audio half. You can put a better mic into a webcam, but it'll always have background noise issues compared to a mic that's closer to your mouth.


I recently found that old smart phones make better dash cams than the purpose built dash cams sold in stores. I turned a two year old Samsung J3 into a dash cam. The phone cost 90 bucks two years ago when it was new. The dash cam app was 4.99 and the windshield mount was 6 bucks. The video is way better quality and there are more options/settings to play with.


> The simplest thing is to sit in a room so the window is in front of you and the Sun isn’t shining into a lens from behind your shoulder.

Unfortunately, this is a rather bad idea from a workplace ergonomics point of view. You really don't wanna be looking at a screen that's in front of a window (i.e. parallel to the window), unless you like excessive eye strain.


There is actually a good webcam now. It's AverMedia PW513. It has a Sony Exmor 4K sensor, and generally favorable reviews.


Hardware looks good, but the software/firmware might not be great.

In this review, there are issues with the colours and the auto exposure.

https://youtu.be/wO-H3tChhQ4?t=210

Also it is unknown how well it works on Linux.


Review with actual footage made with this webcam https://www.youtube.com/watch?v=H5MfenW7EqA

Wow, it sure does look like a quality webcam; the only thing someone might miss is a custom lens with fancy bokeh.

Edit: Watched to the end, it does fake 1080@60 by upscaling 720 :( So it's still a good choice if (1) you don't need 60 Hz and (2) you don't mind being scammed by a Taiwanese company.


Also, so-called “gamer” hardware. Like GPUs for scientific computing and high-quality, actively cooled WiFi routers, it's a weird and kind of tedious world to navigate, but quality hardware is there if you know where to look.

https://www.razer.com/streaming-cameras/razer-kiyo/RZ19-0232...


>1080 resolution at 30fps

A lens the size of a needle tip and example images that look like ass.


Wow it's been many years since I last heard about AverMedia. I used to have an AverMedia AverTV capture card back in WinXP times, and it worked really well with composite video input from my Handycam lol.


>Even if I think that 4k is not needed, I’m pretty sure it’s really a requirement to have in an upmarket webcam, as it will be really hard to sell a pricey webcam that can only do 1080p.

4K is absolutely not needed. You won't be streaming 4K from your home or average office space anyway. Our pipes are just not that fat yet in the upstream direction.
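
(Very rough back-of-the-envelope in Python, using ballpark encoder bitrates rather than measured numbers:)

  # Rough assumption: ~6 Mbit/s for decent 1080p30 H.264, ~35 Mbit/s for 4K30,
  # compared against a fairly typical home upstream link.
  bitrates_mbps = {"1080p30": 6, "2160p30": 35}
  upload_mbps = 10

  for res, mbps in bitrates_mbps.items():
      verdict = "fits" if mbps < upload_mbps else "does not fit"
      print(f"{res}: ~{mbps} Mbit/s -> {verdict} in a {upload_mbps} Mbit/s uplink")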


Would you buy a webcam with 1080p only (I promise it's good though!) for $250-$300?


Assuming it's good, why not? Even 720p video from a DSLR will look far better than 4K video from a Logitech Brio. Resolution is almost meaningless when doing video calls. Most people are in environments with poor lighting, so they need bigger sensors and improved lighting long before 4K or even 1080p resolutions.


With all features described, yes, I would.

And please, can I add another one for $50? I can easily connect it to my decent-hardware-but-shitty-camera midrange smartphone and use it instead of the built-in camera.


Well, the Android site says that it supports UVC cameras over USB, so presumably you get this feature for free!


Similarly, why can't you buy a good speakerphone?

It seems like all of the professional quality stuff is not amenable to just plugging in at home. I have a Jabra speakermic thing that's passable, but not as good as something in a real conference room.

I, unfortunately, cannot wear something on or in my ears all day, or a high-quality audio setup would be easy.


I've been very happy with this model: https://www.logitech.com/en-us/products/webcams/c930e-busine...

It has excellent optics, and a piezo zoom. Worth every cent.


I thought I'd mention that you can convert a Wyze cam to a webcam with this firmware update.

https://wyzelabs.zendesk.com/hc/en-us/articles/360041605111-...


The answer is because it’s a pretty narrow market. Anyone wanting good quality has purchased a camera (or has one), external mic, and cobbled it together themselves.

Or they buy a laptop with a decent camera and add a mic.

So your market is the niche between those groups, and those who make do with their laptop.

Not a large enough opportunity to pursue, I don’t think.


I really wish a webcam could be embedded inside the screen (like the under-display front cams that are just appearing on mobiles). Being able to actually look someone in the eye instead of staring down would be great.

I was even thinking of making something like a teleprompter with a semi-transparent mirror, but it becomes too bulky :)


The author mentions a Fuji X-T1 and the fact that you can't use it as a webcam, but fails to mention that it's quite old (2014) and that it's the only X-Tx model not supported by Fuji's new X Webcam software, which turns your not-so-old camera into an actual high quality webcam.


What about using an action cam? Picture quality is quite good these days even for the cheaper Chinese ones. Audio might not be too bad either. As others have said, I'm not sure whether audio from the webcam is so important. It will be relatively far away, which might not be ideal for sound quality.


This camera seems to check all of the boxes for what he was looking for. https://thorbroadcast.com/product/4k-hdmi-and-usb-e-ptz-comp...


Bought a Logitech Brio 4k Streaming before Covid, because I wanted to try out YouTube and using my DSLR was a pain.

It's €200 (pre-lockdown) and is just a webcam on steroids.

Now I'm glad I didn't sell it. Almost everyone comments on the image quality - especially when I turn on my 4000 lm daylight Philips Tornado.


It always baffles me that high-end webcams also usually come with bad microphones built-in. If you're dropping $200+ on a fancy high-end webcam for either streaming or high-quality video calls, you're probably also going to be dropping $60+ on a decent stand-alone microphone.


Some friends use ipads as their webcams, I've switched to a Nest something hub. It's nice to make video calls on a different device, as my laptop doesn't run its fans at crazy speeds and noise. The video and audio quality from the nest hub max whatsizname is great.


I don't understand the desire here.

The classic Logitech C920 ($80) is an excellent true 1080p webcam. [1] It's been around for years.

Who needs better than 1080p from a webcam?

You can manually adjust focus, exposure, color temperature, shutter speed, zoom, everything you need.
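
(If you'd rather script those controls than use a settings app, they're reachable from code too. A hedged sketch with OpenCV; which properties actually take effect depends on the OS backend and driver, and the values below are illustrative, not recommendations.)

  # Sketch: setting UVC controls on a webcam via OpenCV.
  import cv2

  cap = cv2.VideoCapture(0)                    # assume the webcam is device 0
  cap.set(cv2.CAP_PROP_AUTOFOCUS, 0)           # turn off autofocus
  cap.set(cv2.CAP_PROP_FOCUS, 30)              # fixed focus (driver-specific scale)
  cap.set(cv2.CAP_PROP_AUTO_WB, 0)             # turn off auto white balance
  cap.set(cv2.CAP_PROP_WB_TEMPERATURE, 4600)   # colour temperature in kelvin
  cap.set(cv2.CAP_PROP_EXPOSURE, -6)           # exposure scale is also driver-specific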

Once you've got that, everything else is lighting. Most people drastically underestimate how much of photo/video quality is actually about lighting, as opposed to the camera or lens.

As for audio, if you're investing in the higher quality of an external webcam anyways, then you'll never want a microphone in the webcam. The placement is inherently bad.

You'll want either a lapel mic for normal interviews or meetings, or a podcasting mic on a desk tripod for more serious stuff. A mounted shotgun mic is a kind of in-between option too if you don't want it visible.

[1] https://www.logitech.com/en-us/products/webcams/c920-pro-hd-...


I'm surprised that Apple in 2020 still ships magnificent M1-powered Macbooks with crappy webcams.


I'm pretty happy with my Logitech C920 HD - never had any issues with picture or sound quality.


This sounds very much like the market that the Webex Desk Camera is targeting, but I don't know if the pricing has been announced.

(https://www.webex.com/desk-camera-sign-up.html)


You can buy one, just don't look for a webcam. What you need is a good camera and lenses. I'm using a Sony A6400 with decent lenses and the video quality is amazing, but this setup probably costs much more than most people would be willing to spend on a webcam.


This is a crowd funded deluxe webcam from earlier this year: https://www.indiegogo.com/projects/obsbot-tiny-ai-powered-pt...


My solution: Download DroidCam, use your phone's camera as a webcam. Night and day difference versus awful built-in laptop cameras, only takes a few minutes to pair the first time.

Granted, the positioning is a bit annoying. If I did it more often, I'd get a tripod.


I've got DroidCam X and it's very flaky with random freezes and disconnects, where I have to relaunch the PC-side app.


I do this with Camo. Same idea. An old SE 1st-Gen and Velcro on the back of one of my monitors solves the positioning issue.


There's a significant added lag though, with audio and lips out of sync.


The M1 Macbooks have a decent webcam, if you can stand machine learning "interference" in your video stream that is.

Beyond that, the best option is to use a recent phone's rear camera on a stand (you can use a separate account and invite the phone to your meeting).


There's a software solution to this: https://reincubate.com/camo/

Use an old phone for your webcam. The quality is terrific, up to 4k. You just need a phone stand.


I have tried to use modern phones as dedicated webcams. The power consumption of constant video capture, compression, and streaming is greater than the phone's ability to charge. Once the battery is depleted the phone can't operate directly off the charger and shuts off. And that's only if it didn't already shut off due to overheating first.


This is a phenomenal product, I've been using an old iPhone 6s and the quality is very good.

The license subscription is more than I'd prefer to pay, but considering every passable webcam has been sold out since April, it's much cheaper than some disposable USB webcam with horrible quality.


> you also can buy a great XLR microphone for $100, but then you’ll need an external sound card with phantom power supply

XLR does not imply a need for phantom power. Some XLR microphones need it, some don't. There are XLR-equipped audio cards for as little as $80.


> [to use blah mic] you’ll need an external sound card with phantom power supply. A decent one can easily cost you $200

I got a decent one for $40. The more expensive ones generally have more in/outputs and such, they're not better in sound quality.


Can't believe nobody has mentioned just getting an IP camera for $50: 4K video, IR and low-light sensitive, typically multiple codecs and streams simultaneously. Seriously, why waste time with consumer-class USB cameras?
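
(The only extra step is pulling the stream into desktop software; most of these cameras speak RTSP. A quick sketch with OpenCV - the URL, credentials and path below are made-up placeholders and vary by vendor. From there, something like OBS's virtual camera can re-expose it to video call apps.)

  # Sketch: previewing an IP camera's RTSP stream on the desktop with OpenCV.
  import cv2

  cap = cv2.VideoCapture("rtsp://user:pass@192.168.1.50:554/stream1")  # placeholder URL
  while True:
      ok, frame = cap.read()
      if not ok:
          break
      cv2.imshow("ip-cam", frame)            # preview window
      if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
          break
  cap.release()
  cv2.destroyAllWindows()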


Probably the best midrange webcam you could get would be a good mount for your smartphone and some decent software. I use my phone and a headset for zoom meetings - I use software to add the phone as a soft webcam to my PC.


I finally ended up buying a cheap (€15) HDMI-USB capture device for my DSLR and the quality is abysmal.

Sure, Canon released USB support for it, but all the USB cameras I've tried end up using a lot of CPU. With the capture card, zero problems.


Do you mean “abysmal”? Your first sentence says that a capture device was a bad solution, while the next paragraph seems to imply that it was a good solution.


This is precisely why I have been using my Pixel 4 together with Iriun. Works great and the picture quality is better than everyone else on video calls who are using either Logitech webcams or the built-in MBP camera.


My understanding is that the BRIO Ultra HD Pro Business Webcam, which I got, is the best webcam on the market; if you want to move up, the Sony ZV-1. If anyone thinks otherwise, please let me know before I buy the Sony ZV-1 :)


Considering the iPhone has a good camera, has anyone used https://www.elgato.com/en/epoccam ?


I use a USB HDMI capture device and my iPhone over the Apple Lightning/HDMI adaptor (the knockoff I tried first didn't work well). This along with Filmic Pro, and the quality is amazing.


Got a Sony hybrid camera that's basically plug and play after you install some software. It does 4K, and even though most video conf tools only allow 720p, the image comes out stunning.


Even Facebook Portal TV sucks: slow, low-res, awful tracking, even worse UX. And it sees the reflections from the TV in my art on the wall and tries to track them. So maybe start there.


I use a 10-year-old iPhone 4S with OBS Studio and a $15 app to enable streaming from the phone's camera. It didn't cost me much in used condition, and it has excellent image quality.


Sounds to me like the question is why you can't buy a _high-end_ streaming camera with an integrated microphone for an _affordable price_.

I use the Logitech BRIO Ultra HD Webcam [1] along with headphones and am very, very happy with it for all my needs, from calls to recording training sessions and demos.

[1] https://www.amazon.com/Logitech-BRIO-Conferencing-Recording-...


Brio is better than most webcams, still worse than all top smartphones.


A lot of people use iPads or phones for this which have better cameras. If you are using a computer or laptop you can just install an app and use your phone as a webcam.


You can buy a good webcam; it's just unfortunate that to get decent image quality you need a good camera. Here's a tip: you can skimp on the HDMI-to-USB dongle, since it's digital - 15 dollars should do it (Elgato seems to be a vlogger conspiracy). I found the best results from a cheapish DSLR (that I already had) with a fast fixed macro lens at f/3.5. Best value is probably a mirrorless with a decent lens. When you're in front of a big monitor there's less need for ring lights.


I bought a DSLR with the plan of using it as a webcam most of the time. It's worked out well, but I am sick of the Zoom wave; can't wait till it dies down.


A used GoPro is probably fine for people who want more than a junk camera but less than some pro level stuff.

I wonder how easy it is to get the OS to treat a GoPro as a webcam?


Don't the Logitech Brio Ultra HD pro or Streamcams (if 1080p is enough) roughly tick the boxes the author wanted? Am I missing something?


A software solution to use your phone's camera and mic as a webcam is ManyCam, it's a great desktop and phone app for streaming.


I gave up and just use https://reincubate.com/camo


Shameless plug: we're making good software-defined cameras at huddly.com. "Plain" UVC, so it works on your favorite OS.


50% OT: I was wondering why hooking cameras into the police system is reserved for 1000+ euro cams + expensive private security firms


Is it possible to just use a phone as a webcam? The rear camera of recent phones should be much better than a standard webcam.


Yes, it's possible. You can find a lot of apps doing exactly this.


Maybe the cause is wealth inequality: The person in question doesn't have the money needed to implement his plan. Some people have the millions or hundreds of millions of dollars needed to do something like this, but they spend it on very dumb stuff. If more wealth was redistributed, then he could take this risk.

Instead we have clueless VCs investing their money and founders spending excessive time convincing VCs and risking their money. There's no more skin in the game.


I'd start with a cheap camera focused Android phone, they start at £200. Plenty of apps to turn them into IP webcams too.


This doesn't seem particularly insightful. The author managed to get in a couple of completely irrelevant adverts for his startup and his brother's twitch channel.

Save yourself some time - the titular question is answered (with the obvious, unsurprising answer) in the penultimate sentence: "you can't buy a good webcam because existing players seemingly don't think it's a good opportunity, and for newcomers the market is hard to enter."


I've been happy with the quality of the Logitech 4K Pro sold by Apple. Works great out of the box with linux.


I know the market for it is very small but it's practically impossible to find a webcam without a microphone.


PS3 Eye was a good webcam and there are plenty of them out there. Too bad nobody wrote good drivers for them.


I assumed everyone would be using a GoPro or something similar as their webcam if they needed a good webcam?


1) You can use your DSLR as a webcam. 2) You can use your mobile phone as a webcam.

Not sure why you would want to buy a webcam


Bought a Logitech C922 - what an image quality disaster. So much worse than my iMac Pro.

I now use a Sony ZV1 as a webcam.


Literally any DSLR will let you do this. It's what 90% of YouTubers and streamers use.


And here I was thinking you'd actually build one. Great article though.


I too wanted the article to be a lure into my own Kickstarter. Won't happen, though.


Or, you know, get a used Sony a6000, a capture card, and a microphone.


Every smartphone has a pretty good camera. Shouldn't there be an easy way to use an iPhone or Android phone as a webcam?


You most likely have a pretty good webcam already: it's the thing called a smartphone on your desk. Just hook it up to your computer.


I'm not an AV geek, but isn't it also because HD transfer is too high-bandwidth for USB? You'd need an HDMI capture card instead of a USB controller.
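
(Quick back-of-the-envelope for uncompressed YUV 4:2:2, i.e. 2 bytes per pixel, which is roughly what a webcam sends when it isn't compressing:)

  # Raw YUV 4:2:2 video bandwidth vs. USB generations.
  def mbit_per_s(width, height, fps, bytes_per_px=2):   # 4:2:2 = 2 bytes/pixel
      return width * height * bytes_per_px * fps * 8 / 1e6

  print(f"1080p30: {mbit_per_s(1920, 1080, 30):.0f} Mbit/s")   # ~995 Mbit/s
  print(f"2160p30: {mbit_per_s(3840, 2160, 30):.0f} Mbit/s")   # ~3981 Mbit/s
  # USB 2.0 tops out at 480 Mbit/s (less in practice), USB 3.0 at 5000 Mbit/s.

So raw 1080p30 already overflows USB 2.0, which is why USB 2.0 webcams ship MJPEG or H.264 instead of raw frames, while USB 3.0 has headroom even for raw 4K30 - the bottleneck is more the sensor and controller silicon than the bus.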


Perhaps buy a tripod for a phone instead?


Logitech Streamcam is pretty good.


Which webcam can this crowd recommend?

bonus question: Why do you recommend that particular model?


@Apple MacBook Pro


Test


The real, real reason you can't buy good webcams is because the market isn't a meritocracy. Make the best webcam in existence and there just isn't a robust, or honest, review community to let it shine.

Instead there are a billion "here's a list of things that we can get affiliate dollars for from Amazon, and we'll pretend we reviewed them" listings.

This problem has hit a lot of markets. Without a financial support for capable, credible, honest reviewers, the backbone of an industry falls apart. It becomes all about gimmicks.

So if you want a webcam, the real options now are an SLR (many of them support that use) or even a video camera. In those markets there is still a rigorous, quality-focused review ecosystem, so that new Nikon, Canon or Sony SLR or video camera had to shine.


> Make the best webcam in existence and there just isn't a robust, or honest, review community to let it shine. Instead there are a billion "here's a list of things that we can get affiliate dollars for from Amazon, and we'll pretend we reviewed them" listings.

Wirecutter?


Wirecutter just buys a popular brand name and says they like it.

They don't do any careful tests or write up details.


Have you ever read one of their reviews? That's exactly what they do.


Oh yeah, the webcams suck big time, even the expensive ones. Just get a DSLR over HDMI; it's 100x better, and used DSLRs are cheap these days. You can even get a Panasonic GH4/GH5, which is a real video camera.


Can you recommend any DSLR models, or what to look out for when buying used with the intention of only using it as a webcam?


You actually want a mirrorless camera, not a DSLR one. If you intend to only use it as a webcam, probably just buy Panasonic G7 or Canon M200 for $500.


Canon M200, Panasonic G7, Sony a5100 (oldest but can be bought for cheap and has good autofocus).


This seems like a good opportunity for Wyze. They have some great experience with cameras and have sold quite a few.


People are addicted to cheap garbage from China.


Take a modern phone, use an app and connect the phone as a webcam.

This way you save money, time and a blog post (:


If only you'd read the blog post, you would have saved yourself money, time, and a comment on HN.

He talks about phone-connected webcams using an app.


What article?


It is covered in the article



