Unfortunately, many smart TVs use 100 Mbps RJ45 ports, so WiFi is usually faster. Some TVs let you plug in a USB to RJ45 adapter, but most of the USB ports are only USB 2 speed, so the practical limit is around 300 Mbps. Some nicer ones have USB 3 ports, enabling 1 Gbps through an RJ45 adapter. I wish manufacturers would put at least gigabit (if not 2.5 Gbps) RJ45 ports on TVs.
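Back-of-the-envelope for that ~300 Mbps figure (the overhead factors below are assumed round numbers, not measurements of any particular adapter):

```python
# Rough estimate of usable throughput over a USB 2.0 Ethernet adapter.
usb2_signaling_bps = 480e6      # USB 2.0 high-speed signaling rate
usb_bulk_efficiency = 0.70      # assumed loss to USB framing, ACKs, scheduling
ethernet_ip_efficiency = 0.94   # assumed loss to Ethernet/IP/TCP headers

usable = usb2_signaling_bps * usb_bulk_efficiency * ethernet_ip_efficiency
print(f"~{usable / 1e6:.0f} Mbps usable")   # ≈ 316 Mbps, i.e. "about 300"
```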
I would never plug a smart TV into any network, but regardless of that - when you have an individual endpoint like a smart TV (or an Apple TV, Android device, etc.), what's the practical advantage of a 2.5 Gbit port on that device? You can't watch the movie/TV show any faster, so you're constrained to the data rate of the content you're consuming at ~1 to ~1.5x speed (if you're into that sort of thing).
Taking a 4k stream as an example, a compressed 4k stream is not going to exceed about 50Mbps, so even a 100Mbps data rate will have a 2x safety factor - and since you're wired, you'll actually get full use of that data rate, unlike Wi-Fi. You're not streaming uncompressed video, as that would require far more than 2.5Gbps, and if you want to upgrade to 8k video, you'll need a new TV anyway...
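Quick arithmetic behind those numbers (the frame rate and bit depth for the uncompressed case are illustrative assumptions):

```python
# Headroom of a 100 Mbps port over an assumed 50 Mbps compressed 4k stream.
stream_mbps = 50
port_mbps = 100
print(f"headroom: {port_mbps / stream_mbps:.1f}x")               # 2.0x

# Uncompressed 4k for comparison: 3840x2160, 60 fps, 10 bits/sample, 3 components.
uncompressed_bps = 3840 * 2160 * 60 * 10 * 3
print(f"uncompressed 4k60: ~{uncompressed_bps / 1e9:.1f} Gbps")  # ~14.9 Gbps
```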
I always wire my TV and other fixed infrastructure because, since they aren't moving, there's no advantage to a data link layer protocol whose defining feature is mobility. In addition, latency and jitter are always better on wire than wireless, and keeping these devices off wireless frees up precious airtime for the devices that do move. It's a win-win-win all around.
> Taking a 4k stream as an example, a compressed 4k stream is not going to exceed about 50Mbps, so even a 100Mbps data rate will have a 2x safety factor
That’s the average bitrate. Peaks can be much higher, especially on high-bitrate content and remuxes, and especially with the newer HD audio formats. Modern smart TVs have a pitifully short buffer, so you can run into problems. I did on my Sony, and switching to wifi solved it.
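To put numbers on that (everything here is assumed for illustration, not taken from any particular TV or title):

```python
# Why a short buffer stalls on peaks even when the average bitrate fits the link.
link_mbps = 100        # built-in 100 Mbps port
peak_mbps = 120        # assumed sustained peak of a high-bitrate remux
buffer_seconds = 4     # assumed playback buffer depth in the TV

buffered_mbit = buffer_seconds * peak_mbps   # video held in the buffer going into the peak
deficit_mbps = peak_mbps - link_mbps         # playback consumes this much more than the link delivers
print(f"stalls after ~{buffered_mbit / deficit_mbps:.0f} s of sustained peak")  # ~24 s
```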
The built-in port has DMA. USB-to-Ethernet adapters don’t and have to go through the CPU first. You’re stressing the CPU for bandwidth you aren’t actually using.