Been using Synergy and recommending it for many years, paid for it a long time ago as I thought it was worth supporting given how much use I've gotten from it.
Besides being open source, what are the advantages of Barrier? Worth the (probably small amount of) time to switch?
I paid for Synergy, but couldn't get it to work across all of the OSs I use. Synergy provided a refund, which was nice.
Barrier worked fine for my multi-OS use case (Windows, MacOS, various Linux distros).
For some reason, I had to restart Barrier periodically when I was on mesh wifi (by Vilo), but when I switched to a different wifi AP (Starlink's actually) Barrier worked consistently.
> Compatibility. We use more than one operating system and you probably do, too. Windows, OSX, Linux, FreeBSD... Barrier should "just work". We will also have our eye on Wayland when the time comes.
It's great. I run a desktop running Windows with WSL2 for Linux and a MacBook Pro beside it. With some combination of a shared mouse, keyboard, clipboard, SSH, and SSHFS / NFS the experience is pretty seamless.
I'm on Windows 10/11 here, and I tried Barrier, having last used Synergy so long ago that I was still on CRTs. I could not get Barrier configured; it just didn't work. I ended up just purchasing Synergy for $30, and it worked immediately. Now I have my 15.6" laptop with an 11" Surface Go sitting next to it, and I can copy/paste between them and use the Surface like a second monitor, except it's actually a second computer with its own processing power, which is pretty nice.
Definitely felt like a pretty low cost. I've worked with their support briefly also which helped me resolve an issue I was having (been so long ago now I honestly don't remember what it was). I'm literally typing on my Windows PC from my Mac laptop right now. I used to use it a lot for testing Windows browsers (less needed now that IE is dead and Edge is Chromium).
The only issue I really have now is once in a while I can't use my keyboard on the client computer(s), and that is because my server is currently my work Mac, and the terminal will occasionally request secure lock (probably when I use sudo) and as part of that, the keyboard is not allowed to be accessed by programs like Synergy until the lock is released. But Mac gets kinda buggy once in a while and doesn't properly release the lock.
Most of the time it just works though, and I've gotten way more than my $30 worth at this point.
I've given a squint-eyed sideways glance at the forks ever since part of the forking contingent tried to cite the author's username "debauchee" as a reason it should be forked, because the name could be seen as offensive.
"The name debauchee is not professional and should not be associated to a popular project."
From the github thread linked above. How disgraceful.
I helped my friend set this up back in the day so he could have WoW running on one computer and the wiki open on the other. IIRC he didn't have enough RAM on his main computer, so running Firefox in the background tanked his FPS.
Anyone used Microsoft Garage's Mouse Without Borders and can provide some comparison based on experience/tips?
I have used Mouse Without Borders across work and personal Windows machines -- mostly works with occasional quirks. I think there is some latency, though, when mouse input is routed through another PC.
I use Barrier all the time, but it has a tendency to lock up on OSX and I have to use a physical kb/mouse switch to go fix it occasionally. (Or I could remote in with a program, I guess.)
I am surprised no one has mentioned the new Gnome sharing Extend feature[1]. It also uses RDP, but you enable it and it "just works," including input events from the remote device onto the virtual screen. I've used Synergy/Barrier/Waynergy for years, but this is a good way to add a screen anywhere that supports RDP (Android, ChromeOS, etc). My main reason to explore it is to extend to a nice high resolution OLED tablet. Latency is quite low, as long as you don't try to watch a high resolution video.
I feel like all the parts of Linux are there for an amazing desktop (it's already great); it's just that sometimes the discoverability/documentation/usability need a little push.
I've had trouble accessing Ubuntu over Gnome RDP sharing, except from a Windows machine. I can access Windows machines via RDP using e.g. Remmina, but accessing an Ubuntu host only seems to work for me from Windows regardless of the client I use.
Like, what client running on Ubuntu though? Remmina? I don't mean the RDP server setup, which is pretty self explanatory in settings. I weirdly can't make Ubuntu --> Ubuntu work, but with either host or client as a Windows machine I have no problem.
Anybody have tests of the latency? I haven't found anything better than Parsec that is free and open source. Parsec uses their own UDP protocol and makes use of hardware encoders/decoders.
Moonlight as a protocol is quite cool, made for streaming games in fact. Sunshine is an open source server implementation, and the client was open source to begin with, I think. Sunshine is multi-platform, and so is the Moonlight client. I used it to stream my Linux desktop (and games) to my Android phone.
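If you want to try that combo, a rough sketch looks like the following. Sunshine's web UI does default to port 47990, but the client flags below are from moonlight-embedded and differ between Moonlight clients, so treat them as assumptions and check your client's help output:

```
# Server (the machine whose desktop you want to stream):
sunshine                          # configure apps/credentials via https://localhost:47990

# Client (phone, laptop, Pi, ...), using moonlight-embedded as an example:
moonlight pair <server-ip>        # one-time pairing; enter the PIN in Sunshine's web UI
moonlight stream -app Desktop -1080 -fps 60 <server-ip>
```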
I also have good experience with Spice. I haven't found a way to set it up as a server, but Qemu uses it for all its VMs, and the performance is great, you can watch videos and everything, if you have the bandwidth.
For WiFi and slower networks, Moonlight was way more performant for me.
The problem with UDP for video is that, on a lossy network, you either end up sending a lot of key frames or reinventing TCP.
There is a trick for achieving low latency video with TCP: set SO_SNDBUF to the lowest possible value and do your own buffering. If your buffer grows too large, lower the bitrate and/or drop frames.
I'd argue that on a lossy network you will never have decent video (with decent latency) so for interactive sessions you don't really have to care about lossy network, it isn't worth the hassle anyway.
That is true; the network needs to be stable in any case. But keyframes produce "bursty" traffic, which increases latency compared to B-frames (and P-frames).
Ever since H.264 there has been a feature called intra-refresh, which lets you spread keyframe data over time, reducing the burstiness of the transmission (and therefore improving latency in most scenarios).
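As an illustration (not from the thread), this is roughly how you'd turn it on with ffmpeg and x264; the input/output names are placeholders and the exact parameters depend on your encoder:

```
ffmpeg -i input.mkv \
       -c:v libx264 -tune zerolatency \
       -x264-params "intra-refresh=1:keyint=120" \
       -f mpegts output.ts
# intra-refresh=1 replaces periodic IDR frames with a rolling column of intra blocks,
# so the bitrate stays flat instead of spiking on every keyframe.
```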
Could be a problem if you need to encode other things, since hardware encoders are artificially limited on NVIDIA cards and probably others. NVIDIA did increase the number of concurrent streams at the start of the pandemic, though, and the limit may also be partly imposed by patent licensing and not just NVIDIA's price discrimination.
When I used Parsec to control a machine on the other side of my room, the monitor and my client were not even one frame out of sync. The only downside is the heavy compression needed to achieve that; there is also no color accuracy.
Has FreeRDP gotten so much better? About 5 years ago, TigerVNC was much smoother than FreeRDP when I did the same thing. It almost felt like FreeRDP was just a translation layer on top of VNC.
As a protocol, RDP is way ahead of VNC. The latest versions support H.264 video encoding and AAC audio. With a beefy Windows server and FreeRDP as a client, I can stream 1080p YouTube from a remote desktop over the Internet at 60 FPS. RFX progressive encoding isn't bad either. The trouble with FreeRDP, at least the older versions, is that you have to explicitly enable these features. I tested VNC in this setup, and it was unusable, roughly one frame per second. Something about this multi-monitor setup just doesn't play well with the VNC software that I tested--x11vnc server and tigervnc client, and I have used those often in other scenarios.
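For what it's worth, explicitly enabling those features looks roughly like this with FreeRDP 2.x (the flag names shifted a bit in 3.x, and the host/user here are placeholders, so check `xfreerdp /help` for your version):

```
xfreerdp /v:remote-host /u:user \
         /gfx:AVC444 /network:auto /sound +fonts
# /gfx:AVC444 turns on the graphics pipeline with H.264, /sound redirects audio,
# and /network:auto lets FreeRDP pick appropriate quality settings.
```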
I feel that the VNC protocol gets a bad reputation because of bad implementations. There's really nothing in the protocol itself that holds it back.
What I like about VNC is that you can actually read the RFC and get a fairly good understanding of it in less than a day. I'm not sure the same can be said for RDP.
Microsoft has this page where you can access a bunch of PDF documents that describe RDP in some sense, but I've no idea where I would even start and/or what's even relevant for any given problem that you might want to solve.
When I had to choose my remote desktop solution, I tried all the FOSS ones, and they were a disaster, without exception.
I tried, I think, a couple of VNC-based solutions (since I tried the most common ones, TigerVNC was very likely one of them), and they were extremely slow - I can't remember exactly how slow, but something like 1/2 FPS.
This left me very perplexed, because on a 10 Mbit network, with 25-ms ping, a protocol must be very badly designed (or, to be more precise, very antiquated) to have such bad performance. So IMO, the bad reputation is justified.
Other FOSS solutions weren't better, in one way or another. The other one that worked out of the box was X2Go (based on NX 3), which was still very slow (and it seems to be abandoned nowadays).
Sadly, I had to move to a closed-source solution.
Closed-source solutions are so fast that sometimes I forget I'm on a remote machine. I'm very puzzled why there is such a large gap between open and closed source solutions.
Admittedly, because of VNC's age, there are many sub-optimal encoding methods available and many different implementations which might not be fully compatible with each other. So, maybe your chosen server has some good encoding methods available and maybe the client can support some good encoding methods, but the intersection of supported encodings might be bad.
Users of VNC often need to understand how to tweak their settings for the best result. The client should choose the "best" encoding automatically, but this cannot be fully relied upon. On a slow network (like yours), it's probably best to select "tight" encoding, allow lossy compression (jpeg) and turn down the quality a bit.
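For example, with a TigerVNC client that tuning would look something like the line below (the parameter names are TigerVNC's; other clients spell them differently, and the host/display are placeholders):

```
vncviewer -AutoSelect=0 -PreferredEncoding=Tight \
          -QualityLevel=4 -CompressLevel=6 remote-host:1
# Tight encoding with lossy JPEG at moderate quality trades image fidelity
# for much lower bandwidth on slow links.
```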
That is, assuming the implementation you're using doesn't support H.264; currently, there are not many that do.
At least some commercial remote desktop solutions are very strict about business usage - they don't allow it at all (last time I checked, TeamViewer was like that).
NoMachine is a bit looser: they allow occasional business usage, as long as it's not the core of the remote workflow.
It certainly has its place, but as a remote desktop solution it is the simplest (and stupidest) way that functionality could possibly be implemented. VNC is what you use if you want to copy whatever is on the screen of the remote system, RDP is what you use if you want your displays connected to a remote system.
RDP lets you use all of the client's monitors, printers, clipboard, audio, USB devices, etc. as though they were plugged in to the host. The clipboard sharing extends to files, so you can drag and drop a file from the client to the host via the RDP window. If you lock your workstation at work with a bunch of running programs, drive home, and RDP into it, all of the applications will be moved to your RDP session and rearranged to fit your display(s). The next day you log back in at work and they get moved back to the local displays.
Oh, and did I mention all of this works well even on crap connections? I have been at jobs where a remote branch has such shit internet that they RDP into a VM at the head office to browse the internet because RDP uses less bandwidth than the sites they are browsing.
Free VNC implementations often lack modern features that make the protocol usable. I've yet to find the first VNC interaction where my password can be longer than 8 characters. Everyone seems to just tunnel VNC over SSH instead of implementing modern authentication protocols which adds another layer of complexity and limitations to a latency sensitive, high-bandwidth protocol.
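The tunneling being complained about is usually just this (hostnames and display numbers are placeholders): forward the VNC port over SSH and point the viewer at localhost.

```
ssh -L 5901:localhost:5901 user@remote-host   # display :1 listens on TCP 5901
vncviewer localhost:5901                      # all VNC traffic now rides the SSH tunnel
```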
VNC is definitely easier to understand but the common servers usually leave a bad taste in my mouth after using them. RDP clients are often higher quality but there are many issues with (configuring) RDP servers right for Linux; however, GNOME's direct RDP integration has solved a lot of problems I was having before.
Protocol-wise I'm aware, but it seems a better fit technically for Windows than X11. Making a performant client for Linux should be doable, but creating a server that takes advantage of all the protocol features sounds like a much greater effort.
And yes, it seems in general something was wrong in your VNC setup. I was doing full HD back then, and typing or moving smaller windows was near instant, maybe 100 ms delay if the window was larger or the screen busy. Even video playback on YouTube was smooth (with occasionally dropped frames) as long as I kept it at the default size, which was surprising.
On another note, a few days ago for reasons I had to access a machine at work running xfce. Used VNC via SSH tunnel and didn't expect too much, since last I checked the protocol was very sensitive to latency, but it was a pretty smooth experience still. Maybe those tigervnc folks are still busy tweaking it.
FreeRDP 3 supports H.264 video encoding. FreeRDP 2 has RemoteFX (non-progressive). Microsoft uses a customized libfreerdp, Weston, and PulseAudio stack in WSL to allow Linux apps to run on Windows. xrdp supports H.264 and audio. Performant Linux RDP server and client components are available now, but you have to compile from source. Nothing wrong with VNC for remote maintenance; I still use it for that too.
Er, no. You just need to install xorgxrdp on modern Fedora and Ubuntu. On Fedora 37, you can also install xorgxrdp-glamor for GPU acceleration. Only thing I had to add was the pulse audio plugin, but I think that’s being packaged now.
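Concretely, on Fedora that's roughly the following (package names as described above; I'm assuming the stock xrdp systemd unit, and Ubuntu would use apt instead):

```
sudo dnf install xrdp xorgxrdp xorgxrdp-glamor   # the glamor package adds GPU acceleration
sudo systemctl enable --now xrdp                 # start the RDP listener on port 3389
```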
As a modern alternative, KasmVNC uses WebP for encoding. The issue with H.264 is that it needs to send the whole image every so often, which is useful when seeking in a video feed but pretty much useless for a live-only desktop feed.
Edit: just seen that adding H264 is on the roadmap for KasmVNC
Not at all. I use VNC as well. It's great for remote administration. In my testing VNC was literally unusable with this virtual monitor setup, some kind of bug reading the headless display tanks the performance beyond practicality.
It was already pretty fast with plain xorgxrdp, but using xorgxrdp-glamor made the server side _way_ faster. RDP as a protocol blows VNC out of the water, and can use x264 as well.
Yes, it is my go-to choice if I want near-real-time display support for VMs and Docker. It even has GPU driver support enabling 2D/3D acceleration. VNC is SO much slower.
Funny enough I thought the same thing. RDP has to be slower on Linux compared to VNC. It was just by accident that I discovered how freakishly fast it is.
You also need to make sure you're running xorgxrdp and not just the xrdp listener, which by default proxies to an Xvnc instance in many distros, thereby wasting CPU.
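A quick way to check, assuming the stock config paths, is to look at which session backends xrdp is configured with and confirm the xorgxrdp package is actually installed:

```
grep -A3 '^\[Xorg\]' /etc/xrdp/xrdp.ini   # the [Xorg] session type is what xorgxrdp provides
rpm -q xorgxrdp                           # Fedora; use `dpkg -l xorgxrdp` on Debian/Ubuntu
```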
As USB-C becomes more and more ubiquitous, I'd love to see every device with a USB-C port and a display support incoming DisplayPort alternate mode. That would be a worthy successor to Apple's Target Display Mode and would offer a great way to use devices like tablets as secondary displays.
That would be super expensive from an electrical BOM point of view. Laptops and phones are simply not designed to input video. The closest you might get is USB gadget mode (also pretty tricky) and doing display over that (à la DisplayLink).
I wonder if reconfigurable FPGAs could be the answer? They could be reprogrammed at runtime to hardware-accelerate whatever the host system wants without having to waste space on discrete hardware around for use-cases that may never be required by the host system.
For many displays you can buy relatively simple controller boards from AliExpress that take HDMI or similar. They can be great for reusing panels from old laptop hardware that you no longer use.
I don't think manufacturers will bother implementing extra hardware just in case their devices die on you. It'd be great, but it's very much a niche use case.
Nice, he's using PipeWire and GStreamer to send a video feed. Come to think of it, ffmpeg is capable of recording the X session and streaming the output via HTTP. It should be possible to connect to that with VLC or maybe even a web browser. Will try that.
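Untested sketch of that idea (the display, resolution, and port are placeholders; ffmpeg's HTTP output can act as a simple server with `-listen 1`):

```
ffmpeg -f x11grab -framerate 30 -video_size 1920x1080 -i :0.0 \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -f mpegts -listen 1 http://0.0.0.0:8090
# Then open http://<host>:8090 in VLC (or anything that plays MPEG-TS over HTTP).
```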
This should be done via Miracast.
And then the same system would be compatible with smart tvs, smartphones, Samsung Dex and Windows.
For Linux, the project implementing the protocol is called MiracleCast. There is a GTK GUI to send your screen, but there is no GUI to receive a screen.
Unfortunately this gets little love.
Also, to my understanding, Wayland is still not ready for such a use case.
Miracast is, if I'm not mistaken, wireless only, which in my mind makes it pretty darn useless.
At least, all the wireless implementations I've tried have been very flaky: after the initial "this is pretty cool" comes "ouch, the input lag".
Which is then followed by flaky connections and, if on a phone, ridiculous battery drain. It is nothing but a source of frustration and a waste of time, so anytime I hear the word I just tune out.
It is important to note that Miracast over Infrastructure is not a replacement for standard Miracast. Instead, the functionality is complementary, and provides an advantage to users who are part of the enterprise network.
Discovery requests to identify Miracast receivers can only occur through the Wi-Fi adapter. Once the receivers have been identified, Windows 10 can then attempt the connection to the network.
I couldn't get MiracleCast to work for this use case. Through pain and effort I managed to share a single screen, but that's about it. Sound didn't work, the latency was terrible (literal seconds of input latency) and input had issues.
I'd love to see the project mature but right now I don't think it's usable.
What use case? Miracast, or the TFA use case? Under sway, it's very easy to do the following (a consolidated sketch follows the list):
* add a virtual screen of a specified resolution (the command is `swaymsg create_output`; then adjust its position and resolution if the default 1080p on the right is not to your liking)
* start a vnc server for that screen (`wayvnc -o HEADLESS-1`)
* start a vnc client on the laptop
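Put together, the server side looks roughly like this (the output name, geometry, and port are defaults/assumptions, not gospel):

```
swaymsg create_output                                        # adds a headless output, typically HEADLESS-1
swaymsg output HEADLESS-1 resolution 1920x1080 pos 1920 0    # optional: tweak size/position
wayvnc -o HEADLESS-1 0.0.0.0 5900                            # serve only that output over VNC

# On the laptop:
vncviewer <server-ip>:5900
```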
Regarding screencast, xdg portals and pipewire streams are a thing (including zero-copy DMA-BUF access, and hardware encoding).
On wayvnc git master and sway 1.8 (or git master), you can script things so that a "virtual" display gets created automatically when someone connects to VNC, and removed when they disconnect.
Well, this is also a fallacy of yours (a strawman?). I was countering the argument:
> Also, to my understanding, Wayland is still not ready for such a use case.
With an example showing how to do it.
To answer your point, this wouldn't faze any sway user, and it is perfectly scriptable; you can add a button or script to perform the three server-side steps. Of course, GNOME includes a button to cast to a nearby Miracast-enabled screen.
I'm not answering my mother here; I'd set it up for her with a nice shortcut. And she probably wouldn't be using sway.
You also need a complex list of tasks for Miracast:
* Turn on the TV (simple enough, but since we're counting steps...)
* Make sure the Miracast option is enabled and the TV is discoverable. That one is impossible to explain over the phone -- unlike a command line -- as every TV buries this under a different menu.
* Find the right button on the "client" PC (might require enabling an option before?) and click it.
I've been supporting my father's computer for about 15 years now. It's the only machine he uses. He doesn't have root. He doesn't use a command line. I used to have an autossh connection for his machine to talk to one of my servers, and now we use Wireguard instead, for the rare occasions (about once a year) that he needs something looked at.
Support is basically zero these days. He uses Chrome and Firefox and LibreOffice. He occasionally fires up the Simon Tatham Portable Puzzles collection. I have uBlock Origin installed on both browsers. I installed XFCE on day one, spent a couple of hours demonstrating things, and then left.
Every five or six years I buy a new tiny, NUC-like machine, configure it up, pull his home directory across, and then send him the replacement. There's more work on the phone getting him to plug the cables in properly than anything else.
Year of the Linux desktop? It's been more than a decade.
Actually, I have zero issues across Jump Desktop, MS RDP (iOS or Android), mstsc.exe, the new Modern RDP client, and, of course, Remmina (which is actually how I'm typing this on a Raspberry Pi connected to a Fedora GPU-accelerated server...)
The top image is an excellent choice. Perhaps I've become too used to seeing a random visually appealing SVG at the top of articles --- but the top image here shows two laptops forming an obviously shared desktop.
Curious; this seems like a good "Raspberry Pi + big TV" use case, and, except possibly for Miracast (wireless only? weird), is this one of the few projects to pull this off? Anyone know of other solutions here?
So far, FreeRDP has always required me to log out of the local X session of whatever server I'm connecting to. For this to work, it seems that logout requirement can be avoided. I'm intrigued.
Imagine a Beowulf cluster of all your devices; then you could get lots of processing power easily! Running a process on multiple computers is something that has almost become extinct; we did a lot of optimizing to get that to work in the early 2000s. Lone computers just got so fast, and interconnect speed is too slow.
RDP/VNC/HTML on a second screen it is; I use my server with a connected screen.
can... this "dummy monitor" be used in windows by some other method? not exactly the same command obviously but a way to expose display on a headless system without using those dummy hdmi plugs?
I haven't yet found a production-ready open source virtual display driver for Windows, but there are a lot of "code samples" out there that work; you just need to go through the painful process of compiling and installing them.
There's a big thread about this on the deskreen repo [0], maybe they've found something new since I last checked it.
There is, however, a very good remote virtual monitor solution called Spacedesk [1], which has a Windows server (with a virtual display driver included) and a JavaScript client that runs in any browser (it refuses to run in Firefox, but if you just comment out that line of code it works perfectly fine).
How about this: I am using RDP to connect to a Windows machine. I have only one local monitor, so technically I should only see one remote display, but if I had a way to make dummy local monitors, that would signal RDP to "use all local monitors" and give me multiple remote displays.
I don’t know anything about all this. Is there a way to do something similar on a Raspberry Pi hooked to a monitor as a second screen for a Windows laptop?
This, like many other projects trying to emulate AirDrop or AirPlay or whatever, will unfortunately never reach Apple's levels of polish because Apple actually uses the WiFi chip to seamlessly create a direct connection between the devices, whereas all these solutions need the local network to work
And being wireless it is of course a nightmare in congested areas like apartment complexes and offices. A wireless access network over which the user has no administrative control -- what were they thinking?
Not the person you asked, but: I sort of feel like straight up using the second laptop is fine with Synergy. Mouse and clipboard transition seamlessly. It's of course something different, and has its pros and cons.
I have geographically separated machines in my office that use ZeroTier, and it has worked flawlessly for the last 2 years, to the point that I stopped using a physical LAN because the upkeep is too much.
I have high-speed internet, so the performance is on par, and I personally spend my work hours on RustDesk (almost 75-80 hours a week).
My only problem with Tailscale has been the OAuth login, which is significantly different from what ZeroTier does (I just create the account, use the key on any random computer, and authorize it from my login, meaning no auth from the user side). I know about Headscale, but that has to be self-hosted... eh.
The software used to be called Synergy, but that went commercial, so there's a fork called Barrier. https://github.com/debauchee/barrier