Use a laptop as a 2nd display on Linux using FreeRDP (jacobstoner.com)
277 points by jacob019 on Dec 16, 2022 | 137 comments



There's a slightly different approach with a similar outcome: sharing the keyboard and mouse between two machines.

The software used to be called Synergy, but that went commercial, so there's a fork called Barrier. https://github.com/debauchee/barrier


Barrier has been abandoned, the currently supported fork is now called input-leap: https://github.com/input-leap/input-leap


Wow, they have 664 open issues and 503 closed, for software that specifically wants to keep things simple. I don't envy them.


https://symless.com/synergy/purchase

Pay $29 once or use open source software with 600 open GitHub issues... :D


For all I know, their jira backlog has 6,000 bugs in it. Just because I can't see them doesn't mean they don't exist.


Been using Synergy and recommending it for many years, paid for it a long time ago as I thought it was worth supporting given how much use I've gotten from it.

Besides being open source, what are the advantages of Barrier? Worth the (probably small amount of) time to switch?


See my comment above. I couldn't get Barrier to work; Synergy worked perfectly on the first try.


I paid for Synergy, but couldn't get it to work across all of the OSs I use. Synergy provided a refund, which was nice.

Barrier worked fine for my multi-OS use case (Windows, MacOS, various Linux distros).

For some reason, I had to restart Barrier periodically when I was on mesh wifi (by Vilo), but when I switched to a different wifi AP (Starlink's actually) Barrier worked consistently.


Barrier works fine. Using it now. The interface is a bit clunky, but so was Synergy's.


> Compatibility. We use more than one operating system and you probably do, too. Windows, OSX, Linux, FreeBSD... Barrier should "just work". We will also have our eye on Wayland when the time comes.

Wow that sounds handy indeed.


Note: Wayland already works for the "server" (the machine sharing the keyboard and mouse).


It's great. I run a desktop running Windows with WSL2 for Linux and a MacBook Pro beside it. With some combination of a shared mouse, keyboard, clipboard, SSH, and SSHFS / NFS the experience is pretty seamless.


Barrier user for over 7 years across OSX, Windows, and Linux. It beats all other solutions: light, open source, and a reliable workhorse. Just get it.


Same here. I'm using it now to type this. Comes with pretty much every Linux distro, just get the matching version for Windows or Mac and you're done.
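
For example, on a typical distro the whole setup is roughly this (a sketch; exact package names vary, and `barrier` here is the GUI binary that ships with the package):

    # Debian/Ubuntu
    sudo apt install barrier
    # Fedora
    sudo dnf install barrier

    # then launch the GUI on both machines, mark one as the server
    # (the one with the physical keyboard/mouse) and point the other
    # at the server's IP as a client
    barrier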


I'm on Windows 10/11 here, and I tried Barrier, having last used Synergy so long ago that I was still on CRTs. I could not get Barrier configured; it just didn't work. I ended up purchasing Synergy for $30, and it worked immediately. Now I have my 15.6" laptop with an 11" Surface Go sitting next to it, and I can copy/paste between them and use the Surface like a second monitor, except it's actually a second computer with its own processing power, which is pretty nice.


If you are on Windows, Mouse Without Borders is free and works great over a wired connection.

https://www.microsoft.com/en-us/download/details.aspx?id=354...


Ah, so much software is available! But, I'm not on any wired connections. My primary use case for Synergy is mobile.


Definitely felt like a pretty low cost. I've worked with their support briefly also which helped me resolve an issue I was having (been so long ago now I honestly don't remember what it was). I'm literally typing on my Windows PC from my Mac laptop right now. I used to use it a lot for testing Windows browsers (less needed now that IE is dead and Edge is Chromium).

The only issue I really have now is that once in a while I can't use my keyboard on the client computer(s). That's because my server is currently my work Mac, and the terminal will occasionally request a secure input lock (probably when I use sudo); while that lock is held, the keyboard can't be accessed by programs like Synergy. But macOS gets kinda buggy once in a while and doesn't properly release the lock.

Most of the time it just works though, and I've gotten way more than my $30 worth at this point.


Can someone comment on Barrier vs. input-leap? (https://github.com/input-leap/input-leap)

It looks like input-leap has drag-drop in the works, but apart from that I have no clue which one is the blessed one right now.


Input Leap is an active fork of Barrier. It looks like Barrier is unmaintained.

https://github.com/input-leap/input-leap/issues/1414


I've given a squint-eyed sideways glance at the forks ever since part of the forking contingent cited the author's username "debauchee" as a reason the project should be forked, because the name could be seen as offensive.

"The name debauchee is not professional and should not be associated to a popular project."

From the github thread linked above. How disgraceful.


I helped my friend set this up back in the day so he could have WoW running on one computer and the wiki open on the other. IIRC he didn't have enough RAM on his main computer, so running Firefox in the background tanked his FPS.


What about x2x?

It works for me across a laptop and an RPI.

Or if you're using Wayland, I'm sure they have an even better solution that has been around for negative three months already. :)


Has anyone used Microsoft Garage's Mouse Without Borders and can provide some comparison based on experience, or tips?

I have used Mouse Without Borders across work and personal Windows machines -- it mostly works, with occasional quirks. I think there is some latency, though, when mouse input is routed through another PC.


Logitech Flow (free) has that feature, but I don't know how well it works. I only use it for customizing my MX Vertical.

https://www.logitech.com/en-us/software/options.html


I use Barrier all the time, but it has a tendency to lock up on OSX and I have to use a physical kb/mouse switch to go fix it occasionally. (Or I could remote in with a program, I guess.)


I have been using Barrier or its predecessor Synergy for 20 years. Fantastic stuff.


I am surprised no one has mentioned the new Gnome sharing Extend feature[1]. It also uses RDP, but you enable it and it "just works," including input events from the remote device onto the virtual screen. I've used Synergy/Barrier/Waynergy for years, but this is a good way to add a screen anywhere that supports RDP (Android, ChromeOS, etc). My main reason to explore it is to extend to a nice high resolution OLED tablet. Latency is quite low, as long as you don't try to watch a high resolution video.

1. https://www.reddit.com/r/gnome/comments/uz5as7/gnome_has_mad... (discussion)


I was very excited to read this but was it removed in Gnome 43.x?

> gsettings set org.gnome.desktop.remote-desktop.rdp screen-share-mode extend

> No such schema “org.gnome.desktop.remote-desktop.rdp”
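
A quick way to check whether the schema exists on your system at all (a minimal sketch; the schema is shipped by the gnome-remote-desktop package, so that's the first thing to verify):

    # list any remote-desktop schemas gsettings knows about
    gsettings list-schemas | grep -i remote-desktop

    # if the schema is present, dump its keys and current values
    gsettings list-recursively org.gnome.desktop.remote-desktop.rdp

If the grep comes back empty, installing (or reinstalling) gnome-remote-desktop is probably the first thing to try, rather than hunting for a renamed key.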


I think it might have been renamed; that looks like a feature flag.


This sounds great! Will give this a try.

I feel like all the parts are there for Linux to be an amazing desktop (it's already great); it's just that the discoverability, documentation, and usability sometimes need a little push.


I've had trouble accessing Ubuntu over Gnome RDP sharing, except from a Windows machine. I can access Windows machines via RDP using e.g. Remmina, but accessing an Ubuntu host only seems to work for me from Windows regardless of the client I use.


Hmm, I've tried it from Ubuntu and Android (Samsung) clients; both seemed to work fine. Maybe a network thing?


What configuration did you use in the client?


Nothing special. I read the Microsoft RDP client is the best one for Android, so just used its defaults.


What about from Ubuntu?


Just the defaults on 22.04/22.10. Have to use Wayland, though, which is still a bit of a mixed experience. The linked Reddit thread has more details.


Like, what client running on Ubuntu though? Remmina? I don't mean the RDP server setup, which is pretty self explanatory in settings. I weirdly can't make Ubuntu --> Ubuntu work, but with either host or client as a Windows machine I have no problem.


Anybody have tests of the latency? I haven't found anything better than Parsec that is free and open source. Parsec uses its own UDP protocol and makes use of hardware encoders/decoders.


Moonlight as a protocol is quite cool; it was made for streaming games, in fact. Sunshine is an open source server implementation, and the client was open source to begin with, I think. Sunshine is multi-platform, and so is the Moonlight client. I used it to stream my Linux desktop (and games) to my Android phone.

I also have good experience with Spice. I haven't found a way to set it up as a server on its own, but QEMU uses it for all its VMs, and the performance is great; you can watch videos and everything, if you have the bandwidth.

For WiFi and slower networks, Moonlight was way more performant for me.


The problem with UDP for video is that, on a lossy network, you either end up sending a lot of key frames or reinventing TCP.

There is a trick for achieving low latency video with TCP: set SO_SNDBUF to the lowest possible value and do your own buffering. If your buffer grows too large, lower the bitrate and/or drop frames.


I'd argue that on a lossy network you will never have decent video (with decent latency), so for interactive sessions you don't really have to care about lossy networks; it isn't worth the hassle anyway.


That is true; the network needs to be stable in any case. But keyframes produce "bursty" traffic, which increases latency compared to B-frames (and P-frames).

Ever since H.264 there has been a feature called intra-refresh, which lets you spread the keyframe data over time, reducing the burstiness of the transmission (and therefore improving latency in most scenarios).
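
With x264 (e.g. via ffmpeg) that mode is a single encoder option; a hedged sketch of a low-latency encode using it (the input, output, and keyint value are just placeholders):

    # spread refresh data across frames instead of sending periodic full keyframes
    ffmpeg -i input.mkv -c:v libx264 -tune zerolatency \
        -x264-params "intra-refresh=1:keyint=120" \
        -f mpegts output.ts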


KasmVNC would be worth trying, as it's pretty fast and low latency: https://github.com/kasmtech/KasmVNC


Kasm is Linux-only. Parsec works on Linux, Android, Windows, etc.


> Parsec for linux, android, windows, etc.

Linux/Android client only, last time I checked. You can't host on those platforms.

> Linux does not support hosting at this time and any computers on this operating system will not be listed.

https://support.parsec.app/hc/en-us/articles/4422939258893-P...


TFA is specifically about using Linux, so I didn't see any non-Linux requirement mentioned (until your comment).

I had a quick look at Parsec and was surprised at how expensive it is; it starts at $8.33 a month.


Still, it's worth mentioning when comparing.

The frictionless remote desktop side of Parsec is completely free; you only pay if you require >60fps, extra monitors, team features, etc.


Oh yeah, I didn't scroll down far enough to see the free Personal Use option.


Is there any performance comparison with other VNC/RDP solutions?


I found this page about their internal benchmarking, but that's more a comparison between versions of KasmVNC: https://github.com/kasmtech/KasmVNC/wiki/Performance-Testing


This could be a problem if you need to encode other things, since hardware encoders are artificially limited on Nvidia and probably other vendors. Nvidia did increase the number of concurrent streams at the start of the pandemic, though, and the limit may also be partly imposed by patent licensing rather than just Nvidia's price discrimination.


Which Parsec? The parsec.app I know seems to be fully closed source...


I think the sentence was intended to be read as the OP being unable to find something that's both better than Parsec and FOSS.


If OBS and WebRTC can get to ~120ms, it seems like there is some low-hanging fruit here.


Parsec is <2ms


When I used Parsec to control a machine on the other side of my room, the monitor and my client were not even one frame out of sync. The only downside is the heavy compression used to achieve that; color accuracy also suffers.


Has FreeRDP gotten so much better? About 5 years ago, TigerVNC was much smoother than FreeRDP when I did the same thing. It almost felt like FreeRDP was just a translation layer on top of VNC.


As a protocol, RDP is way ahead of VNC. The latest versions support H.264 video encoding and AAC audio. With a beefy Windows server and FreeRDP as a client, I can stream 1080p YouTube from a remote desktop over the Internet at 60 FPS. RFX progressive encoding isn't bad either. The trouble with FreeRDP, at least the older versions, is that you have to explicitly enable these features. I tested VNC in this setup, and it was unusable, roughly one frame per second. Something about this multi-monitor setup just doesn't play well with the VNC software I tested (x11vnc server and TigerVNC client), and I have used those often in other scenarios.
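
For reference, enabling those codecs on the client is a matter of command-line switches; a rough sketch with xfreerdp (flag spellings differ a bit between FreeRDP 2.x and 3.x, and the host/user values are placeholders):

    # request the graphics pipeline with H.264 (AVC444), with RemoteFX as a fallback
    xfreerdp /v:host.example.com /u:user \
        /gfx:AVC444 /rfx /network:auto +clipboard /sound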


https://github.com/rfbproto/rfbproto/blob/master/rfbproto.rs...

I feel that the VNC protocol gets a bad reputation because of bad implementations. There's really nothing in the protocol itself that holds it back.

What I like about VNC is that you can actually read the RFC and get a fairly good understanding of it in less than a day. I'm not sure the same can be said for RDP.

Microsoft has a page where you can access a bunch of PDF documents that describe RDP in some sense, but I've no idea where I would even start or what's even relevant for any given problem you might want to solve.


When I had to choose my remote desktop solution, I tried all the FOSS ones, and they've been a disaster, without exception.

I tried, I think, a couple of VNC-based solutions (since I tried the most common ones, TigerVNC was very likely one of them), and they were extremely slow; I can't remember how slow exactly, but something like 1/2 FPS.

This left me very perplexed, because on a 10 Mbit network with 25 ms ping, a protocol must be very badly designed (or, to be more precise, very antiquated) to have such bad performance. So IMO, the bad reputation is justified.

Other FOSS solutions weren't better, in one way or another. The other one that worked out of the box was X2Go (based on NX 3), which was still very slow (and it seems to be abandoned nowadays).

Sadly, I had to move to a closed-source solution.

Closed-source solutions are so fast that sometimes I forget I'm on a remote machine. I'm very puzzled why there is such a large gap between open and closed source solutions.


10 Mb/s network? Was this 20 years ago?

Admittedly, because of VNC's age, there are many sub-optimal encoding methods available and many different implementations which might not be fully compatible with each other. So, maybe your chosen server has some good encoding methods available and maybe the client can support some good encoding methods, but the intersection of supported encodings might be bad.

Users of VNC often need to understand how to tweak their settings for the best result. The client should choose the "best" encoding automatically, but this cannot be fully relied upon. On a slow network (like yours), it's probably best to select "tight" encoding, allow lossy compression (JPEG), and turn down the quality a bit.

That is, assuming the implementation you're using doesn't support H.264. Currently, there are not many that do.
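
With the TigerVNC viewer, for instance, that tuning looks roughly like this (a sketch; the parameter names are TigerVNC's, and other viewers spell them differently):

    # force Tight encoding with lossy JPEG and moderate quality on a slow link
    vncviewer -AutoSelect=0 -PreferredEncoding=Tight \
        -QualityLevel=4 -CompressLevel=6 host.example.com:1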


> 10 Mb/s network? Was this 20 years ago?

Typo :) It's 100 Mbit.


Who is this closed-source solution?


NoMachine.

At least some commercial remote desktop solutions are very strict about business usage -- they don't allow it at all (last time I checked, TeamViewer was like that).

NoMachine is a bit looser: they allow occasional business usage, as long as it's not the core of the remote workflow.


NoMachine would be perfect if it streamed macOS servers at full resolution.

No, I do not want to view my beautiful Retina MBP's screen on my 4K laptop at... 1440x900.


EDIT: 100 mbit, not 10 mbit!


VNC gets a bad rap because it is not very good.

It certainly has its place, but as a remote desktop solution it is the simplest (and stupidest) way that functionality could possibly be implemented. VNC is what you use if you want to copy whatever is on the screen of the remote system, RDP is what you use if you want your displays connected to a remote system.

RDP lets you use all of the client's monitors, printers, clipboard, audio, USB devices, etc. as though they were plugged in to the host. The clipboard sharing extends to files, so you can drag and drop a file from the client to the host via the RDP window. If you lock your workstation at work with a bunch of running programs, drive home, and RDP into it, all of the applications will be moved to your RDP session and rearranged to fit your display(s). The next day you log back in at work and they get moved back to the local displays.

Oh, and did I mention all of this works well even on crap connections? I have been at jobs where a remote branch has such shit internet that they RDP into a VM at the head office to browse the internet because RDP uses less bandwidth than the sites they are browsing.

RDP is one of the things I miss using Linux.
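
On the Linux client side, most of those redirections are exposed as FreeRDP switches; a hedged sketch (FreeRDP 2.x spellings, and the hostname and share name are placeholders):

    # span all local monitors and share clipboard, home directory, printers and audio
    xfreerdp /v:work-pc.example.com /u:me /multimon \
        +clipboard /drive:home,$HOME /printer /sound /microphone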


Free VNC implementations often lack modern features that make the protocol usable. I've yet to find a VNC implementation where my password can be longer than 8 characters. Everyone seems to just tunnel VNC over SSH instead of implementing modern authentication protocols, which adds another layer of complexity and limitations to a latency-sensitive, high-bandwidth protocol.

VNC is definitely easier to understand, but the common servers usually leave a bad taste in my mouth after using them. RDP clients are often higher quality, but there are many issues with configuring RDP servers correctly on Linux; however, GNOME's direct RDP integration has solved a lot of problems I was having before.


Protocol-wise I'm aware, but it seems a better fit technically for Windows than X11. Making a performant client for Linux should be doable, but creating a server that takes advantage of all the protocol features sounds like a much greater effort.

And yes, it seems something was wrong with your VNC setup in general. I was doing full HD back then, and typing or moving smaller windows was near instant, maybe a 100 ms delay if the window was larger or the screen busy. Even video playback on YouTube was smooth (with occasionally dropped frames) as long as I kept it at the default size, which was surprising.

On another note, a few days ago I had, for reasons, to access a machine at work running Xfce. I used VNC via an SSH tunnel and didn't expect too much, since last I checked the protocol was very sensitive to latency, but it was still a pretty smooth experience. Maybe those TigerVNC folks are still busy tweaking it.

Will try FreeRDP next time :-)


FreeRDP 3 supports H.264 video encoding. FreeRDP 2 has RemoteFX (non-progressive). Microsoft is using a customized libfreerdp, Weston, and PulseAudio stack in WSL to allow Linux apps to run on Windows. Xrdp supports H.264 and audio. Performant Linux RDP server and client components are available now, but you have to compile from source. Nothing wrong with VNC for remote maintenance; I still use it for that too.


Er, no. You just need to install xorgxrdp on modern Fedora and Ubuntu. On Fedora 37, you can also install xorgxrdp-glamor for GPU acceleration. The only thing I had to add was the PulseAudio plugin, but I think that's being packaged now.
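
Concretely, on Fedora that comes down to roughly this (a sketch; package names as above, and the stock xrdp systemd unit is assumed):

    sudo dnf install xrdp xorgxrdp xorgxrdp-glamor
    sudo systemctl enable --now xrdp
    # then connect with any RDP client on port 3389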


As a modern alternative, KasmVNC uses WebP for encoding. The issue with H.264 is that it needs to send the whole image every so often, which is useful when seeking in a video feed but pretty much useless for a live-only desktop feed.

Edit: just saw that adding H.264 is on the roadmap for KasmVNC.


Am I understanding this right? You say some stable tech is "unusable" because it does not do local video streaming at 60fps?

Secondly, none of the computers involved for me have two monitors attached.


Not at all. I use VNC as well; it's great for remote administration. In my testing, VNC was literally unusable with this virtual monitor setup; some kind of bug in reading the headless display tanks the performance beyond practicality.


Here’s a video of GPU accelerated Xrdp on a Fedora machine being used with Remmina on a Raspberry Pi:

https://twitter.com/rcarmo/status/1584627462939492354?s=61&t...

It was already pretty fast with plain xorgxrdp, but using xorgxrdp-glamor made the server side _way_ faster. RDP as a protocol blows VNC out of the water, and can use x264 as well.


Yes, it is my go-to choice if I want near-real-time display support for VMs and Docker. It even has GPU driver support enabling 2D/3D acceleration. VNC is SO much slower.

Funny enough, I thought the same thing: RDP had to be slower on Linux compared to VNC. It was only by accident that I discovered how freakishly fast it is.


OP mentions that at the end of the article:

In my testing, VNC is not a suitable replacement for FreeRDP in this setup--it's too slow.


I tried it recently; it still felt like x11vnc wrapped in RDP.


You need to enable codecs, at least RemoteFX.


You also need to make sure you're running xorgxrdp and not just the xrdp listener, which by default proxies to an Xvnc backend in many distros, thereby wasting CPU.
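
The backend is chosen in xrdp's session config; a quick way to see which ones your distro wired up (a sketch, assuming the stock config path):

    # look for the session sections; you want [Xorg] (xorgxrdp) to be present
    # and selected by default, not just [Xvnc]
    grep -n '^\[X' /etc/xrdp/xrdp.ini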


As USB-C becomes more and more ubiquitous, I'd love to see every device with a USB-C port and a display support incoming DisplayPort alternate mode. That would be a worthy successor to Apple's Target Display Mode and would offer a great way to use devices like tablets as secondary displays.


That would be super expensive from an electrical BOM point of view. Laptops and phones are simply not designed to take video input. The closest you might get is USB gadget mode (also pretty tricky) and doing display over that (à la DisplayLink).


I wonder if reconfigurable FPGAs could be the answer? They could be reprogrammed at runtime to hardware-accelerate whatever the host system wants, without having to waste space on discrete hardware for use cases that may never be needed by the host system.


> Laptops and phones are simply not designed to input video.

OK, but you get PCIe passthrough with Thunderbolt, so you can put a capture card in an eGPU enclosure, assuming you have TB.


For many displays you can buy relatively simple controller boards from AliExpress that take HDMI or similar. They can be great for reusing panels from old laptops you no longer use.

I don't think manufacturers will bother implementing extra hardware just in case their devices die on you. It'd be great, but it's very much a niche use case.


The author of TurboVNC conducted a study (2014) on using H.264: https://turbovnc.org/About/H264


See also https://news.ycombinator.com/item?id=31409010 -> https://tuxphones.com/howto-linux-as-second-wireless-display... which puts a decent amount of thought into optimizing latency.

Worth noting that GNOME 42+ also has an extend-display-over-VNC setting; it may be hidden by default.


Nice, he's using PipeWire and GStreamer to send a video feed. Come to think of it, ffmpeg is capable of recording the X session and streaming the output via HTTP. It should be possible to connect to that with VLC or maybe even a web browser. Will try that.
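
Something along these lines should do it (a rough sketch; the display offset, resolution, and port are placeholders, and latency will be noticeably worse than RDP):

    # grab a 1920x1080 region of display :0 and serve it as MPEG-TS over HTTP
    ffmpeg -f x11grab -framerate 30 -video_size 1920x1080 -i :0.0+1920,0 \
        -c:v libx264 -preset ultrafast -tune zerolatency \
        -f mpegts -listen 1 http://0.0.0.0:8090

VLC (or anything that speaks MPEG-TS over HTTP) can then open http://host:8090 as a network stream.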


I get that you can do it using FreeRDP, but why??

This should be done via Miracast. Then the same system would be compatible with smart TVs, smartphones, Samsung DeX, and Windows.

For Linux, the project implementing the protocol is called MiracleCast. There is a GTK GUI to send your screen, but there is no GUI to receive one.

Unfortunately this gets little love.

Also, to my understanding, Wayland is still not ready for such a use case.


Miracast is, if I'm not mistaken, wireless only, which in my mind makes it pretty darn useless.

At least, all the wireless implementations I've tried have been very flaky: after the initial "this is pretty cool" comes an immediate "ouch, the input lag".

That is then followed by flaky connections and, if on a phone, ridiculous battery drain. It is nothing but a source of frustration and a waste of time, so any time I hear the word I just tune out.


> Miracast is, if I'm not mistaken, wireless only

Wow, that's an insane software design choice.

edit: looks like there's support for Miracast over Ethernet at least in Windows: https://learn.microsoft.com/en-us/surface-hub/miracast-over-...


Unfortunately it still requires a Wi-Fi adapter:

    It is important to note that Miracast over Infrastructure is not a replacement for standard Miracast. Instead, the functionality is complementary, and provides an advantage to users who are part of the enterprise network.

    Discovery requests to identify Miracast receivers can only occur through the Wi-Fi adapter. Once the receivers have been identified, Windows 10 can then attempt the connection to the network.


I couldn't get MiracleCast to work for this use case. Through pain and effort I managed to share a single screen, but that's about it. Sound didn't work, the latency was terrible (literal seconds of input latency) and input had issues.

I'd love to see the project mature but right now I don't think it's usable.


What use case? Miracast, or TFA's use case? Under sway, it's very easy to:

* add a virtual screen of specified resolution (The command is `swaymsg create_output`, then adjust its position and resolution if the default 1080p on the right are not to your liking)

* start a vnc server for that screen (`wayvnc -o HEADLESS-1`)

* start a vnc client on the laptop

Regarding screencasting, xdg desktop portals and PipeWire streams are a thing (including zero-copy DMA-BUF access and hardware encoding).
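
Put together, the server side is only a few lines (a sketch assuming sway >= 1.8 and wayvnc are installed; HEADLESS-1 is the name sway gives the first virtual output, and the geometry is an example):

    #!/bin/sh
    # create a virtual output and place it to the right of a 1920-wide panel
    swaymsg create_output
    swaymsg output HEADLESS-1 resolution 1920x1080 position 1920 0
    # export only that output over VNC
    wayvnc -o HEADLESS-1 0.0.0.0 5900

The laptop then just runs any VNC client against port 5900.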


On wayvnc git master and sway 1.8 (or git master), you can script things so that a "virtual" display gets created automatically when someone connects to VNC, and removed when they disconnect.

See https://github.com/any1/wayvnc/pull/200/files

The script in the PR does something a bit different, but it's only an example and can be modified to do what I described in the first paragraph.


    it's very easy to
    [complex list of tasks]
This is the typical HN fallacy.

Go try to explain to your mother over the phone how to edit a sway configuration to 'add a virtual screen of specified resolution'.

Compare that to clicking on the cast icon and selecting a TV.


Well, this is also a fallacy of yours (a strawman?). I was countering the argument:

> Also, to my understanding, Wayland is still not ready for such a use case.

With an example showing how to do it.

To answer your point, this wouldn't faze any sway user, and it is perfectly scriptable; you can add a button or script to perform the three server-side steps. Of course, GNOME includes a button to cast to a nearby Miracast-enabled screen.

I'm not answering to my mother, I'd set it up for her with a nice shortcut. And she probably wouldn't be using sway.

You also need a complex list of tasks for miracast.

* Turn on the TV (simple enough, but since we're counting steps...)

* Make sure the Miracast option is enabled and the TV is discoverable. That one is impossible to explain over the phone -- unlike a command line -- as every TV buries it under a different menu.

* Find the right button on the "client" PC (might require enabling an option before?) and click it.


It's doable. Not over the phone, but you can write a shell script and put an icon on the desktop.

Source: I spent years supporting Linux for my tech-illiterate parents.


I've been supporting my father's computer for about 15 years now. It's the only machine he uses. He doesn't have root. He doesn't use a command line. I used to have an autossh connection for his machine to talk to one of my servers, and now we use Wireguard instead, for the rare occasions (about once a year) that he needs something looked at.

Support is basically zero these days. He uses Chrome and Firefox and LibreOffice. He occasionally fires up the Simon Tatham Portable Puzzles collection. I have uBlock Origin installed on both browsers. I installed XFCE on day one, spent a couple of hours demonstrating things, and then left.

Every five or six years I buy a new tiny, NUC-like machine, configure it up, pull his home directory across, and then send him the replacement. There's more work on the phone getting him to plug the cables in properly than anything else.

Year of the Linux desktop? It's been more than a decade.


There's an experimental TurboVNC server fork with h264 encoding support:

https://github.com/faust93/turbovnc

But it has to be paired with the TigerVNC viewer, because AFAIK there are no other viewer implementations with H.264 decoding support.


FreeRDP and Microsoft Remote Desktop both support H264, but the RDP protocol is complex and it's easy to break compatibility between clients.


Actually, I have zero issues across Jump Desktop, MS RDP (iOS or Android), mstsc.exe, the new Modern RDP client, and, of course, Remmina (which is actually how I'm typing this on a Raspberry Pi connected to a Fedora GPU-accelerated server...)


I tried the same thing. The performance is a LOT better using Parsec, it's free for personal use


Aren't only clients supported for Parsec on Linux?


I'm not sure, sorry. I was using my MacBook from my Linux machine, so maybe.


The top image is an excellent choice. Perhaps I've become too used to seeing a random, visually appealing SVG at the top of articles, but the image here shows two laptops forming an obviously shared desktop.


Curious: this seems like a good "Raspberry Pi + big TV" use case, and apart from possibly Miracast (wireless only? weird), is this one of the few projects to pull it off? Anyone know of other solutions here?


So far, FreeRDP has always required me to log out of the local X session of whatever server I'm connecting to. For this to be possible, it seems that logout requirement can be avoided. I'm intrigued.


They set up additional virtual displays, and older xrdp servers also used to mirror the hardware framebuffer, so there's no need to log out.


People used to use x2x for this: https://github.com/dottedmag/x2x


I've been meaning to find a way to correct excessive clouding on my laptop screen, and RDPing into the same device sounded like a good idea.

Alternatively there's ShaderGlass, but I would have to modify it to work with an image mask.

If anyone has a better idea for Windows that doesn't involve hardware changes I would be much obliged.


Why would you only use it as a second screen when instead you can double your processing power too?


Imagine a Beowulf cluster of all your devices; then you could get lots of processing power easily! Running a process on multiple computers is something that has almost become extinct; we did a lot of optimizing to get that to work in the early 2000s. Lone computers just got so fast, and interconnect speed is too slow.

RDP/VNC/HTML on a second screen it is; I use my server with a connected screen.


There are a lot of approaches to this across all platforms including Windows;

https://alternativeto.net/software/maxivista/


Can this "dummy monitor" be used on Windows by some other method? Not exactly the same command, obviously, but a way to expose a display on a headless system without using those dummy HDMI plugs?


I haven't yet found a production-ready open source virtual display driver for Windows, but there are a lot of "code samples" out there that work, you just need to go through the painful process of compiling and installing them. There's a big thread about this on the deskreen repo [0], maybe they've found something new since I last checked it.

There is, however, a very good remote virtual monitor solution called Spacedesk [1], which has a Windows server (with a virtual display driver included) and a JavaScript client that runs in any browser (it refuses to run in Firefox, but if you just comment out that line of code it works perfectly fine).

[0] https://github.com/pavlobu/deskreen/discussions/86 [1] https://www.spacedesk.net/


You can write an Indirect Display Driver (IDD) to expose virtual monitors.


Probably. That’s worth a shot. Any idea where the base64 stuff came from in the guide?


That is the EDID data for the dummy monitor; it's a small binary blob that contains resolutions and timings pulled from a real monitor.
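
If you'd rather capture your own blob instead of reusing the one from the guide, the kernel exposes the EDID of every connected display under sysfs (a sketch; the connector name is an example and varies per machine):

    # dump a connected monitor's EDID and base64-encode it
    base64 /sys/class/drm/card0-HDMI-A-1/edid

    # inspect it in human-readable form (edid-decode is a separate package)
    edid-decode /sys/class/drm/card0-HDMI-A-1/edid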


I mean, Windows is notorious for wanting a physical monitor attached. Those dummy plugs help, but not everyone has one...


> but a way to expose display on a headless system

... RDP?


How about this: I am using RDP to connect to a Windows machine. I have only one local monitor, so technically I should only see one remote display. But if I had a way to make dummy local monitors, that would signal RDP to "use all local monitors" and give me multiple remote displays....

Same for remote viewers.


Is there a solution to use an old Android 4.4 tablet as an external screen for a Linux machine?

The aFreeRDP mentioned seems to require at least Android 5.0.


I don’t know anything about all this. Is there a way to do something similar on a Raspberry Pi hooked to a monitor as a second screen for a Windows laptop?


Yes. You can follow the instructions and use Remmina as a client. Here’s my Pi 3 as a thin client: https://twitter.com/rcarmo/status/1584627462939492354?s=61&t...


This, like many other projects trying to emulate AirDrop or AirPlay or whatever, will unfortunately never reach Apple's level of polish, because Apple actually uses the Wi-Fi chip to seamlessly create a direct connection between the devices, whereas all these solutions need the local network to work.


And being wireless it is of course a nightmare in congested areas like apartment complexes and offices. A wireless access network over which the user has no administrative control -- what were they thinking?


This is an insanely janky, ugly mess of a solution.


It's why it's so fascinating.


Do you have a more elegant solution to share?


Not the person you asked, but I sort of feel like straight-up using the second laptop with Synergy is fine. The mouse and clipboard transition seamlessly. It's of course something different, and it has its pros and cons.



Nice hack, but 'single head is all you need'. It's more comfortable and efficient, and there's a nice suckless video on that on YouTube.


Is there a way to do this on Windows?


TailScale powered Sidecar anyone?


I have geographically separated machines in my office that use ZeroTier, and it has worked flawlessly for the last two years, to the point that I stopped using a physical LAN because the upkeep is too much....

I have high-speed internet, so the performance is on par, and I personally spend my work hours on RustDesk (almost 70-80 hours a week)....

My only problem with Tailscale has been the OAuth login, which is significantly different from what ZeroTier does (I just create the account, use the key on any random computer, and authorize from the login, meaning no auth from the user side).... I know about Headscale, but that has to be self-hosted... eh.

Tailscale looks like good tech without that, IMO.



