X11 is a bit problematic when it comes to compositing. This is nicely explained on the Wayland architecture page.
Wayland’s solution is to have every client render itself into a bitmap; the Wayland server then composites those bitmaps into the final image. So far so good, but I’m really unhappy that Wayland drops support for one of the best features of X11: remote display.
X11 tried really hard to separate rendering from the application. This allowed lean clients which sent rendering commands (draw a line here, a rectangle there) over an optimized network protocol to the server, which displayed them to the user. This was priceless in the early days of computing, when clients were thin and CPU power was expensive: you would run your application on a server and just get a bunch of rendering commands on your local display (which was basically a graphics card connected to an interpreter for X11 rendering commands received over the ’net).
Other strategies, like copying the part of the bitmap that has changed to the server, are much more expensive. A 32×32 pixel image in true color needs 32*32*3 = 3072 bytes. That is the same amount you need to render 384 lines or rectangles, since each needs only two pairs of 16-bit coordinates, i.e. 8 bytes (16-bit coordinates – how big is your screen?). On top of that, rendering commands have a much better compression rate than images, since coordinates on any realistic screen leave most of their high bits zero.
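A rough back-of-the-envelope sketch of this comparison (the coordinate packing and the random "photographic" pixel data are illustrative assumptions, not the actual X11 wire format):

```python
import random
import struct
import zlib

# Bitmap cost: a 32x32 region in 24-bit true color.
bitmap_bytes = 32 * 32 * 3  # 3072 bytes

# Command cost: one line needs two endpoints, each a pair of
# 16-bit coordinates -> 8 bytes of coordinate payload per line.
bytes_per_line = 2 * 2 * 2
lines_for_same_cost = bitmap_bytes // bytes_per_line  # 384 lines

# Compression: screen coordinates rarely exceed a few thousand, so
# the high byte of each 16-bit value is almost always zero-ish and
# the command stream compresses well; dense pixel data (simulated
# here with random bytes) barely compresses at all.
rng = random.Random(0)
commands = b"".join(
    struct.pack(">4H", *(rng.randrange(1024) for _ in range(4)))
    for _ in range(lines_for_same_cost)
)
pixels = bytes(rng.randrange(256) for _ in range(bitmap_bytes))

print(lines_for_same_cost, "lines fit in", bitmap_bytes, "bytes")
print("compressed commands:", len(zlib.compress(commands)))
print("compressed pixels:  ", len(zlib.compress(pixels)))
```

Both payloads start at the same 3072 bytes, but the command stream shrinks substantially under zlib while the pixel data does not.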
While I understand the motivation behind Wayland, I’m not happy with the implementation.
- Ubuntu Will Adopt Wayland Graphics System (pcworld.com)
- Linux beyond X: Shuttleworth contemplates Wayland (arstechnica.com)