In response to “Wayland Breaks Your Bad Software”

I say that the technical merits are irrelevant because I don’t believe they’re any longer a major factor in whether most people move to Wayland.

With only a slight amount of generalization, none of these people will be moved by Wayland’s technical merits. The energetic people who could be persuaded by technical merits to go through switching desktop environments, or in some cases replacing hardware (or accepting limited features), have mostly moved to Wayland already. The people who remain on X are there because they don’t want to rebuild their desktop environment, because they don’t want to do without features and performance they currently have, or because their Linux distribution doesn’t think their desktop should switch to Wayland yet.

  • Sh1nyM3t4l4ss@lemmy.world

    I switched to Wayland over two years ago and these days I don’t look back at all. I don’t care if Wayland has full feature parity with X11 as long as the features I actually use are supported, which they are.

    Clipboard sharing in VirtualBox doesn’t work right now (though I’m fairly sure VirtualBox could implement it with Wayland as it is), and neither does Auto-Type in KeePassXC (I’m not sure whether there’s a mechanism for that on Wayland). Autofill in the browser works, though, so it’s no big deal to me.

    In return I get 1:1 touch gestures, better multi-monitor support and an overall smoother desktop on Plasma Wayland, so I’ll take it.

    People often still make complaints about Wayland that were fixed months or years ago, and it’s a bit tiring.

      • Pasta Dental@sh.itjust.works

        Yeah. Mixing something like a 60 Hz monitor and a 120 Hz monitor basically doesn’t work on X11. Plus, Wayland has this “perfect frame every time” vsync philosophy, which means no tearing, and it feels much smoother to use than X11.

    • marty_relaxes@discuss.tchncs.de

      On the topic of auto-typing, mechanisms for variations of it do exist on Wayland; I use them in my password scripts to automatically fill login boxes, using tools like ydotool or wtype (see the sketch below).

      So I would guess that KeePassXC hasn’t integrated the necessary protocols/APIs for Wayland?
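
      A minimal sketch of what such a script can look like, assuming wtype is installed and the compositor supports the virtual-keyboard protocol it relies on (the password-store entry name is made up for illustration):

          # Fetch a secret and type it into the currently focused window.
          # `wtype -` reads the text to type from stdin.
          pass show websites/example.com | head -n 1 | tr -d '\n' | wtype -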

  • flashgnash@lemm.ee

    The ability to have multiple displays at different scales is a godsend when trying to use a laptop with a 4K display connected to 1080p monitors, or vice versa.

        • michaelrose@lemmy.ml

          I just passed --scale to xrandr after computing the proper factor, then used the nvidia-settings GUI to write the current configuration to xorg.conf. It’s not incredibly hard: basically all you are doing is scaling the lower-DPI displays up to the same effective resolution as your highest-DPI one and letting X scale them down to the correct physical size. For instance, if you have two 27" monitors, one 4K and one 1080p, you just scale the 1080p one by 2; with a 27" 4K and a 24" 1080p it’s closer to 1.75. The correct ratio can be found with your favorite calculator app.

          Come to think of it, you can set this scaling directly in nvidia-settings, where you set ViewPortIn and ViewPortOut.
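
          For illustration, here is roughly what that xrandr invocation looks like for the 4K-plus-1080p case (output names DP-1 and HDMI-1 are placeholders; check xrandr for yours):

              # Run the 1080p panel at 2x2 so its logical size matches the 4K panel,
              # then place it to the right of the 4K output.
              xrandr --output DP-1 --mode 3840x2160 --pos 0x0 \
                     --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 3840x0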

          • LaggyKar@programming.dev

            That’s not at all the same thing. That requires downscaling some screens, which makes everything blurry and breaks subpixel AA.

            • cobra89@beehaw.org

              Yeah, whenever someone says “X has/has had fractional scaling” I just ignore them, because it’s never actually true fractional scaling that doesn’t look and act like utter crap.

              • pwnna@lemmy.ca

                It also tears significantly in my experience, which makes it pretty unusable…

              • michaelrose@lemmy.ml

                I know you live in this weird universe where the screen that is 12 inches from my face actually looks like crap, but it just isn’t so; you are merely confused.

            • michaelrose@lemmy.ml

              That is literally how Wayland scales your shit; you just don’t know how anything works.

              • LaggyKar@programming.dev

                Without the recently added wp-fractional-scale-v1, yes, it will do that if you use fractional scales (albeit per window rather than per monitor). Not, however, if you stick to integer scales, as one might in the 1080p+4K use case.

      • flashgnash@lemm.ee

        I tried unsuccessfully to get this working for quite some time and broke my xrandr settings quite a few times.

        With Wayland/GNOME I just click a button in the settings GUI and it works flawlessly.

        • michaelrose@lemmy.ml

          With X/i3 I had to read documentation, and the result works well. With Sway I had to read documentation, and the result works poorly. So is Sway better for the illiterate?

          • flashgnash@lemm.ee

            Would you not say the best-case scenario is for it to just work great straight away, without requiring you to read a manual or do any debugging at all just to configure your display scale?

            Also, sway/i3 aren’t known as “it just works” kinds of window managers anyway; they’re definitely aimed at people who like to tinker.

      • Auli@lemmy.ca

        Sure, but the thing is: stuff that I have to edit files and install other programs for on X just works with Wayland out of the box.

  • 0x0@social.rocketsfall.net

    X11 is, to put it simply, not at all fit for any modern system. Full stop. Everything done to make it work on modern systems is just hacks. Don’t even try to get away with “well, it just works for me” or “but Wayland no worky”.

    I really don’t know if there could be a more obnoxious opening than this. I guess Wayland fanatics have taken a page from the Rust playbook of trying to shame people into using it when technical merits aren’t enough (“But your code is UNSAFE!”).

      • orangeboats@lemmy.world

        I feel that the biggest mistake of X11’s protocol design is the idea of a “root window” that is supposed to cover the whole screen.

        Perhaps that worked well in the 1990s, but it’s just completely incompatible with the multi-display setups we commonly see today. Hacks upon hacks were needed to make multiple displays possible on X11: the root window no longer corresponds to a single display, and in heterogeneous display setups part of the root window is actually invisible.

        Later on we decided to stack compositing on top of the already-hacky mess, and it was so bad that many opted to disable the compositor (no Martha, compositors are more than wobbly windows!).

        And then there’s the problem of sandboxing programs… which is completely unmappable to X11, even with hacks.

        • michaelrose@lemmy.ml

          Multiple displays work fine. The only thing that needs to be drawn in the root window is an attractive background sized to your displays; I’m not sure why you think that is hacky or complicated.

          • West Siberian Laika@lemm.ee

            Multiple displays only work as long as you have identical resolutions and refresh rates. Good luck mixing monitors with different scaling factors and refresh rates on X11.

            • Hexarei@programming.dev

              I run multiple refresh rates without any trouble, one 165hz monitor alongside my other 60hz ones. Is that supposed to be broken somehow?

            • michaelrose@lemmy.ml

              This wasn’t true in 2003 when I started using Linux; in fact the feature is so old I’m not sure exactly when it was implemented. You have always been able to have different resolutions and, in fact, different scaling factors. It works like this:

              You scale your lower-DPI display or displays UP to match your highest-DPI one and let X scale down to the physical size: HIGHER / LOWER = SCALE FACTOR. So with two 27" monitors where one is 4K and the other is 1080p, the factor is 2; a 27" 4K with a 24" 1080p is roughly 1.75.

              Configured like so, everything is sharp and UI elements are the same size on every screen. If your monitors are vertically aligned you could put a window between monitors and see the damn characters lined up correctly.

              If you use the soooo unfriendly Nvidia GPU you can actually configure this in its monitor-configuration GUI. If not, you can set it with xrandr; the argument is --scale, shockingly enough.

              Different refresh rates also of course work, but you ARE limited to the lower refresh rate. This is about the only meaningful limitation.
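
              To make the HIGHER / LOWER arithmetic concrete (panel widths are approximate, and the output name is a placeholder):

                  # 27" 16:9 panel is ~23.5" wide; 24" panel is ~20.9" wide.
                  # 4K 27":    3840 / 23.5 ≈ 163 DPI
                  # 1080p 24": 1920 / 20.9 ≈  92 DPI
                  # 163 / 92 ≈ 1.77, so a factor of about 1.75:
                  xrandr --output HDMI-1 --scale 1.75x1.75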

      • woelkchen@lemmy.world

        This is not an insult to the people behind X11.

        The people behind X11 agree and that’s why they founded Wayland.

    • Static_Rocket@lemmy.world

      No, no, they’ve got a point. The architecture of Wayland is much more sane. Because of the way refresh events are driven, it’s also much more power- and memory-efficient. I’ll miss bspwm and picom, but man, there is a lot riding on simplifying the graphics stack under Linux. The X hacks, GLX, and all the other weird interactions X decided to take away from applications made things non-portable to begin with, and a nightmare for any embedded devices that thought GLES was good enough.

    • Sh1nyM3t4l4ss@lemmy.world

      There are several remarks in that article that bothered me. I agree with their message overall and am a strong proponent of Wayland but…

      Unless your workflow (and hardware) comes from 20+ years ago, you have almost no reason to stick with Xorg

      There definitely are valid use cases that aren’t 20 years old that will keep you on X11 for a little while longer. And hardware, too: NVIDIA dropped driver support for Kepler GPUs and older before they added GBM support, which is effectively a requirement for Wayland, so you can’t use those older cards on Wayland with the proprietary drivers.

      Of course, NVIDIA likes to do their own thing, as always. Just use Nouveau if you want to do anything with Xwayland, and you don’t have several GPUs.

      Uh, no. Nouveau is not a serious option for anyone who likes using their GPU for useful things. And on those older cards it will likely never work well.

      The author of that article seems extremely ignorant of other people’s needs.

      • woelkchen@lemmy.world

        NVIDIA dropped driver support for Kepler GPUs and older before they added GBM support, which is effectively a requirement for Wayland, so you can’t use those older cards on Wayland with the proprietary drivers

        That’s definitely the fault of people who buy NVidia hardware, which only works well on Windows. It’s not the fault of Wayland developers that NVidia is a shit company that does not care to make their hardware run properly on Linux.

        • Sh1nyM3t4l4ss@lemmy.world

          Can we stop shaming people who buy NVIDIA?

          For one, people want to keep using what they have and not buy something new just because it may work better on Linux, and they may not even be able to afford an upgrade. They probably didn’t even know about Linux compatibility when they got it.

          And additionally, some people have to use NVIDIA because they rely on, e.g., CUDA (which is unfortunate but not their fault).

          And honestly, NVIDIA is fine on Linux nowadays. It sucks that support for older cards will likely stay crappy forever but hopefully with the open kernel drivers and NVK newer cards won’t have to suffer that fate.

          • woelkchen@lemmy.world

            Can we stop shaming people who buy NVIDIA?

            Can people who buy NVidia hardware contrary to widespread wisdom just start to own up to their decisions and not complain about Wayland every time it is mentioned?

      • michaelrose@lemmy.ml

        The author is a Wayland fanboy, which almost by definition makes them a moron. We are talking about folks who were singing the same song seven years ago, when the crack they were promoting was outrageously broken for most use cases.

    • russjr08@outpost.zeuslink.net

      I find that when people write “Full stop”, it’s usually best to just stop reading there.

      It comes off as “I am correct, how dare you think that for a moment I could be wrong”.

      I’d love to use Wayland, but until it works as well on Nvidia hardware as X11 does, it’s not a viable option for me. Of course, then someone always goes “well then use an AMD card”, but money doesn’t grow on trees. The only reason I’m not still using a 970 is that a friend of mine was nice and gave me his 2080 that he was no longer using, along with some other really nice upgrades to my hardware.

      Honestly, it’s one of the biggest issues I have with the Linux community. I love Linux and FOSS software, but the people who go around and yell at anyone who isn’t using Linux, and the people who write articles like this that try to shame you for your choices (something that is supposed to be a hallmark of using open source software), only make Linux look bad.

      There’s a difference between someone kindly telling others that X11 is not likely to receive any new major features and bug fixes (which is the right thing to do, in order to inform someone of something they may not know), and whatever the author of this quote is doing.

      • happyhippo@feddit.it

        It happens all the time in the magical world of closed source, too.

        Ever heard about the iOS vs Android fights? How people shame Android users for being green bubbles?

        It’s just my-camp-versus-theirs tribalism extended to the tech field, nothing new.

        • pelotron@midwest.social

          I laughed off reports about this kind of thing, thinking “omg who could possibly give a shit about what color their text bubble is in a group chat?” Later my gen Z office mate told me about how he uses an iPhone and cited this exact reason unironically. I was stunned into silence.

          • zwekihoyy@lemmy.ml

            There’s a decent amount of research into the psychology behind it, and into how reading white text on the light green is more difficult than on the blue bubble. It’s rather interesting.

            Edit: although I would think dark mode should change that effect a little bit.

        • HouseWolf@lemmy.ml

          I’ve noticed it more and more over the years: people will fight tooth and nail to defend a product for no other reason than self-validation.

          I’ve even had one person try to sell me on OperaGX as if they were reading off an ad. When I asked them technical questions about it they just pulled the conversation back to selling up the gimmicks. I finally asked them straight-out why they were advertising something for a company they don’t work for, and they just got offended. It was kind of a surreal experience.

        • russjr08@outpost.zeuslink.net

          Oh absolutely, I am sadly far too well aware of those cases (especially the “green bubbles” thing; I’ve never rolled my eyes harder at a silly situation).

          It’s not even strictly a tech thing either; it’s a long-standing pattern in human history no matter where you look, and unfortunately I don’t see it going away any time soon.

      • bemenaker@lemmy.world

        It sounds like you need to complain to Nvidia about doing a better job with their drivers. If the drivers suck, it doesn’t matter what Wayland does.

    • Auli@lemmy.ca

      OK, but then how about the developers of X11 who decided it wasn’t worth fixing the issues and started a new project called Wayland, where they could fix them from scratch? Does that change your mind at all?

      • duncesplayed@lemmy.one

        That would be a “technical merit”, which the article author claims is irrelevant to the discussion.

        • West Siberian Laika@lemm.ee

          I don’t want to sound rude, but how old is your setup? Are you using a desktop or a laptop computer?

          Because I’m daily-driving a late-2015 Dell XPS 9350, and X11 just ain’t cutting it even on a laptop that is nearly a decade old. On X11, its trackpad would be garbage, GNOME’s animations would be stuttery, and fractional scaling would be a mess, because I have a docking station with a 75 Hz ultrawide monitor, meaning that I must use both 125% and 100% scaling factors, as well as 60 Hz and 75 Hz refresh rates and different resolutions. Sure, not everyone uses multi-monitor setups, but those who do serious office tasks or content-production work often cannot imagine their workflow without multiple monitors. Point is, X11 is too ancient to handle such tasks smoothly, reliably and efficiently.

          • 0x0@social.rocketsfall.net

            It’s not rude - don’t worry. My main desktop runs 4 monitors at 1080p. GPU is an RX 580. I have a number of other laptops/tablets/desktops running similar configs, including ones with mixed resolutions and refresh rates. Gaming/video production/programming.

            I think people are really discounting the amount of value experience with a certain set of software has to the end-user. Wayland isn’t a drop-in replacement. There’s a new suite of software and tooling around it that has to be learned, and this is by design. Understandably, many people focus on getting displays working properly on mixed resolutions and refresh rates, but there are concerns for usability/accessibility outside of that.

    • michaelrose@lemmy.ml

      This is literally the exact bad attitude of your average Wayland proponent: the thing that has worked for 20 years “doesn’t work”, you just hallucinated it, along with all the show-stopper bugs you encountered when you tried to switch to Wayland.

      • orangeboats@lemmy.world

        It’s really not “working” per se. VRR was broken on X11, sandboxing was broken on X11, fractional scaling and mixed DPI were broken on X11.

        How did we achieve HiDPI on X11? By changing Xft.dpi (breaking old things) or adding random environment variables (terrible UX; do you want to worsen the Linux desktop’s reputation even more?). Changing XRandR? May your battery life be long-lasting.
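
        For the record, those workarounds look roughly like this (every one of them is global to the session rather than per monitor):

            # 150% fonts for every X11 client at once, via the X resource database
            echo "Xft.dpi: 144" | xrdb -merge
            # toolkit-specific environment variables
            export GDK_SCALE=2 GDK_DPI_SCALE=0.5   # GTK: integer scale, shrink fonts back
            export QT_SCALE_FACTOR=1.5             # Qt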

        There’s genuinely no good way to mix different DPIs on the same X server, even with only one screen! On Windows and Mac, the old LoDPI applications are scaled up automatically by the compositor, but this just doesn’t exist on X11.

        I focus on DPI because it is a huge weakness of X11 and there is a foreseeable trend of people using HiDPI monitors more and more. There are tons of other weaknesses, but people tend to sweep them under the rug as exotic. And please don’t call HiDPI setups exotic: for all the jokes about the eternal 768p screens that laptop manufacturers like to use, mainstream laptops are moving to 1080p, and on a 13" screen, shit looks tiny if you don’t scale it up by 150%.

        You can hate on Wayland, you may work on an alternative called Delaware for all I care, but let’s admit that X11 doesn’t really work anymore and is not the future of the Linux desktop.

        • michaelrose@lemmy.ml

          Outside of your fantasies, high DPI works fine. Modern Qt apps seem to pick it up fairly automatically now, and GTK does indeed require a variable, which could trivially be set for the user.

          Your desktop relies on a wide variety of env variables to function correctly; that doesn’t bother you because they are set for you. This has literally worked fine for me for years. I have no idea what you think you are talking about. Wayland doesn’t work AT ALL for me out of the box without ensuring some variables are set, because my distro doesn’t do that for me; this doesn’t mean Wayland is broken.

          • orangeboats@lemmy.world

            They pick it up “automatically” because of how your DE sets up the relevant envvars for you; there is nothing in the protocol that actually tells applications “hey, this monitor needs X% DPI scaling!”.

            The side effect of this deficiency in the protocol is very obvious: you can’t mix DPIs, because the envvars or Xft.dpi are global and not per-application. Have you seen a blurry LoDPI X11 window sitting right beside a HiDPI X11 window? Or an X11 window changing its DPI dynamically as you move it across monitors with different DPIs?

            The fact that SDL2 still doesn’t support HiDPI on X11 when it already does on Macs, Windows, and Linux Wayland should tell you something.

            Don’t throw the “it works for me” excuse at me, because I can throw it back at you too: “Wayland works on my machine”. X11 is utterly broken; just admit it. You are welcome to develop another X11 if you want.

            • michaelrose@lemmy.ml

              Nothing is set automatically; I run a window manager and it starts what I tell it to start. I observed that at present fewer env variables are required to obtain proper scaling. I did not personally dig into the reasoning for this because frankly it’s an implementation detail. I just noted that Qt apps like Dolphin and Calibre are scaled without benefit of configuration, while GTK apps like Firefox don’t work without GDK_SCALE set.

              X actually exposes both the resolution and physical size of displays. This gives you the DPI, if you happen to have mastered basic math. I’ve no idea if this is in fact used, but your statement that NOTHING provides it is trivially disprovable by running xrandr --verbose. It is entirely possible that it’s picking up on the globally set DPI instead, which in this instance would yield the exact same result, because (wait for it):
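
              As an aside, the physical-size math referenced here works like this (the xrandr output line is illustrative; the exact format varies):

                  $ xrandr | grep ' connected'
                  DP-1 connected primary 3840x2160+0+0 ... 597mm x 336mm
                  # DPI = pixels / (mm / 25.4): 3840 / (597 / 25.4) ≈ 163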

              You don’t in fact even need apps to be aware of different DPIs or to dynamically adjust; you may scale everything up to the exact same DPI and let X scale it down to the physical resolution. This doesn’t result in a blurry screen. The 1080p screen, while not as pretty as the higher-res screens, looks neither better nor worse than it does without scaling.

              Why would I need to develop another X11? I believe I shall go on using this one, which already supported high and mixed DPI just fine back when Wayland was a steaming pile of shit nobody in their right mind would use. It probably supported it when you yourself were in elementary school.

              • orangeboats@lemmy.world

                Nothing is set automatically; I run a window manager and it starts what I tell it to start. I observed that at present fewer env variables are required to obtain proper scaling.

                Fun fact: zero envvars are needed for HiDPI support on Wayland.

                You do possibly need envvars to enable Wayland support, though, but the latest releases of Qt6, GTK4, SDL3, etc. are enabling Wayland by default these days, so in the future everything will work out of the box. By default.
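
                The opt-in variables in question look like this (real variable names; whether they are needed depends on toolkit version and distro defaults):

                    export QT_QPA_PLATFORM=wayland   # Qt 5/6
                    export MOZ_ENABLE_WAYLAND=1      # Firefox
                    export SDL_VIDEODRIVER=wayland   # SDL2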

                X actually exposes both the resolution and physical size of displays. This gives you the DPI, if you happen to have mastered basic math. I’ve no idea if this is in fact used, but your statement that NOTHING provides it is trivially disprovable by running xrandr --verbose.

                Did I say XRandR and mixed DPI in my previous comments? Yeah, I think I did. What the Qt applications currently do is choose the max DPI and stick with it. There are some nasty side effects, as I will explain below.

                You don’t in fact even need apps to be aware of different DPIs or to dynamically adjust; you may scale everything up to the exact same DPI and let X scale it down to the physical resolution.

                Don’t forget the side effect: GPU and/or CPU demands (depending on the renderer) increase… a lot, nearly 2x in some cases. This might not be acceptable on devices like laptops; have you used projectors in college?

                Anecdotally speaking, I gained 1 to 2 hours of battery life just by ditching X11, which is impressive considering my battery life was like 4 to 5 hours back then. Now it’s actually competitive with Windows, which usually gets 6 to 7 hours of battery life.

                Furthermore, scaling up and down in multiple passes, instead of letting the clients do it in “one go” and having the compositor scan the result directly out to your screen, leads to problems in font rendering because of some antialiasing shenanigans, in addition to the power-consumption increase. It’s the very reason Wayland added a fractional scaling protocol.

                Why would I need to develop another X11? I believe I shall go on using this one, which already supported high and mixed DPI just fine back when Wayland was a steaming pile of shit nobody in their right mind would use.

                Apparently the “nobody” includes GTK, Qt, SDL, and all the mainstream DEs (Xfce and Cinnamon included; even they are preparing to add Wayland support). 90% of the programs I use actually support Wayland pretty well. Good job lad, you managed to invalidate your own argument.

                Besides that, you still haven’t properly answered the question of mixed DPI: have you seen a properly-scaled-up LoDPI X11 application? It’s a big problem for XWayland developers. See it here. And yes… those developers are (were?) X11 developers. I think they know how unworkable X11 is, more than you do.

                • michaelrose@lemmy.ml

                  It doesn’t take a meaningful or measurable amount of CPU/GPU power to scale my third monitor. That is to say, in practical effect, actual usage of real apps so dwarfs any overhead that it is immeasurable statistical noise. In all cases nearly all of the CPU power is going to the multitude of applications, not to drawing more pixels.

                  The concern about battery life is probably equally pointless. People are normally worrying about scaling multiple monitors in places where they have another exciting innovation available… the power cord. If you are kicking it with portable monitors at the coffee shop, you are infinitely more worried about powering the actual displays than about the GPU power required to scale them. Also, some of us have actual desktops.

                  Furthermore, scaling up and down in multiple passes, instead of letting the clients do it in “one go” and having the compositor scan the result directly out to your screen, leads to problems in font rendering

                  There are some nasty side effects

                  There just aren’t. It’s not blurry. There aren’t artifacts. It doesn’t take a meaningful amount of resources. I set literally one env variable and it works without issue. In order to feel justified, you absolutely NEED this to be a hacky, broken configuration with disadvantages. It’s not; it’s a perfectly trivial configuration, and Wayland basically offers nothing over it save for running in place to get back to the same spot. You complain about the need to set an env var, but switching to Wayland would be a substantial amount of effort, and you can’t articulate one actual benefit, just fictional deficits I can refute by turning my head slightly.

                  Your responses make me think you aren’t actually listening. For instance:

                  X11 is utterly broken; just admit it. You are welcome to develop another X11 if you want.

                  Why would I need to develop another X11? I believe I shall go on using this one, which already supported high and mixed DPI just fine back when Wayland was a steaming pile of shit nobody in their right mind would use.

                  Apparently the “nobody” includes GTK, Qt, SDL…

                  Please attend more carefully. Scaling and high DPI were a thing on X back when Wayland didn’t work at all: xrandr supported --scale back in 2001, and high-DPI support was a thing in 2012. Wayland development started in 2008, and in 2018 it was still an unusable, buggy pile of shit. Those of us who aren’t in junior high school needed things like high DPI and scaling back when Wayland wasn’t remotely usable, and now that it is starting to get semi-usable I for one see nothing but hassle.

                  I don’t have a bunch of screen tearing, I don’t have bad battery life, I have working high DPI, I have mixed DPI, and I don’t have a blurry mess. These aren’t actual disadvantages; this is just you failing to attend to features that already exist.

                  Imagine if, at the advent of automatic transmissions, you had 500 assholes on car forums claiming that manual-transmission cars can’t drive over 50 MPH / 80 KPH and break down constantly, instead of touting actual advantages. It’s obnoxious to those of us who discovered Linux 20 years ago rather than last week.

            • michaelrose@lemmy.ml

              Why on earth would I develop “another X11” instead of using the one that still works perfectly fine?

        • WuTang @lemmy.ninja

          How did we achieve HiDPI on X11? By changing Xft.dpi (breaking old things) or adding random environment variables (terrible UX; do you want to worsen the Linux desktop’s reputation even more?).

          You seem to have dealt with Windows recently.

          Regarding Linux on the desktop… as long as you don’t involve smelly gamers, it’s perfectly fine.

          • orangeboats@lemmy.world

            I have been daily-driving Linux for years, but I do boot into Windows from time to time. Even then, I recognize that the out-of-the-box experience of the Linux desktop isn’t as good as it could be, although it’s been rapidly improving.

  • calzone_gigante@lemmy.eco.br

    Replacing a good legacy system will always be a struggle. X11 works pretty well and has been stable for decades. Most of the things that suck about it already have workarounds.

    The advantages of Wayland are not directly visible to the end user. The security part will be great once it’s completely integrated into distributions, with granular permissions for software. The simpler APIs and greater performance will help library creators, but most developers don’t touch X directly and won’t touch Wayland either.

    Being stable for a couple of months is not good enough. People will use it once distros trust it enough to make it default, and this will probably only happen once Wayland or its compatibility tools work with most software and major applications work significantly better on it.

  • 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍@midwest.social

    Every time I try Wayland, something doesn’t work. The time before last, subpixel DPI scaling was badly broken. This last time, there’s some glitch where the screen jumps right a couple pixels (and back) every dozen seconds. I don’t have any interest in spending my time trying to fix Wayland issues when X just works.

    • English Mobster@lemmy.world

      Amen.

      When something crashes on Wayland, my entire system goes down. When something crashes on X, I can at least kill it with a GUI. I refuse to use Wayland as long as it has the potential to freeze my entire machine.

      (This is on KDE Plasma.)

      • rbar@lemmy.world

        I just don’t think Plasma on Wayland will be worth it until KDE 6 / Qt 6. Basic issues like SDDM support for Wayland still have to be solved before KDE can provide a first-class experience. Try messing around with environments like Sway, Hyprland, and GNOME; the stability difference compared to KDE is night and day.

        • inspector@gadgetro.id

          Yep, I second this. I’ve been using GNOME, Hyprland and Sway interchangeably as my main DEs since October 2021. I recently wanted to give Plasma Wayland a try and was met with multiple crashes, and freezes that required me to long-press the power button and reboot to get things working again.

          While the new rootless Wayland support in SDDM worked fine for me, there are several things in Plasma that still don’t support Wayland. I could never get screen sharing to be reliable on Plasma Wayland, despite having the right portals installed.

          GNOME, Sway, and Hyprland are miles ahead in having a stable system on Wayland.

  • AItoothbrush@lemmy.zip

    I switched to Wayland because of screen tearing, and it fixed it. I don’t know if X is still glitchy on my new laptop, but I don’t really care. Also, Hyprland is really cool, so I’m happy with Wayland.

  • slembcke@lemmy.ml

    I’ve been using it for a few years now; it fixes a lot of little issues I had with X11, and at this point brings very few of its own. ALTHOUGH, I don’t have any Nvidia GPUs, and people seem to think it works like crap on them. I keep hearing “Ah, this will finally fix it!”, but I don’t know what the actual status is. You have the hardware you have, so unless you are going to buy something different to try Wayland… eh… I guess it never hurts to try. It’s pretty trivial to toggle on and off.

    • Sh1nyM3t4l4ss@lemmy.world

      I have a laptop with hybrid Intel+NVIDIA graphics, and I can say that offloading games and such to the dGPU while letting the iGPU handle everything else works with zero issues for me on Wayland.

      On desktops where the NVIDIA GPU handles everything I don’t have that much experience with Wayland, although when I did try it earlier this year it was surprisingly good, with occasional dumb bugs like Plasma panels freezing or XWayland apps breaking in funny ways. Honestly, though, just a few years back running Plasma X11 on NVIDIA wasn’t much better than Wayland is now.

      • slembcke@lemmy.ml

        Interesting. My laptop died a little while ago, and I needed to demo a game I’m working on at a local convention. My wife had a hybrid-GPU machine and let me swap in my SSD to run it. The drive had PopOS on it without the NVIDIA drivers. It did seem to run Wayland fine on the internal display, but the external display was picky (I wanted to demo on a bigger display). The only way to get the game to run smoothly was to disable the internal display using X11, and run the game using GL instead of Vulkan. >_<

        So yeah, it kinda mostly worked, as long as I used it as a laptop. I can see how it gets to be a pain if your needs are specific, though.

  • LeFantome@programming.dev

    When both NVIDIA and KDE work well with Wayland, most of the anti-Wayland energy will go away. The advocates will calm down too, because they will have won.

    • ExLisper@linux.community

      I don’t think the sentiment is “anti-Wayland”. Most people just don’t care. I’m using Awesome WM and it doesn’t support Wayland. As OP says, why would I rewrite all my plugins and config just for the sake of switching to Wayland? I would have to invest a lot of time, and what would I gain? Absolutely nothing. On my work computer I have a different distro and I’m using Cinnamon. I think it uses Wayland, but I didn’t even bother to check. It works exactly the same as GNOME on X11. Why would I care?

    • cobra89@beehaw.org

      Counterpoint: I have all-AMD machines (CPU and GPU), and each time I’ve tried Wayland I’ve immediately run into bugs that make it unusable. Maybe it’s because both my setups have multiple monitors with different resolutions, but I don’t see why that use case is so hard to support. And I’m running the latest versions of Wayland and KDE, so it’s not a matter of running outdated versions whose bugs have already been fixed upstream. If Wayland can’t handle basic desktop use with multiple resolutions, why would I go through the effort of using it? Fix the basics first.

      • the_weez@midwest.social

        My experience has been the opposite. I won’t use X after using Wayland on AMD for years; it just feels so much smoother. On Arch with GNOME, Wayland has been fantastic.

  • nyan@lemmy.cafe

    Wayland’s major “technical merits”, as far as I can tell, are a lack of screen tearing, slightly faster rendering under some circumstances and better handling of touchscreens. That’s it. If you don’t have a touchscreen and aren’t a gamer (few non-gamers care all that much about tearing or about framerates above 60Hz), Wayland has no real advantages to the user that I’m aware of.

    X is network-transparent, more widely compatible, and arguably more extensible. Most users don’t care about those things either.

    Wayland has an advantage in attracting developers because it has less accumulated technical debt and general code cruft. That doesn’t make it better for users, though. Most Wayland evangelists I run into seem to be devs who are more interested in the design of the graphics stack than whether it makes a difference in the real world.

    So, as with so many things, “merit” is in the eye of the beholder. People should use what works for them.

      • nyan@lemmy.cafe

        I’ll give you the multiple screens (not a use case I have myself, so I don’t pay attention to support quality). Isolation of applications is another thing that most users don’t really care that much about, I would say.

        • tal@kbin.social

          It’s legitimately important if you want to be able to pull random software from places and not have your system compromised, a la smartphone OSes.

          It’s not the whole story – things still aren’t entirely sandboxed aside from that – but without it, the GUI is a big security hole.

      • MazonnaCara89@lemmy.ml

        And don’t forget 1:1 gestures and crash-resilient Wayland compositing, which keeps applications alive even when the compositor crashes, so it can restart without any data loss.

        Edit: forgot to mention the lock-screen protocol, because on Xorg, if the lock screen crashes, you can see the desktop and the device is unlocked!

  • theshatterstone54@feddit.uk

    People on Unix environments that don’t have Wayland support.

    That’s a big one. All the *BSD folk will keep on using X at least until it gets proper support over there (which might never happen) and even then it will still be boycotted by some BSD users for other reasons.

    People using mainstream desktop environments that already support Wayland … [but their distro hasn’t made the switch]

    I agree about that. Many people don’t care and will just use whatever their distro tells them to use. As you said, there’s usually good reason for it.

    People using desktop environments or custom X setups that don’t (currently) support Wayland.

    This is another one, and it’s actually one I kinda fall under. I use a tiling window manager. The tiling Wayland compositors are oftentimes not as polished, and a big annoyance for me personally is the fact that most of them (River, Hyprland, DWL) don’t come with a bar. Of my X window managers, AwesomeWM, DWM and Qtile already have their own bars; BSPWM is basically supposed to be used with Polybar, the same way XMonad and xmobar are basically made for each other. On Wayland, Somebar is made for DWL, but waybar and yambar also work really well with it. Sway has swaybar, but waybar works perfectly with it. Both waybar and yambar work great with River. And there’s waybar, and gBar, and other bars for Hyprland. And that’s without mentioning EWW, which can be used to make a bar.

    Another issue I have is that my touchpad doesn’t get detected if I’m holding down a key. So if I’m playing Minecraft and trying to turn around and run away from a zombie using my touchpad because my mouse’s battery ran out, I have to do these actions one by one and hope I survive, or just let myself die. That’s just an example, but I have noticed it in other games as well. No such issues on X. And I’ve also had PowerWash Simulator, run through Wine, just crash on me in some compositors (Qtile or Hyprland) but not others. In DWL, for some reason, I couldn’t turn all the way around (my movement was restricted to 270°), and in River I had no issues.

    When you have a monopoly

    You’re saying this as if X didn’t have a monopoly over Unix graphics.

    • Zamundaaa@discuss.tchncs.de

      Another issue I have is that my touchpad doesn’t get detected if I’m holding down a key

      That’s a libinput feature, meant to prevent you from accidentally using the touchpad when you’re typing. You can disable it if you want.
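
      Under Sway, for instance, it can be toggled at runtime (a hedged sketch; other compositors expose the same libinput option differently):

          # dwt = disable-while-typing; see sway-input(5)
          swaymsg input type:touchpad dwt disabled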

    • English Mobster@lemmy.world

      If I run Satisfactory via Vulkan on X, it causes my entire desktop to flicker until I close the game, on all screens. Annoying, but at least I can make it go away.

      If I run Satisfactory via Vulkan on Wayland, it crashes Wayland and my entire computer freezes until I hard reboot it by pressing the power button. That is absolutely unacceptable.

      (Satisfactory on DX12 works fine for both, but the point is Wayland is still much more likely to fail catastrophically.)

  • HakFoo@lemmy.sdf.org

    What has kept me away from Wayland is the tendency to be dependent on the compositor for so much.

    I use my preferred X11 window manager for largely aesthetic reasons, but by and large, I can swap it out and the rest of the software doesn’t give a damn. At most, you might have to tweak an RC file to fix custom assumptions that no longer hold (e.g. disabling decorations on full-screenified Proton games).

    It seems like on Wayland, there’s a lot more of an “if you aren’t using GNOME or KDE, the odds something meaningful breaks are much higher” situation. Aside from the perceived bulk of these environments, they’re highly opinionated; I suspect it would be a major production number to hammer them into a shape that looked like FVWM or WindowMaker, even if you only wanted to match a single theme’s aesthetics (as opposed to, say, FVWM’s dynamic configurability).

    • OneCardboardBox@lemmy.sdf.org

      If you pick a Wayland compositor based on wlroots, you basically get that same ability to swap out the window manager.

      The wlroots project aims to be a common library that any compositor can pull in without having to implement the required Wayland protocols itself.

      • HakFoo@lemmy.sdf.org

        But even that’s a relatively high bar. wlroots describes itself as “60,000 lines of code you don’t have to write yourself”, and any arbitrary compositor may not use it or may not be up to date with it. In X11, you don’t need 60,000 lines of code to be functional. Hell, the example window manager that was printed as a couple of chapters in the old X11R5 reference books works well enough, especially considering its size.

        I feel like I missed the historic genesis of this particular quagmire. Knowing that a compositor was essential, you’d expect developers to have wanted very robust core functionality (a super-rich libweston, or something like wlroots) so that “real” compositors would just be paper-thin extensions answering the opinionated parts. Did early Wayland design get bogged down in embedded-style use cases where such features were seen as too expensive (compare: no built-in printf in C), or was it a deliberate territory grab by early compositor developers, trying to turn it into a place where they could gain competitive advantage?

  • BlinkerFluid@lemmy.one

    The utilities that replace the utilities you’re used to on X11 work great, and so do the utilities that already work on X11.

    That’s, um… not the best motivation.

  • hubobes@sh.itjust.works

    I literally don’t care. I don’t have any issues with X11 on PopOS and I will switch when System76 decides it’s time.

    • WuTang @lemmy.ninja

      It should not be. The first reason should be that it is not open source and is 100% the cause of X black screens on upgrades.

      AMD plays the game (no pun intended), so let’s go with it. And if you need NVIDIA for CUDA for ML, standards are on the way that will allow using any GPU.

      • TechieDamien@lemmy.ml

        I already do ML on AMD, and it works great. There are usually a few extra steps that need doing, as binaries aren’t always available, but that, too, will improve with time.

    • woelkchen@lemmy.world

      Only reason I’m not using it is Nvidia.

      Don’t buy Nvidia GPUs. NVidia’s broken Linux support has been a well-known fact for at least a decade.

      • Ineocla@lemmy.ml

        For gaming, AMD is as good as NVIDIA or even better. For anything else, though, it’s a dumpster fire: AMF still isn’t on par with NVENC, ROCm is pure garbage, and they are basically useless for any compute task.

        • woelkchen@lemmy.world

          For anything else, though, it’s a dumpster fire: AMF still isn’t on par with NVENC, ROCm is pure garbage, and they are basically useless for any compute task

          Those specific compute tasks are not “anything else”. Pretty much every everyday task done by common people works better on GPUs with proper Mesa drivers than on a GeForce, and there is absolutely no reason you need to output your graphics from the NVidia GPU anyway. Do your compute tasks on dedicated Nvidia hardware if you have to. Even notebooks come with AMD and Intel iGPUs that are perfectly fine for non-gaming graphics output.

          • Ineocla@lemmy.ml

            Yep, you’re right, Mesa covers almost everything. But streaming and recording, photo and video editing, 3D rendering, AI training, etc. aren’t “specific compute tasks”; they represent the vast majority of the market, with billions of dollars in revenue. And no, the solution isn’t to use another GPU; it’s for AMD to make their software stack actually usable.

            • woelkchen@lemmy.world

              photo and video editing

              Which photo editor for Linux even supports special NVidia features? It’s not like Linux has Photoshop or something like that; there aren’t that many photo editors on Linux. It’s one of the areas Windows people complain most loudly about when it comes to Linux. Seems to me you conflated Windows with Linux when hyping Nvidia above everything.

              AI training, etc. aren’t “specific compute tasks”

              AI training isn’t a specific compute task? What is it then? Why do you train your AI on the GPU that drives your regular graphics output, and not on dedicated hardware like sane people?