Uschteinheim @lemmy.world

Summary of all problems I have with Linux


Source: https://gist.github.com/PJB3005/424b26b2cd42e2a596557f0bcfc6f2b2 (linux_sucks.md)

Linux Desktop Issues

This is basically my manifesto of why Linux sucks and why I keep using Windows as a desktop OS, both as a developer platform and as an end-user target.

Look: I would love to be able to use Linux as a daily driver. KDE is amazing, and they clearly put far more effort into the desktop experience/UI than Windows does (just look at the volume mixer alone). But there are far too many underlying, infrastructural problems with the Linux desktop that none of KDE's great UI work can make up for. I want Linux fanboys, developers, etc. to stop sticking their damn heads in the sand about these issues and admit that Linux is still decades behind in some basic infrastructure. This shit can't get fixed if people refuse to admit it's broken in the first place, which some people are far too happy to do.

Desktop App Experience & Developer Tools

Windows has far better desktop apps than Linux, and thanks to WSL I have all of Linux's CLI apps too. While I do believe KDE Plasma is a much better desktop environment than Windows 10, the app ecosystem on Windows beats Linux into the ground hard. This goes for normal power-user stuff as well as programmer-oriented tooling. I can't provide a perspective on non-technical users because, well, I'm not one of them. Coming up with lists like these is hard because I take everything here for granted now, so if I remember this document exists, this list will expand a lot.

The common thread between all of the tools I will be listing is that any "alternatives" you can find on Linux have 10% of the features at best, or are far more cumbersome to use. Getting any more than that probably involves anything from shell scripting and awkward command-line tools to trying to interpret /proc.

For power-user tools, here are some of my favorites that have no comparable analogue on Linux:

ShareX: Yeah, there are screenshot utilities, but they're all a bad joke compared to ShareX's impressively useful feature set. You'll never know how useful easy-to-access OCR is until you need it and remember it exists.[1] All "alternatives" proposed for Linux are cumbersome to use, don't have 10% of the feature set, are far more complex to set up with hotkeys, and so on.

Programmer tools are even worse here. While on Windows most of your functionality is accessible through GUI tools, on Linux the best you often have is some curses-based terminal tool. Bad tools are gonna be bad tools, but good GUI tools will destroy anything that is "good" by CLI standards. The general impression I get is that Linux users put up with subpar command-line development tools, while Windows devs get far superior GUI tools. If power-user/casual stuff were all of it, I wouldn't even have this header, because a large part of that problem is obviously just desktop market share. But that argument doesn't hold up when my biggest problem is developer tools, not power-user/casual stuff. You would think that Linux, an OS so often associated with developers, would have better tools here. But nope, that's not the case.

Programmer tools:

• Process Hacker: it's htop on steroids.
• Debugging tools: the state of the art on Linux here is still GDB/LLDB or barely functional wrappers around them. WinDbg and many other tools like x64dbg are way ahead and have functional UIs.
• Event Tracing for Windows (ETW) and accompanying tools like WPA.
• DependenciesGui / Process Hacker's peview.exe: I hate having to use a vague combination of nm, ldd, etc. with barely rememberable flags. These tools are much better organized and easier to use.

I would like to give an honorable mention to PowerShell, which is now cross-platform, so it technically doesn't fit on this list (and yes, I use it as my daily-driver Linux shell). But it's still a Windows-centric Microsoft invention that, in my humble opinion, kicks the teeth out of any Linux shell.

I also need to clarify that, because I mostly work in C#/.NET, some of the tools I use for that are Windows-only (PerfView comes to mind). I recognize that this is not Linux's fault, however, so I will not be listing them here. I am happy to report that Rider is a VERY good developer experience for C# on Linux.

Fractional DPI Scaling

DPI scaling is the act of scaling an app up by a factor like 150% so that UI elements have the correct physical size on displays with different pixel densities. A typical "100%" desktop monitor is ~96 DPI (the number used by Windows and some X11 apps), while macOS traditionally used ~72 DPI as its baseline, IIRC. A 1080p 15" laptop display is closer to a 125% scale factor for Windows, so all UI elements need to be rendered 125% of their normal pixel size to appear the same physical size as on a typical desktop monitor. "Fractional" refers to allowing apps to scale by non-integer amounts (like 125%), as opposed to only integer factors (200%, 300%), which are much easier to implement.

Fractional DPI scaling is critical, since many configurations (especially laptops) need it for the UI to be sized appropriately. Microsoft has understood that this technology is necessary for well over a decade. Windows Vista (2007) had fractional DPI scaling on a per-system basis (Windows XP allowed changing the font DPI, but other app elements didn't scale properly with it). WPF (2006) was designed with vector drawing in mind and, as such, is fully fractionally DPI-scalable. They also managed to retrofit old Win32 apps to support fractional DPI scaling, with some caveats. More modern Windows versions have made incremental improvements along the way, and fractional DPI scaling now works very well: multi-monitor DPI scaling (different monitors can have different scale factors, and moving apps between them automatically adjusts the app's scale factor to match), proper handling of apps that are not DPI-aware at various levels (scaling them up blurrily in the compositor; better than nothing), per-app overrides for this, etc. Basically, it works great if you don't mind the occasional old app being broken.

Apple couldn't be arsed, and instead of implementing proper fractional DPI scaling in their toolkits they just started shipping "Retina" displays with all their hardware (because, you know, Apple exclusively controls both the hardware and the software they ship). These Retina displays are more than 2x the DPI of typical hardware from other (lower-cost) vendors. For fractional scaling, macOS just draws apps at 2x internally and uses the compositor to scale the result. Again, they get away with this because their displays are so damn high-DPI that you can't notice. They also had to break a few eggs and kill features like subpixel font rendering for this, and to this day macOS's font rendering is trash on low-DPI displays.[2] Basically, it works great if you only ever buy 4-figure 5K external monitors from Apple.

Where is Linux on all this? Basically nowhere, and it's actively regressing.

X11 reported DPI to applications, and some apps were able to take advantage of the raw DPI number. But there was no compositor-level support for scaling unaware apps, apps generally sucked at implementing it, there was no multi-monitor DPI, etc.

Wayland decided that integer DPI scaling is all you're gonna get; at least it has multi-monitor DPI and compositor handling now. This means my 1080p 15" laptop (not an uncommon configuration) will not get fractional DPI scaling: all apps will be blurry nightmares, downscaled by the compositor.

A huge part of this is because the UI toolkits do not support fractional DPI scaling. It's 2022 and GTK/GNOME still actively refuses to acknowledge that fractional DPI scaling is even necessary; it's been 15 years since WPF came out with full fractional scaling support. GNOME/GTK/System76 developers actually intend for integer scaling plus compositor downscaling to be the solution for fractional DPI scaling on Linux, saying "Apple does it like this, so it's the correct solution." As I stated above, this is complete horseshit, because Apple only gets away with it by having butchered their font stack and shipping Retina displays in everything. I don't know what the plan for System76's hardware line is, but GNOME developers cannot take this shit for granted.

Qt has had some support for DPI scaling since ~2016, but fractional DPI scaling, while partially implemented, is not officially supported. In my experience using KDE's apps on Windows it mostly works, though there are some problems which seem to stem more from themes. I hope it can be fixed on KDE's side and eventually enabled for real on Wayland.

All in all, the situation here is bleak if KDE developers never get a proper fractional DPI scaling protocol into Wayland/their themes and GNOME/Sys76 developers keep larping as a more incompetent form of Apple.

UPDATE: a fractional scaling protocol is in the works. I expect GNOME to never properly support it.

Fundamental Graphics Infrastructure

Linux's graphics infrastructure is held together with string and duct tape. And no, that's not because "Nvidia is bad": Wayland clearly had no trouble leaving Nvidia behind when they refused to support GBM, so Nvidia is not holding us back here. The APIs involved are decades behind and just awful.

OpenGL and everything surrounding it is a terrible API. I don't know if you've ever had the misfortune of using it, but D3D and DXGI on Windows are so much nicer and more functional in every way. DXGI originates from 2006 (Windows Vista/D3D10), and OpenGL is still a nightmare in comparison. EGL is such a pile of garbage, I hate having to use it. Did they really need separate extensions for "the concept of GPUs exists", "you can enumerate GPUs", and "you can get info about enumerated GPUs"? A complete nightmare to work with. The extensions are inconsistently implemented, and there's no writing on what is available where. DXGI just fucking works and does all of this in a clear API doc on docs.microsoft.com, with a 100% guarantee of which APIs will be available where.

Vulkan fixes most of the problems of EGL being awful, but then adds its own, such as poor old-hardware support. On Windows, I can use D3D11 and support 10-15 years of hardware (depending on the minimum feature level). The API is 10x saner than the nightmare that is OpenGL, is more than sufficient for my use cases, and is far more compatible than Vulkan (on Windows, Vulkan gets you maybe 6 years of hardware support for Intel iGPUs). It has functional and modern WSI (window system integration), which is not the case with OpenGL. Etc. I am genuinely considering making a D3D11 renderer for my cross-platform game because of all the advantages it would provide to Windows players, despite it obviously leaving Mac/Linux users in the dust. I also have no need for the huge added complexity of Vulkan, so a Vulkan renderer would be harder to develop.

Windows' graphics stack is also much more robust. On Windows, I can force all my GPUs (all of them; I am on a laptop) to be reset (dxcap -forcetdr), and I get about 2-3 seconds of black screen, after which almost all apps except games stay open without a hitch. On Linux you would currently be logged out or have to reboot your system. This is also how you can update GPU drivers on a live system without trouble. A lot of the problem here is that OpenGL was never designed for this (it defaults to trying to hide device-loss errors, which doesn't work with a modern graphics API model at all).

Fucking Wayland

Jesus Christ, I have to talk about this joke of a protocol. See, on Linux there are currently two protocols for doing stuff like opening windows, receiving keyboard input, and so on:

• X11, which dates back even further than Windows.
• Wayland, which is the new up-and-coming thing.

X11 is extremely old and crusty. So old that at some point it was decided it would be better to reinvent the whole goddamn wheel and start from scratch.[3] And now we have Wayland.

Because Wayland is new, of course, they decided to try to "fix" god damn everything. So now the whole thing is engineered on a basis of privacy and sandboxing and all that goodness. This means many fundamental things every other OS allows, like apps moving your cursor around, are now illegal, and they're effectively working backwards towards basic functionality that even X11 has. But uh, more secure, I guess.

It's a fractured nightmare of extensions and protocols. The whole standards process is slow as molasses and basic functionality gets bikeshedded about for years.

Here is some of the hilarious[4] bullshit that comes out of this. I'm sure there's going to be more:

GNOME effectively does not implement Wayland as a public protocol. See the heading further down.

Wayland surface suspension troubles

Wayland compositors block client apps while they're minimized. No other windowing system does this, and it leads to terrible hacks as programs try to work around the insane behavior. The proposal to fix this nightmare is filled with highlights such as:

• Blaming all games for being terrible software, or something equally nonsensical.
• "This is an intentional feature to save power; we can't trust the apps to do this themselves" (despite many apps, like Firefox, having worse power use because they can't suspend video decoding).
• Alternative proposals that don't fix the core problem.
• More blaming games.
• A Valve employee getting the Vulkan spec updated to make this behavior illegal, thereby hard-forcing the issue to eventually get fixed sanely.
• Arguments about privacy, saying that apps should not be allowed to know they are minimized.

Wayland doesn't have a protocol for standard mouse cursors. This means that if an app wants to show "vertical resize bar" that matches your OS cursor theme, you have to dig through a ton of wild desktop-specific locations to load the correct bitmap and upload it directly.

Wayland's clipboard protocol requires the copied-from app to stay alive, or the clipboard contents are lost. This is because clipboard content is only ever sent app-to-app via an IPC pipe; there is no mechanism for the data to be saved if the source app closes. Microsoft figured this out decades ago: small copies are simply stored by the OS, while large copies are sent app-to-app on paste, with the OS prompting "you will lose clipboard data" if closing the source app would lose the clipboard.

A command-line "copy to clipboard" utility has to run a background daemon to hold the clipboard alive. Christ.

GNOME doesn't support Wayland

Yes, of course GNOME implements it, but not in a way anybody can actually fucking use.

This is quite evident from the following issue: GNOME doesn't support server-side decorations. This is a basic requirement of a windowing API: I want to be able to open a window and customize its contents while the window border looks appropriate for the OS the user is running. This works on Windows, macOS, X11, KDE on Wayland... and not GNOME on Wayland. There's no protocol for it.

See, GNOME actually says you should just use GTK. Full stop. Use GTK. People pointing out you can't make a Vulkan context in GTK? Crickets. Shit, I'd be horrified if I had to interop with a library like GTK for my game's windowing layer. This is of course complete bullshit in every way: GNOME is effectively telling people not to use Wayland directly at all. GNOME doesn't publicly support Wayland; it's just an internal protocol GTK uses to communicate with their compositor.

Wow, that's good. I should start throwing the above claim out as an excuse to link this document to people more!

ELF Linking Troubles

Linux uses ELF for binaries and dynamic linking. Unlike Windows' PE32, ELF does not keep track of which symbols are imported from which modules. That means that unlike on Windows, where foo.dll imports memcpy from VCRUNTIME140.dll or what have you, on Linux libfoo.so depends on libc.so and takes memcpy from anywhere in the process. Symbols on Linux are process-global because of how the linkage model works, and this is a massive nightmare for forwards compatibility if you are shipping applications. Example down below.

This can already cause dumb linking bugs on its own. Static linking of dependencies has to be done carefully to avoid accidentally exporting them and conflicting with other dependencies that may be loaded. Etc...

To make matters worse, there is no direct syscall interface library like Windows' KERNEL32.dll. libc is both the expected kernel API interface on the system, and the sole C runtime library. The C runtime library is an implementation detail of the programming language you are using to build your app, and should be nothing else. Combining these into one library is massively problematic, because you cannot load multiple libcs into a process (due to the aforementioned ELF-sucks linking problems), but you need to be able to do that for forward and backwards compatibility, versioning, and encapsulation of dependencies (since libc is an implementation detail of your C compiler, NOT a system API). Windows can do this: you can have multiple C/C++ runtimes loaded into the process (although it's ill-advised for performance reasons, it can happen) and stuff just works. KERNEL32.dll contains the actual low-level kernel APIs like CreateFile and VirtualAlloc, while the C/C++ runtime libraries contain malloc, memcpy, open, etc...

Why do you need to be able to load multiple C/C++ runtimes into one process? Flatpak. The point of Flatpak is to provide a consistent base for applications, and one of the libraries in that consistent base is libc. You know what else needs to be loaded by Flatpak apps? Your graphics driver. Guess what your graphics driver's userspace libraries depend on? Flatpak works around this for Nvidia drivers by keeping a copy of the Nvidia drivers that has to be separately updated and versioned from your system's drivers. As expected, this quickly breaks down, because Nvidia requires the userspace part of the driver to match the kernel-level version. However much people on Reddit and Twitter bash Nvidia for this, it is not Nvidia's fault for having a proprietary driver: having to duplicate the userspace video driver is insane whether it's Nvidia or Mesa. It's Linux's fault for having a fundamentally fucked dynamic linking model compared to Windows. I can only imagine the coming bullshit when "Discord doesn't work with the new AMD GPU because the Mesa in the Discord Flatpak is too old".

My understanding is that libcapsule aims to be a solution here by allowing proper dependency encapsulation, and it's used by Steam's Linux Runtime/pressure-vessel stuff (which is a far saner way to do it than Flatpak). I know of at least one distro which basically just loads the host OS's graphics drivers into Flatpak directly, with no encapsulation or anything, which sounds like a compatibility nightmare, but hey, not my problem.

And no, directly syscalling Linux is a bad solution that should not be necessary. Not to mention how difficult it is, because, again, libc is both the syscall interface and the C runtime library. There are many nightmares this can cause, because libc is a control freak. For example: if you never use pthreads to make new threads, std::shared_ptr will use non-atomic refcounting, because "well, there are no extra threads, so it doesn't need to be!". So creating threads without libc's pthreads is unsafe and can cause bugs; not great if you're going for compatibility. This also means you can't statically link libc. I am not aware of any official glibc documentation for where it is legal to bypass glibc (by syscalling directly). I'll concede I haven't looked, but knowing the glibc devs there isn't any, and you should not rely on them not to eat your face for using syscall without their consent.

This is absolutely never getting fixed at this point, and it's gonna keep being a huge problem forever. Joy. It's not a problem on Windows because KERNEL32.dll and the C/C++ runtime are separate, and PE32 linking actually specifies which module (DLL) to load each symbol from.

Glibc 2.35

glibc 2.35 (February 2022) shipped a new dependency sorter or whatever. At the time of writing it is extremely buggy, causing everything from angry Valgrind output and assert aborts to erroneous library-loading failures and straight-up segfaults. All in ld.so. I had to spend a whole weekend debugging this, and as far as I could tell I was the first to find it. Then, after another week of Linux users crashing in various new scenarios, I said "fuck it" and passed a workaround env var from the launcher.

Look, bugs happen, but this is literally the lowest, most fundamental component in the system above the kernel. It leaves an extremely sour taste in my mouth, you know?

Glibc 2.36

So glibc 2.36 came out and it broke EAC. Why did this happen? Because glibc changed its default linker options to remove DT_HASH structures from its ELF files, in favor of exclusively its own DT_GNU_HASH structures.

DT_HASH is an entry in an ELF file to accelerate symbol lookups. It is standardized in the ELF specification and mandatory. It's not technically optimal, so around 2006 GNU came up with DT_GNU_HASH which is supposed to be better. Files can contain both, and if available glibc will use DT_GNU_HASH to go faster. Great! Glibc removed their DT_HASH structures (by just changing a linker option) from their own ELFs and this broke EAC (and some other software).

Some people try to lay the blame on EAC, as in "why are they using this ancient format", but uh:

• DT_GNU_HASH was never officially documented or standardized anywhere, except "look at the source code" (which is under the (L)GPL, by the way). DT_HASH, on the other hand, is plenty documented.
• DT_HASH is literally standardized as required by the ELF format.
• GNU's and LLVM's linkers only started emitting DT_GNU_HASH by default around 2018 (of course, software that manually passed the flags has included both for far longer).

If you're the EAC dev and you need to manually parse ELF files as a tamper check, which would you pick?

Why did glibc change this default, by the way? Oh yeah, to shave 16 kilobytes off the binary. Amazing savings!

When shit started hitting the fan, people in the glibc bug reports of course started pulling out arguments like "EAC's devs should just support the newer standard! It's existed for 15 years already!". Read above to see what's wrong with that nonsensical counterargument.

glibc should not be breaking ABI like this. I'm sure they didn't expect this to break ABI so mistakes happen, but they should have instantly reverted it as soon as it did instead of doubling down. All for 16 kilobytes.

Bonus points: claiming DT_GNU_HASH is documented by linking to your own email from 15 years ago, which says it isn't.

Memory Management

Setting aside the obvious jab that Windows wastes memory by being bloated compared to most Linux distros, Windows has far superior memory management for desktop use.

Windows has no overcommit crap, so there's no OOM-killer nonsense either: all memory can be backed either by physical RAM or by the page file. It is also much better at swap management. If you run out of memory on Linux and the OOM killer doesn't fire its blindfolded drunk shotgun in time, your system will probably grind to a complete halt and need to be REISUB'd. Windows has much better swap management and isn't as prone to this.

As far as I know, Linux still cannot combine memory compression with disk swap. While you can set up multiple swap devices (one for memory compression, one for disk swap), the kernel cannot intelligently move data between them (basically, it fills up memory compression and then starts dumping stuff to disk). Windows and macOS do this no problem. Windows can also prioritize swap based on desktop use (e.g. prioritizing the active window, deprioritizing minimized windows). Good luck getting this on Linux; System76 only just came out with their script for CPU prioritization based on the active window (technology Windows has probably had for two decades by now).

OpenSSL 3 & Ubuntu

OpenSSL 3 broke ABI compatibility massively. Ubuntu 22.04 only ships OpenSSL 3, not including OpenSSL 1.1 for backwards compat (like Arch seems to do).

If you published a .NET 5 app in August 2022, it will not work on Ubuntu 22.04. This is a distro which came out only 8 months after you published your app. .NET Core 3.1 was still in LTS (and also broken).

OpenSSL 3 was released only 7 months before Ubuntu 22.04 came out. The fact that Ubuntu deems it reasonable to just completely break backwards compatibility on such a tight timeframe is absolutely fucking obscene. I have no words to describe the degree of blatant disregard for basic backwards compatibility here. This is the kind of slow-moving break one would do over the span of a decade, not in a 7 month speedrun schedule to save a couple goddamn megabytes.

Somebody at Canonical thought twice about dropping OpenSSL 1.1 and decided that yes, Ubuntu 22.04 should just shit all over backwards compatibility to save some megabytes. When one of the biggest Linux distros, often considered a "baseline" for various things, completely disregards basic principles like "a program compiled a year ago should keep working", I genuinely do not know what to tell you, except that there is no fucking hope for this platform under its current governance.[5]

Targeting Linux As A Developer

"Linux is better for developers" is a common mantra. What is commonly left out is that it's better for developers to work on, not for developers to target. I can say with massive certainty that Linux takes far more effort to target, all while having a SIGNIFICANTLY lower market share. The reasons why are explained at length in the points above, but seriously: Linux breaks more and demands more maintenance effort, while also having significantly worse tech.

If I only targeted a single OS, a Linux-exclusive version would still be much more effort to develop and maintain than a Windows-exclusive version. Now consider that Linux is a tiny marketshare compared to Windows and run the numbers on that. This is not for lack of familiarity, but simply due to worse underlying technology.

Look, I'm still committed to supporting Linux, but holy shit, can redditors stop putting their damn head in the sand about how broken parts of their favorite OS can be?


US Government Officially Recommends Signal - What This Means For You

If the US government is pushing citizens to use Signal, then that just means it's a US honeypot. lol


Can You Name a Country, American?

OMG, it can't be! How can these dumbfuck Americans be so badly educated?! No wonder people call them burger-eaters. They must be putting ignorance in the ketchup, lmao.


Young Americans can't RESPOND to STUPID QUESTIONS

Damn, these Americans are really dumb. An asinine level of ignorance. Send them to the mines. hahaha

Steam Deck? -Sucks!
  • I don't really understand why Valve went for a portable gaming handheld running games made for PCs with wide screens. The fun of playing PC games mostly comes from mods and from using a keyboard + mouse as the controller. Maybe Valve is predicting that mobile gaming will be even more successful in the future than it is now, and frankly it already is if you look at the global market share of gaming revenue: 60% and rising of profits come from mobile gaming on phones and tablets. That's huge.

    So a risky choice from Valve with the Steam Deck. On the other hand, it's Linux; games could drop support for it at any time without any repercussions. I've seen videos on YouTube of people connecting a Steam Deck to an external monitor with mouse + keyboard, completely defeating the purpose of the device, which is being mobile and handheld. I definitely wouldn't buy one; an all-in-one PC would be a much better choice for me in that price range. How can these people play on the go (on a bus or train) if the games on Steam are time sinks that require a lot of patience? They need a PC, a desk, and a chair in a quiet room.

    People across the world don’t care much about American Politics
  • The post I chose from Reddit is simply an example; sure, I endorse what it says and implies, but frankly, like the poster, I don't care much about American drama or American foreign policy. I wasn't intentionally ignoring anything; that's your poor American mind trying to find some meaningless thing to justify your useless comment.

  • People across the world don’t care much about American Politics

    I’ve seen multiple times all over Reddit about American Politics, like now something about immigrants eating pets? 🤷 I’m sure I glanced at a news headline about Donald Trump saying that, but like most people I just scrolled right past it.

    Most of the world really doesn’t know or care about what’s going on in America, but most Americans I come across online think they are the centre of the universe 😂

    There are people dying all over the world in catastrophes other countries are experiencing: from knife crime in England and immigrants sinking in boats near France, all the way to deadly floods in Poland and Austria. Do you really think people will care about the delusional nonsense American politicians say?


    Linux Loser Distros

    Linux Still Sucks Shit (soundcloud.com)

    Ex-worker. You people watch too many movies.

    Major Linux Problems

    The original site hosting this article about Linux sucking is down, so I took the link from the Wayback Machine.

    Source: © 2009-2023 Artem S. Tashkinov, https://web.archive.org/web/20241212104642/https://itvision.altervista.org/why.linux.is.not.ready.for.the.desktop.current.html

    Summary

    No stability, bugs, regressions, regressions and regressions: There's a large number of regressions (both in the kernel and in user-space applications) where things which used to work break inexplicably; some regressions can even lead to data loss. There is essentially no quality control (QA/QC) or regression testing in most Open Source projects (including the kernel) - Microsoft, for instance, reports that Windows 8 received 1,240,000,000 hours of testing, whereas new kernel releases get, I'd guess, under 10,000 hours of testing - and every Linux kernel release is comparable to a new Windows version. Serious bugs which impede normal workflow can take years to be resolved. A lot of crucial hardware (e.g. GPUs, Wi-Fi cards) isn't properly supported. Regressions are often introduced even in "stable" x.y.Z kernel releases, despite Linux developers insisting that such releases must be applied immediately.

    Hardware issues: Under Linux many devices and device features are still poorly supported or not supported at all. Some hardware (e.g. Broadcom Wi-Fi adapters) cannot be used unless you already have a working Internet connection. New hardware often becomes supported months after its introduction. Specialized software to manage devices like printers, scanners, cameras, webcams, audio players and smartphones almost always simply doesn't exist, so you won't be able to fully control your new gadgets or update their firmware. Linux graphics support is a big bloody mess because kernel/X.org APIs/ABIs constantly change, and companies like NVIDIA and Broadcom don't want to allocate extra resources and waste their money just to keep up with an insane rate of change in Open Source software.

    The lack of standardization, fragmentation, unwarranted & excessive variety, and no common direction or vision among different distros: Too many Linux distributions with incompatible and dissimilar configurations, packaging systems and libraries. Different distros employ totally different desktop environments and different graphical and console applications for configuring your computer's settings. E.g. Debian-based distros oblige you to use the strictly text-based dpkg-reconfigure utility for certain system-related maintenance tasks.

    The lack of cooperation between open source developers, and internal wars: There's no central body to organize the development of the different parts of the open source stack, which often leads to one project introducing changes that break other projects (this problem is also reflected in "Unstable APIs/ABIs" below). Even though the Open Source movement lacks manpower, different Linux distros find enough resources to fork projects (Gentoo developers working on a udev alternative; the rift in ffmpeg which led to the emergence of libav; the situation around OpenOffice/LibreOffice; Mir as a new X.org/Wayland alternative) and to roll their own solutions.

    A lot of rapid changes: Most Linux distros have very short upgrade/release cycles (as short as six months in some cases - e.g. Fedora, which is updated every six months, or Arch, which is a rolling distro), so you are constantly bombarded with changes you don't expect or don't want. LTS (long term support) distros are in most cases unsuitable for desktop users due to their policy of freezing application versions (and usually there's no officially approved way to install bleeding-edge applications - please don't remind me of PPAs and backports; those hacks are neither officially supported nor guaranteed to work). Another show-stopping problem for LTS distros is that LTS kernels often do not support new hardware.

    Unstable APIs/ABIs & the lack of real compatibility: It's very difficult to use old open and closed source software on new distros (in many cases it becomes impossible due to changes in core Linux components like the kernel, GCC or glibc). Almost non-existent backwards compatibility makes it incredibly difficult and costly to create closed source applications for Linux distros. Open Source software that has no active developers or maintainers simply gets dropped when its dependencies can no longer be satisfied because older libraries have become obsolete and are no longer available; for this reason a lot of KDE3/Qt3 applications are not available in modern Linux distros even though alternatives do not exist. Developing drivers out of the main Linux kernel tree is an excruciating and expensive chore. There's no WinSxS equivalent for Linux, so there's no simple way to install conflicting libraries side by side. In 2015 Debian dropped support for the Linux Standard Base (LSB). Viva, incompatibility!

    Software issues: Not that many native games (mostly indies) and few native AAA games (Valve's efforts and collaboration with game developers have resulted in many recent games being released for Linux, yet every year thousands of titles are still released exclusively for Windows*; more than 98% of existing and upcoming AAA titles remain unavailable on Linux). No familiar Windows software, no Microsoft Office (LibreOffice still has major trouble correctly opening documents produced by Microsoft Office), no native CIFS equivalent (network file sharing that is simple to configure and use, as well as password protected and encrypted), and no Active Directory or feature-wise equivalent.

    Money, enthusiasm, motivation and responsibility: I predicted years ago that FOSS developers would start drifting away from the platform, as FOSS is no longer a playground: it requires substantial effort and time, i.e. the fun is over and developers want real money to get the really hard work done. FOSS development, which lacks financial backing, shows its fatigue and disillusionment: underfunded projects wane and critical bugs stay open for years. One could say "Good riddance", but the problem is that those dying projects often have no alternatives or similarly-featured successors. Also, open source developers are often keener to rewrite applications than to attend to old bugs.

    No polish, no consistency and no HIG adherence (even KDE developers admit it).

    Various Linux components are loosely coupled, unlike other desktop operating systems such as Windows and Mac OS X, which means the same tasks running on Linux consume quite a lot more energy (power), so laptop users running Linux get worse battery life. Some examples from normal daily life: editing documents, listening to music, watching YouTube videos, or even playing games. Another example is the simple task of desktop rendering: whereas Windows uses GPU acceleration and scheduling for many tasks related to rendering the image on the screen, Linux usually uses none.

    This article is bollocks! Linux works for me/for my grandpa/for my aunt/etc.

    Hey, I love it when people say this; however, here's a list of Linux problems which affect pretty much every Linux user.

    Out of the box, neither Mozilla Firefox nor Google Chrome uses hardware video decoding and output acceleration on Linux (which is hell to set up in many cases), so YouTube clips will drain your laptop battery a lot faster than, e.g., on Windows. Addendum: as of 2022, hardware video decoding can be manually enabled in Firefox and Google Chromium (not Google Chrome) for Intel and AMD users with appropriate hardware.

    Keyboard shortcut handling for people using local keyboard layouts is broken (this bug is now 16 years old). Not everyone lives in an English-speaking country. This doesn't affect Wayland, but Wayland has its own share of critical usability issues.

    Keyboard handling in X.org is broken by design: when you have a pop-up or an open menu, global keyboard shortcuts/keybindings don't work (in both GTK and Qt). This doesn't affect Wayland.

    There's no easy way to use software which is not offered by your distro's repositories, especially software that is available only as source code. For the average Joe, who's not an IT specialist, there's no way at all.

    You don't play games, do you? Linux still has very few native AAA games: for the past three years no new AAA titles have been made available, and most Linux games on Steam are indies. To be fair, you can now run thousands of Windows games through DirectX-to-Vulkan translation using DXVK, but it's not perfect, and anti-cheat protection usually doesn't work on Linux. People using Linux have been banned for playing multiplayer Windows games, because under Linux it's near impossible to verify that your environment hasn't been tampered with.

    Microsoft Office is not available for Linux. LibreOffice often has major trouble properly opening, rendering or saving documents created in Microsoft Office (which, alas, is the standard in the business world). Besides, LibreOffice has a drastically different user interface, and many features work differently. Also, native Windows fonts are not available on Linux, which often leads to formatting issues.

    Several crucial Windows applications are not available under Linux: Quicken, Adobe authoring products (Photoshop, Audition, etc.), Corel authoring products (CorelDRAW and others), Autodesk software (3ds Max, AutoCAD, etc.), serious Blu-ray/DVD authoring products, and professional audio applications (Cubase, Sound Forge, etc.).

    In 2023 there's still no alternative to Windows network file sharing (network file sharing that is easily configurable, discoverable, encrypted and password protected). NFS and SSHFS are two lousy, totally user-unfriendly alternatives. Samba is there, but under many desktop environments there's no simple GUI to configure it.

    Linux doesn't have a reliably working, hassle-free, fast, native MTP implementation (directly mountable via the kernel; FUSE doesn't cut it). In order to work with your MTP devices, like ... Linux-based Android phones, you'd better use ... Windows or Mac OS X. Android-File-Transfer-Linux works near perfectly, but it's not included out of the box by most distros.

    Too many things in Linux require manual configuration via text files: NVIDIA Optimus switchable graphics, custom display refresh rates, multiseat setups, USB 3G/LTE/4G modems, various daemons' configuration, and advanced audio setups, to name a few.

    Linux is unfriendly to UEFI Secure Boot if you're going to use any out-of-mainline-tree drivers, e.g. NVIDIA, VirtualBox, VMware, proprietary RAID, new Wi-Fi adapters, etc. This is a really bad situation which no Linux distro wants to address.

    A personal nitpick which might be very relevant nowadays: under XFCE/GNOME/KDE there's no way to monitor your Bluetooth devices' battery levels on screen at all times (e.g. via a systray applet). There are scripts like this, but they are inaccessible to most people out there, as they require console kung-fu, and they may stop working at any time.
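    The point above about manually enabling hardware video decoding in Firefox can be sketched as a user.js fragment. This is a hedged example, not official guidance: the pref names below do exist in recent Firefox releases on Linux, but they are subject to change and should be verified in about:config before relying on them.

```js
// user.js fragment, placed in the Firefox profile directory.
// Enables VA-API hardware video decoding on Linux (Intel/AMD).
// Pref names are accurate for recent Firefox releases but may change.
user_pref("media.ffmpeg.vaapi.enabled", true);                  // decode video via VA-API through FFmpeg
user_pref("media.hardware-video-decoding.force-enabled", true); // bypass the driver allowlist check
```

    Whether decoding actually ends up on the GPU still depends on working VA-API drivers, which is exactly the "hell to set up" the author is complaining about.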

    Yeah, let's consider Linux an OS ready for the desktop :-).

    Applications and features sorely missing in Linux

    Here I don't even want to talk about Microsoft Office or pretty much all the AAA games missing on Linux; I want to talk about crucial basic features of a desktop OS.

    Task Manager alternative: all task managers in Linux completely suck. The Windows Task Manager can show CPU/RAM/GPU/disk utilization for each process and process group, which lets you easily identify which applications are slowing your system down. It also has very nice overviews of CPU/RAM/GPU/IO/disk activity which, again, are sorely missing from every Linux task manager. What's worse, by default Linux does not aggregate the performance stats of a process and its children (which is extremely important for web browsers), which makes it near impossible to understand how many resources your applications consume. It's possible to do that programmatically via cgroups, but the vast majority of users will never do so. There's an app called "System Monitoring Center" which tries to implement these features, but it's not installed by default in any major distro, and it's written in Python, which is not the best language for such applications because it creates significant CPU/RAM pressure of its own.

    Device Manager alternative: back when Corel Linux existed around 2000 it had a nifty device manager, only no one has ever bothered to revive it. Under Linux it's impossible to get an easy-to-read overview of your devices, their properties, loaded drivers and whether they work correctly. There are multiple very user-unfriendly console utilities for that, which are inaccessible to most users out there.

    Not a single Linux distro that I'm aware of offers a user-friendly GUI program/utility which helps you fix boot issues. You're welcome to reinstall everything from scratch.

    Linux distros don't notify the user of kernel issues (dmesg), which are often a must for understanding whether your system is functioning properly. I've written a bash script for that which you're welcome to use (it must be put into rc.local or similar).
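    The complaint about aggregating a process and its children can be illustrated with a small sketch. The function below is hypothetical, not taken from any tool mentioned above: it sums resident memory over a process subtree given plain (pid, ppid, rss) records, which on a real Linux system you would parse from /proc/<pid>/status; the pure-data form keeps the aggregation logic testable on its own.

```python
from collections import defaultdict

def subtree_rss(procs, root_pid):
    """Sum RSS (KiB) of root_pid and all of its descendants.

    procs: iterable of (pid, ppid, rss_kib) tuples. On Linux these would
    come from /proc/<pid>/status; here they are plain data so the
    aggregation logic stands on its own.
    """
    children = defaultdict(list)
    rss = {}
    for pid, ppid, r in procs:
        children[ppid].append(pid)   # build the parent -> children map
        rss[pid] = r
    total, stack = 0, [root_pid]
    while stack:                     # iterative depth-first walk of the subtree
        pid = stack.pop()
        total += rss.get(pid, 0)
        stack.extend(children[pid])
    return total

# Example: a browser (pid 100) with two renderer children (101, 102),
# one of which has a helper child (103).
procs = [(100, 1, 200_000), (101, 100, 150_000),
         (102, 100, 120_000), (103, 102, 30_000)]
print(subtree_rss(procs, 100))  # → 500000
```

    This is essentially what a cgroup gives you for free via memory.current: a per-tree total, instead of four separate numbers the user has to add up by hand.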

    Commentary From the Author

    A lot of people who are new to Linux, or those who use a very tiny subset of applications, are quick to disregard the entire list, saying things like "Audio in Linux works just fine for me" or "I've never had any trouble with video in Linux". Guess what: there are thousands of users who have immense problems because they have a different set of hardware or software. Do yourself a favour - visit the Ubuntu or Linux.com forums and count the number of threads which contain "I have erased PulseAudio and only now audio works for me" or "I have finally discovered I can use nouveau instead of the NVIDIA binary drivers (or vice versa) and my problems are gone."

    There's another important thing that critics fail to understand. If something doesn't work in Linux, people will not care whose fault it is; they will automatically, and rightly, assume it's Linux's fault. For the average Joe, Linux is just another operating system. He or she doesn't care that some company ABC chose not to support Linux or not to release fully functional drivers for Linux - their hard-earned hardware just doesn't work, i.e. Linux doesn't work. People won't care if Skype crashes every five minutes under some circumstances - even though in reality Skype is an awful piece of software with tonnes of glitches that sometimes crashes even under Windows and macOS.

    I want to refute a common misconception that support for older hardware in Linux is a lot better than in Windows. It's partly true, but also partly false. For instance, neither nouveau nor the proprietary NVIDIA drivers have good support for older NVIDIA GPUs: nouveau's OpenGL acceleration is slow, and NVIDIA's blob doesn't support many crucial features found in XRandR, or features required for proper acceleration of modern Linux GUIs (like GNOME 3 or KDE 4). And in case your old hardware is magically still supported, Linux drivers almost always offer only a small subset of the features found in Windows drivers, so saying that Linux hardware support is better just because you don't have to spend 20 minutes installing drivers is unfair at best.

    Some comments just astonish me: "This was terrible. I mean, it's full of half-truths and opinions. NVIDIA Optimus (Then don't use it, go with Intel or something else)." No shit, sir! I bought my laptop to enjoy games in Wine/dual boot, and you dare tell me I shouldn't have bought it in the first place? I kindly suggest that you not impose your opinion on other people who can actually get pleasure from playing high-quality games. Saying that SSHFS is a replacement for Windows file sharing is the most ridiculous thing I've heard in my entire life.

    It's worth noting that the most vocal participants of the Open Source community are extremely bitchy and overly idealistic people who peremptorily require everything to be open source and free, or it has no right to exist at all in Linux. With an attitude like this, it's no surprise that a lot of companies completely disregard and shun the Linux desktop. Linus Torvalds once talked about this: "There are 'extremists' in the free software world, but that's one major reason why I don't call what I do 'free software' any more. I don't want to be associated with the people for whom it's about exclusion and hatred."

    Most importantly, this list is not mere opinion. Almost every point listed links to appropriate articles, threads and discussions centered on it, proving that I haven't pulled it out of my < expletive >. And please, always check your "facts".

    I'm not really sorry for citing Slashdot comments as proof of what I'm writing about here, since I have one very strong justification for doing so: the /. crowd is very large, it mostly consists of smart people (IT specialists, scientists, etc.), and if a comment over there gets promoted to +5 Insightful, it usually* means that many people share the same opinion or have had the same experience. This article was discussed on Slashdot, Reddit, Hacker News and Lobste.rs in 2017.

    * I previously said "certainly" instead of "usually", but after this text was called "hysterical nonsense" (a rebuttal is here) I decided not to use that word any more.

    On a positive note

    If you've got the impression that Linux sucks, you are largely wrong. For limited and/or non-professional use, Linux indeed shines as a desktop OS: when you run it, you can be sure you are malware-free, and you can safely install and uninstall software without fearing that your system will break. And while the innate Windows problems (listed at the beginning of the article) are almost impossible to fix unless Microsoft starts from scratch, the Linux problems are indeed approachable. What's more, Linux, unlike Windows 10, doesn't collect data on you and send it off somewhere.

    Also, several projects are underway which are intended to simplify, modernize and unify the Linux desktop: NetworkManager, systemd, Wayland, the file system unification first proposed and implemented by Fedora, and others. Unfortunately, no one is working towards stabilizing Linux, so the only real alternative to Windows in the Linux world is Red Hat Enterprise Linux and its derivative (CentOS).

    Many top tier 3D game engines now support Linux natively (with reservations): CryEngine, Unreal Engine 4, Unity Engine, Source Engine 2.0 and others.

    Valve Software released Steam for Linux, ported the Source engine to Linux, and developed a Linux-based Steam gaming machine. Valve's efforts have resulted in a number of AAA titles being made available natively for Linux, e.g. Metro: Last Light. Valve has since ported a lot of its games to Linux.

    NVIDIA made their drivers more compatible with Bumblebee; however, NVIDIA themselves don't want to support Optimus under Linux - maybe because the X.org/kernel architecture is not very suitable for it. NVIDIA has also started to provide certain, very limited, documentation for their GPUs.

    Linus Torvalds believes Linux APIs have recently become much more stable - however I don't share his optimism ;).

    Ubuntu developers listened to me and created a new unified packaging format. More on it here and here. Fedora developers decided to follow Ubuntu's lead and they're contemplating making the installation of third-party non-free software easy and trouble free.

    The Linux Foundation formed a new initiative to support critical Open Source Projects.

    An application level firewall named Douane has been graciously donated to the Linux community. Thanks a lot to its author!

    Starting March 2017 you can watch Netflix in Linux.

    In 2018, thanks to the DXVK project, Linux gamers became able to run DirectX 11 Windows games on Linux - Wine's own implementation is severely lacking and will probably be replaced with DXVK.

    In August 2018 Valve released Proton for Steam: this compatibility layer, based on Wine, allows you to run Windows games from the Steam catalogue on Linux at almost native speed, without any tricks. Its only drawback is that it requires a modern enough GPU which supports Vulkan.

    More and more games are now coded using the Vulkan API and they work just fine under Linux.

    In 2022 Valve released a Linux-based gaming handheld, the Steam Deck, which runs Windows games via Wine/DXVK (Proton) - which ultimately means that developers are spurred to make their Windows games run flawlessly under "emulation".

    On May 11, 2022 NVIDIA released their Linux kernel driver as open source. At the time of release it is very incomplete and can only be used with NVIDIA GPUs in data centers, as the display-driving bits are missing. Still, this is a major development, as it simplifies, streamlines and makes possible a ton of things that weren't possible before, e.g. full Secure Boot, proper Optimus support, using GPL-only symbols in the kernel, etc. Sadly, only the two latest NVIDIA GPU architectures are supported: Ampere and Turing. Users of older GPUs still need to use the proprietary kernel module.

    DXVK (a DirectX 9-11 to Vulkan translation layer) has seen major success recently thanks to Valve's sponsorship. Many Windows games run under Linux with little to no performance loss, or in some cases even faster than under Windows. Intel has started to use it for its discrete GPUs.
