Is the Java ecosystem of languages and related stuff a thing, professionally?
I'm in the process of pursuing a career change toward software engineering/architecture. So far I've been drawn mostly to C#/.NET and Java, though Java attracts me more, even considering that it might be a "dying" language. Still, Scala and Clojure are there, so I figure they might at least give the JVM a boost. In your opinion, should I invest in pursuing certifications/jobs in this field, or is sticking to C#/.NET a better path?
Never do certifications for software engineering. The only certifications worth a damn are security certs and networking certs. If I saw a programming-related certification on a resume, I would completely ignore it since the only thing it tells me is that you paid some money to get a cert.
While I'm not a fan of Java, it's most certainly not a dying language and you will be able to easily find employment in perpetuity. If I had to pick, I'd personally choose Java over .NET purely to avoid being trapped in Microsoft-land, especially with all of the bullshit they've been up to lately.
My biggest concern about .NET is exactly how closed Microsoft-land can be. From what I've seen so far, with the notable exception of perhaps Unity, pretty much everything else gravitates around MS and there's no way of leaving it.
Since Unity is C#, I think maybe you phrased that the opposite of what you meant?
Anyway, I work in an enterprise environment. We use both Java and .Net, and it largely depends on which group you're in. Neither Java nor .Net is going away anytime soon.
You really don't get to stick with just one thing in a developer career. Learn a little of everything, especially multiple paradigms, and specialize in a few related to the business you work for.
A key skill is adaptability, learning as you go. If you make yourself too specialized, you'll set yourself up to be laid off when your skills become obsolete. I have interviewed a few older IT people in that situation, only a few years from retirement.
I've found good work with both. Java has been "dying" for decades according to people who have an irrational dislike for the language. I've yet to see any evidence for it. The ecosystem of libraries there is huge and well maintained.
Frankly, I'd learn both, as well as Python and maybe Rust and Go. Once you become proficient in any language, it's easier to learn others. So start with Java if that calls to you, but branch out as well.
Java is dying in the same way that Linux is winning the desktop war, it's always going to happen "next year" but never "this year". I spent a lot of years as a sysadmin and while I would have been quite happy to piss on the grave of Java, we always seemed to be installing some version of the JRE (though, usually not the latest version) on systems. There is just a lot of software which is built with it. This was especially true when dealing with US FedGov systems. Developers for the USG loved Java and we had both the JRE and JDK (because why not require the Development Kit for a user install?) sprinkled about our environment like pigeon droppings.
That said, don't get too caught up focusing on one language. A lot of the underlying data structures and theory will transfer between languages. What you are learning now may not be what you end up working with in the future. Try to understand the logic, systems and why you are doing what you are doing, rather than getting too caught up on the specific implementation.
There’s not really a fixed path, since you should be able to pick up whichever is used at your job.
I’ve gone from LabVIEW to C# desktop applications to Android Java to TypeScript web front ends, with some other languages and platforms sprinkled throughout.
The most important thing is being ready to learn and pick the right tool for the job.