A prominent open-source dev publishes their findings as to what's going on with Starfield's performance, and it's pretty darn strange.
According to Hans-Kristian Arntzen, a prominent open-source developer working on Vkd3d, a DirectX 12 to Vulkan translation layer, Starfield is not interacting properly with graphics card drivers.
The problem is so severe, in fact, that the aforementioned translation layer had to be updated specifically to handle Starfield as an exception to the usual handling of the issue.
"I had to fix your shit in my shit because your shit was so fucked that it fucked my shit"
This is how games and drivers have been for decades.
There are huge teams at AMD and Nvidia whose job it is to fix shit game code in the drivers. That's why (a) they're massive and (b) you need new drivers all the time if you play new games.
I read an excellent post a while ago here, by Promit.
It's interesting to see that in the 8 years since he wrote it, the SLI/Crossfire solution has simply been to completely abandon it, and that we still seem to be stuck in the same position for DX12. Your average game devs still have little idea how to get the best performance from the hardware, and hardware vendors are still patching things under the hood so they don't look bad on benchmarks.
I'll give a different perspective on what you said: DX12 basically moved half of the complexity that would normally be managed by the driver onto the game/engine devs, who already have too much to do: making the game.
The idea is that "the game dev knows best how to optimize for their specific usage," but in reality the game devs have no time to deal with hardware complexity, and this is the result.
As far as I know that's what graphics drivers do, like, all the time. Every major title is handled specifically. I am not a developer; I heard this from engine developers.
It's poorly optimized code, and the comments from the top brass have been "lol your PC sux" when they can't even get it running right on their own hardware.
It's not the variations of PCs that are the issue, it's a design and quality control issue. DirectX and Vulkan are the bread and butter of PC gaming. Microsoft developed DirectX to establish a common graphics framework for Windows, and a Microsoft game studio still fucked up working with it.
And you know, to some extent, having a community help you with your games and find bugs is beautiful and probably pretty fucking cool for devs. But the fact is that the business side of things continues to put a sour taste in all of our mouths, devs included.
I really hope AI and the like push game devs out of big businesses and into self employment. Of all the types of people, I want problem solvers to have that life the most.
People actually need to stop doing Bethesda's work for them. Release after release they just push out buggy and unfinished products, and the community fixes them while Bethesda somehow takes the credit. FO76 was a huge mess exactly because people couldn't fix it. Bethesda is bad, and people need to see it as such. Paying full price for their products is downright insulting.
I'll play in a year after most of the bug and performance issues are fixed. Which seems like my typical response to any major game release these days; just wait a few months at first.
Armored Core VI and Baldur's Gate III are two big recently published games that do work quite well. They stand on the shoulders of two respectable companies.
Even before release I figured I'd wait for a sale. Too many good games just came out that I want more, plus a big backlog of Yakuza games I recently started and got totally hooked on. Not interested in helping normalize $70 games, will wait for a sale, and by then there will be a better mod scene too. Less money for a better game, win/win.
I've played it a little on Xbox since it's on gamepass and I haven't encountered any bugs, other than a single game crash. Is the PC release significantly worse than console?
Doesn't feel revolutionary but I'm enjoying it. Created Amos Burton and it's a pretty fun playthru so far.
Edit: Okay, so let me correct that to replicable crashes after Xbox captures (both screenshots and recordings).
I've had 0 hard crashes but a few soft crashes since entering the final stretch of the MSQ. Sarah and Walter are stuck "talking" to each other permanently, despite Sarah being in my ship and Walter in the Lodge. If I try to talk to either of them, the game locks up whenever it's time for the other NPC to chime in, and I have to reload. I also had a random soft crash where I couldn't enter the Lodge from New Atlantis no matter what I did until I restarted the .exe (I'm thinking it's related to the conversation bug I'm experiencing). There are also weird movement bugs, like someone walking away from you during a conversation or crew members floating in or through random places in my ship, and a flashing texture issue for a few seconds after accessing the inventories in the armory ship habs.
Outside that I'm getting 50-70 fps with mostly high settings at 1080p.
The issues are way overblown. I just bought a new car and with brand new tires and a few tweaks from my local repair shop I can go the speed limit now.
I love Starfield and have been playing it every day since launch. It runs like dogshit. Sure, it doesn't stutter or anything, but I can't, for the life of me, get the average FPS in outdoor areas any higher than 70. 5800X + 3080 Ti. It doesn't matter how much I lower the settings; the CPU overhead is crazy.
It seems the current meta is to hate on Starfield at the moment. I would suggest keeping on playing and enjoying the game if you do, and not posting about it.
The only issue I see with the game at the moment is that they did not use those fly/land/dock sequences to mask the loading times. I think that would enhance the experience a lot.
It really would have. Considering that my loading screens are scarcely longer than those sequences anyway, it could have, and should have, been nearly seamless.
Honestly, I never knew there were people having performance issues. I hadn't really gone to any communities discussing the game until now, and the game runs fine on my PC.
I wonder if this has anything to do with not being able to load my saves. I went to Mars and exited the game after a long gaming session. Came back the next day and I get a full system crash upon trying to load the exit save. Tried the autosaves, same deal. Tried my last normal save, same deal. Maybe once in every 5-6 full system crashes I can reload one of the saves from just landing on Mars, but if I try to enter Cydonia it's a full system crash. It's weird too: I can still hear the game running in a loop, but I can tell there is no input and the graphics fully fail. Very frustrating. I finally got back to my main rig to be able to play, and the game has just been flat-out unplayable since about the day after it came out. Can't even get a hotfix from Bethesda. Bummer. I'll just have to wait to play it again. I'm not going to restart a new character just to run into the same thing.
Normally, I'd fully recommend that but I still want to play the game. I was actually really enjoying myself. I'll just wait until they actually issue a patch. I'm a little shocked not even one hotfix has gone out.
The crash on loading has bricked two of my characters now. I don't think I can be bothered again till they patch it. One bricked on Mars, the second bricked before I made it there. Wasted so many hours.
That's where I'm at. I'm excited to be able to play again but I have to wait for however long it takes them to release a patch addressing this. I'm not mad about anything I've seen. But I literally can't play a game that I can't trust to save and be reloaded.
Are you playing on PC/Steam? Something weird happened to me yesterday where an exit save wouldn't load. Upon trying an earlier save, I found none would load (game crashed to desktop when selecting). I could not figure it out.
On a whim I verified files in Steam, which triggered a 30 MB update. After verification, my exit save worked fine. I'm at a loss for why core files would change.
Yup, Steam with a Ryzen 5600X and RX 6800 XT. I've tried verifying the game files a couple of times. I'll give it another go in a bit. I'd love for there to be a change, but I think I'm in that weird category that should work fine but is just borked. The game ran great otherwise. It's just loading saves, and on the one attempt that actually works, trying to enter Cydonia means an automatic system reboot.
Looks like Hans implemented a workaround in vkd3d-proton 2.10, using the open-source AMD Vulkan driver on Linux (RADV).
Device generated commands for compute
With NV_device_generated_commands_compute we can efficiently implement Starfield's use of ExecuteIndirect which hammers multi-dispatch COMPUTE + root parameter changes.
Previously, we would rely on a very slow workaround.
NOTE: This feature is currently only enabled on RADV due to driver issues.
I don't imagine it will take long for this to make its way into a Proton Experimental release. Folks with AMD graphics who are comfortable with Linux might want to give it a try.
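For anyone curious what "multi-dispatch COMPUTE + root parameter changes" actually means on the D3D12 side, here's a rough, hypothetical sketch (every name like gDevice, gRootSignature, gArgumentBuffer is a placeholder of mine, not anything from Starfield or vkd3d-proton): a command signature where each GPU-driven command overwrites a root constant and then issues a compute dispatch, replayed many times from one ExecuteIndirect call. It's that kind of pattern the translation layer now maps onto NV_device_generated_commands_compute instead of the old slow fallback.

    #include <windows.h>
    #include <d3d12.h>

    // Hypothetical placeholders for objects the engine would already own.
    extern ID3D12Device* gDevice;
    extern ID3D12GraphicsCommandList* gCmdList;
    extern ID3D12RootSignature* gRootSignature;  // root parameter 0 = one 32-bit root constant
    extern ID3D12Resource* gArgumentBuffer;      // GPU-written array of IndirectCompute records
    extern ID3D12Resource* gCountBuffer;         // GPU-written actual command count

    // One indirect command = "change a root constant, then dispatch".
    struct IndirectCompute
    {
        UINT rootConstant;                  // per-dispatch root parameter change
        D3D12_DISPATCH_ARGUMENTS dispatch;  // ThreadGroupCountX/Y/Z
    };

    void RecordIndirectDispatches(UINT maxCommands)
    {
        D3D12_INDIRECT_ARGUMENT_DESC args[2] = {};
        args[0].Type = D3D12_INDIRECT_ARGUMENT_TYPE_CONSTANT;
        args[0].Constant.RootParameterIndex = 0;
        args[0].Constant.DestOffsetIn32BitValues = 0;
        args[0].Constant.Num32BitValuesToSet = 1;
        args[1].Type = D3D12_INDIRECT_ARGUMENT_TYPE_DISPATCH;

        D3D12_COMMAND_SIGNATURE_DESC desc = {};
        desc.ByteStride = sizeof(IndirectCompute);
        desc.NumArgumentDescs = 2;
        desc.pArgumentDescs = args;

        // A root signature is required because the command signature writes a root parameter.
        ID3D12CommandSignature* signature = nullptr;
        gDevice->CreateCommandSignature(&desc, gRootSignature, IID_PPV_ARGS(&signature));

        // One call replays up to maxCommands (root constant change + compute dispatch) pairs,
        // with the real count read from gCountBuffer on the GPU.
        gCmdList->ExecuteIndirect(signature, maxCommands, gArgumentBuffer, 0, gCountBuffer, 0);

        signature->Release();
    }

Driver-side, each of those root parameter changes is cheap if the driver (or translation layer) can generate the command stream on the GPU, and painful if it has to be patched on the CPU per dispatch, which is presumably why the fallback was so slow.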
I preferred the Little Mermaid, the Ugly Duckling, and of course the Emperor's New Groove, but his commentary on graphics in Starfield is also a compelling work.
Do we know for sure that the Starfield devs weren't able to figure out the problems with performance? I find often with companies, the larger they are, the more bureaucracy there is, and the more prioritization of tickets becomes this huge deal, where you even end up having meetings about how to prioritize tickets etc.
I would be surprised if the devs didn't know what was wrong already; I think it's more likely that management and higher-ups don't care about them fixing it right now.
Game devs have many teams, all with different jobs; for a big game like this you'd typically have multiple teams dedicated to optimization in different areas (and between them). The specific problem in this case was how the game communicates with graphics drivers (among other things), which for any graphics-heavy game is fundamental to performance optimization. The problems aren't even an after-the-fact optimization sort of thing that teams have to identify and follow up on; batching jobs is standard practice when interacting with GPUs, whether or not there's a translation layer.
When the devs of a core translation layer between two graphics APIs that are commonplace in the gaming ecosystem have to write code specifically to fix issues with your application, you've done something fundamentally wrong.
A lot of posts like these also seem to imply that the open source community should somehow be less competent than these companies and are surprised that the open source community can fix these issues. But the open source community has a ton of very respectable and extremely smart developers, it shouldn't be any surprise really.
To be even more direct: there's a huge overlap between the circles of "works in software dev" and "contributes to open source projects".
I really try to do different things at home than at work, but I've definitely contributed fixes to game mods (why do so many modders fail to do null checks before trying to interact with short-lived shit like projectiles?) and to open-source software I've needed to get things done.
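For the non-modders, the kind of fix I mean is usually nothing fancier than a guard like this (the type and function names here are completely made up, not from any real engine or mod API):

    struct Projectile;  // placeholder: whatever short-lived object the engine exposes
    struct Actor;       // placeholder
    void ApplyImpactEffect(Projectile* projectile, Actor* target);  // placeholder for the mod's real logic

    // Hypothetical mod callback: by the time this fires, the engine may already have
    // destroyed the projectile, so check before touching it instead of crashing.
    void OnProjectileImpact(Projectile* projectile, Actor* target)
    {
        if (!projectile || !target)  // short-lived objects can vanish between frames
            return;
        ApplyImpactEffect(projectile, target);
    }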
I'm amazed that Bethesda has one of the premier game developers under the same roof in id Software and didn't bother to just use their shit. Instead they actively chased their staff away.
Bethesda the publisher and Bethesda the developer are different things.
The publishing arm seemed to know what they were doing, certainly enough for MS to buy them.
The developing arm is nothing if not consistent. You know what you're getting into: an RPG with lots of character-build possibilities (even if one particular build is overpowered enough that 90% of players accidentally stumble across it, like Skyrim's stealth archer), a handful of memorable NPCs, no real character development, so-so performance, and a shitload of bugs.
If people are still buying them and still not enjoying them I don't know what to say. It's like watching Fast and Furious 10, and going "well that's fucking dumb".
I saw the ending of the last F&F by mistake (they sold us tickets for a movie that started an hour later), and let me tell you - that was fucking dumb.
id Software and Bethesda Game Studios are using different custom, proprietary engines. Retraining your entire studio on a new engine is extremely time-consuming, especially if it's a custom engine with limited learning materials, like id Tech. There's a big cost/benefit analysis there, and frankly, if Bethesda ever did switch engines, I think they'd be more likely to go with Unreal for that reason: current staff, and certainly new hires, are much more likely to be familiar with it.
Well, the Creation Engine and the id Tech engine follow two completely different main goals: one is built for wide-open spaces and exploration with real-time physics while also guaranteeing mod support, the other is built for fast-paced combat in closed level structures.
And I think the mod support especially is important to Bethesda and its community. That's also the reason why so many people stick to Minecraft Java Edition instead of the more performant Bedrock Edition.
I just wish they would have incorporated the fixes into the game engine at some point. I bet some of the devs would have even signed away the code for free, or at least very cheap. It was annoying not being able to use mods to fix bugs in Fallout 76 that were patched in Fallout and Elder Scrolls games, some as far back as Morrowind. Sure, they were mostly rare ones, like getting pushed into the void behind what should have been solid meshes and either falling endlessly while the engine didn't seem to care, or the game crashing.
I'm so glad Valve hired this guy, because if he was doing this sh*t to cover slack for Bethesda and the huge publishers all as just a personal side project, I would lose any hope I had for humanity.
It's clearly not due to obsolete hardware. Not getting 60fps in New Atlantis while playing on a beast with 50-70% usage max points to optimization issues. I honestly don't know why those people think it's hardware.
I don't understand how this is even an argument, dude. Bethesda has the worst reputation for this stuff. Literally every game they have released has been buggy as shit with terrible performance, but for some reason people just handwave it and say "it's a Bethesda game." When did they get so brainwashed? Why is it acceptable for them?
It's also verified by Proton users noting a marked increase in performance with just a code commit. I'd urge anyone not to listen to this troll and go have a look.
This really isn't a good take when the "random guy" has provided proof: open-source code demonstrating the issue and a relatively easy way to verify his claims (by using his code).
It's all there out in the open if anyone has specific counter points, and this type of thing isn't an unusual situation with Bethesda developed games, or games on this engine.
I'm inclined to believe this, and this likely isn't even the whole extent of it. I've been playing on a Series X, but decided to check it out on my ROG Ally. On low, at 720p with FSR2 on, I'd get 25-30fps somewhere like New Atlantis. I downloaded a tweaked .ini for the Ultra preset, and now not only does the game look much better, but the city is up closer to 40fps, with most other areas at 45-60+. Makes me wonder what they thought was worth the massive cost of the default settings, given there's no real visual improvement.
Another odd thing, if I'm playing Cyberpunk or something, this thing is in the 90%+ CPU and GPU utilization range, with the temps in the 90c+ range. Starfield? GPU is like 99%, CPU sits around 30%, and the temp is <=70c, which basically doesn't happen playing any other "AAA" game. I could buy Todd's comments if the frame rate was crap, but this thing was maxed out... but not getting close to full utilization on a handheld with an APU indicates something less simple.
I'm hoping the work from Hans finds its way to all platforms (in one way or another), because I'd love to use the Series X but 30fps with weird HDR on a 120hz OLED TV actually makes me a little nauseous after playing for a while, which isn't something I commonly have a problem with.
From my experience on the Steam Deck, it doesn't matter if I run low graphics or medium graphics (with some high settings); the performance is almost the same.
I was able to install the DLSS mod, which helped some, but there are still performance issues even when using the DF optimized settings. I assume this will be fixed with driver and game updates, but who knows how long that will take.
They should, but it's Bethesda: a company that misread the room, thinking that people making memes about how unoptimized their games are meant fans found it endearing rather than something deserving of mockery.
That's exactly what they're going to do because it costs nothing. The problem is they won't ever accept the community fixes into official updates since they're binary patches or file replacements.
Typical Bugthesda. I'm only wondering how they got this big by only releasing buggy products. I can't for the life of me remember a single product they've made that wasn't a buggy mess the community fixed for them, time and time again, without any compensation. And not only did the community not get any compensation, Bethesda tried to sell their work and pinch some more money.
Any serious bitching about SF seems to me to be nitpicking from folks who were just looking to bitch at Bethesda. It's a fantastic game with minor issues that are easily overlooked and don't really affect the experience.
Because Larian is relatively small compared to Bethesda and the game exceeded the already high expectations: it's a AAA D&D 5e game, which is something people had been looking for for a long time. Larian deserves it, and they are actively fixing the game anyway. Bethesda has no excuse to be releasing games with the kinds of bugs that they do after giant successes like Skyrim. They have the money.
I'm convinced large video game publishers make deals with graphics card manufacturers to force the end user to upgrade; the AMD and Nvidia deals are not for free access to new technology, they go to whichever company bids the highest so it can sell more cards. There has been little progression in graphics fidelity since 2016. We used to take giant leaps and now we take small, insignificant steps.
Fidelity is always going to have diminishing returns. Perhaps there's something fishy going on in the video card business, I don't know, but as someone who works in CGI, the evolution we see year after year makes sense; it's not like there's hidden untapped potential.
I'm not sure it's such a direct conspiracy, but I'm sure some of this happens inadvertently at least. Developers of big-budget games are likely going to target higher-end hardware, and API usage that might cause problems on lower-end hardware probably sneaks in as a result. I'm sure there are deals between game studios and Nvidia/AMD to get the latest GPUs for workstations at some discount, which probably means the machines they're using for the bulk of development are beefier than the average consumer's (you also probably want a bit of headroom while developing)... But this kind of thing can naturally lead to higher requirements for software, because you don't run into performance issues unless you're serious about testing on lower-end hardware... Which you might care about to some extent, but it's an additional cost that can take away from other aspects of the game, which might make it less marketable (graphics are a big deal for marketing, for example).
Obviously it's not great if a game uses API calls inefficiently and that means it runs worse than it would otherwise... But I'm not really that surprised when it happens? Working on big projects on deadlines there's often a "try the obvious solution, worry later if it's too slow" mentality, and I'm not sure you need any more of a conspiracy than that to account for stuff like this.
It's the same trash engine they've used for 20 years. To be perfectly honest, they should put it in the ground and build a new one from scratch instead of pushing their Frankenstein engine along.
Unreal is older than their engine, no? And everyone uses that... so what does this even mean?
The difference is that Epic barely makes games. They have Fortnite, which they can put some minor effort into to keep the money flowing, and otherwise they can focus on the engine. Maybe with MS now being behind Bethesda they can also put more work into their engine... maybe. We'll see.
Yeah, you always have. They've been screwing modern graphics features onto the old dog for years and hoping it'll continue to work. There are some serious limitations in it that another engine would be able to work through for a game like this: seamless planet travel, for one, and less abrupt loading.
I've had 1 full crash, and a good handful of NPCs running into walls or levitating through ceilings.
Performance is fine, I guess, but I got the game as part of a promotion while upgrading my graphics card, so it had better be. I believe folks who say it runs like a dog on hardware that's only a couple of years old. It's apparently unplayable if installed on a hard disk instead of an SSD.
All in all, it's the smoothest Bethesda launch I've ever seen (I skipped Fallout 4, maybe that was better, IDK), but that's honestly not saying much. It's way better than Cyberpunk was at launch.
Once every 20 or so times that I leave my inventory, my view cone is placed inside my weapon for half a second, then the game stutters and I pop back into my character's head (I think the inventory screen may scale up weapons for display and it's failing to undo that quickly enough, but that may be completely wrong).
That, and one dialogue "loaded" instantly (it started the interaction before the graphics were ready) and displayed a black screen for the first half of the conversation. Oh, also, FSR is FSR and makes spaceship landings look terrible.
Those are the only notable graphics issues I've experienced aside from widespread poor performance, and they might not even be graphics issues. I mean, the game doesn't run too great, but the core gameplay is definitely less buggy than FO4 or Skyrim at launch. I'm sad to hear people are having more serious graphical issues, especially Arc users.