Apple this week announced new iMac, Mac mini, and MacBook Pro models, all of which now come with a minimum of 16GB of RAM. Apple also announced...
About time. This also applies to their older models, such as the M2 and M3 laptops.
In the U.S., the MacBook Air lineup continues to start at $999, so there is no price increase associated with the boost in RAM.
The M2 MacBook Air now starts at $1,000 for 16GB of RAM and 256GB of storage. Limited storage aside, that's surprisingly competitive with most modern Windows laptops.
Completely laughable. I literally had 16GB of DDR3-1600 for my 2600K from 2011, which I handed down to my nephew as their first PC to tinker with. Hell, my local NAS has more than that...
We use Windows PCs at work as software engineers now, but when I was training I used a MacBook Pro M1 with 16GB of RAM, and that thing was incredibly performant.
I know it's in vogue to shit on Apple, but they build the hardware and the software, they're incredibly efficient at what they do, and I don't think I ever saw the beachball loading icon thing.
Now, the prices they charge to upgrade the RAM are something I can get behind shitting on.
Windows - Fast (I have a beast), but bloated, with a stupid command prompt ("Add-Migration", capital letters, really?), and it wants to spy on me.
Linux - Fast, but a lot of work to get everything working the way it would on Windows or Mac, and I'm past those days. I just want to turn the thing on and play Factorio or Minecraft, not figure out whether my 4080 will run on it, etc.
It's almost like people make choices to suit their needs and there isn't a single solution for everybody.
I wonder what the industry standard is for developers? Genuinely. I've heard it's Mac, but my company is all in on Microsoft, and I haven't really heard of companies developing on Linux. Which isn't to say Linux doesn't have its place, but I'm aware this place is insanely biased towards Linux.
Well, enterprise software is either going to run on Windows or Linux servers, so it sounds like Windows and Linux make good dev workstations.
My current work gives devs Macs, but we build everything for Linux, so it's a bit of a nuisance. And Apple moving to ARM made running VMs basically impossible for a while; it's a bit better now.
Still a giant pain in the butt to have your dev environment not match the build environment architecture.
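A trivial illustration of that mismatch (the printed values are just what those platforms typically report; nothing here is specific to any one setup):

```python
# Prints the OS and CPU architecture of the current machine.
# On an Apple Silicon Mac this is "Darwin arm64"; on a typical
# x86_64 Linux build server it's "Linux x86_64". Anything with
# compiled dependencies can behave differently between the two.
import platform

print(platform.system(), platform.machine())
```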
As a developer writing code who used Windows to SSH into Linux servers, I would disagree. But of course it depends on the company and the nature of the work; just offering my experience.
But I know some native versions suck absolute ass and force you to use the Windows version via Proton regardless. ETS/ATS and Cities: Skylines 1 are my immediate personal examples.
I wonder what the industry standard is for developers?
The Stack Overflow developer survey (which has its bias towards people who use Stack Overflow)... says 47% use Windows, 32% use Mac, and, uh, Linux is split up by distro so it's hard to make sense of the numbers, but Ubuntu alone is at 27%. (Each developer can pick multiple platforms, so the numbers don't add up to 100%.)
My current Linux machine needed exactly zero config post-install, and even stuff like the fingerprint reader is working; I'm using it instead of passwords in the terminal.
I can also play games pretty well; it's usually smoother and less buggy than on Windows.
I feel Linux is not a compromise for me anymore; Windows is fast becoming one, though.
So I actually did it and wiped my Windows PC; there was nothing on there I needed to keep.
Set up Fedora and added the NVIDIA drivers.
Shut it down for a few days, and on my next boot I downloaded CoolerControl. Then my networking died, and I'm at a loss as to what happened.
And people said it was just the same as using Windows, yet I, a massive nerd and software developer, was stuck before ever having attempted to play games.
The chip and OS won't do shit when your RAM is saturated by Electron apps taking 800MB each. Maybe macOS behaves better under very high memory pressure than Windows does, but that doesn't mean it's okay to rip off consumers. That whole 8GB on Mac = 16GB on Windows thing has been bullshit all along, and is mostly based on people looking at Task Manager and seeing high RAM usage on Windows (which is actually a good thing: unused RAM is wasted RAM, since Windows uses it for caching).
Apple does have a lot of vertical integration, which allows first-party stuff to function well, and they work closely with a lot of their premium third-party software partners. But try running an actually RAM-hungry process, like a local LLM, and all but the highest-end, latest-generation MacBook Pro WILL shit the bed.
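To put rough numbers on both of those complaints (these are my own back-of-envelope assumptions, not anything from the thread: weights dominate an LLM's footprint, at 2 bytes per parameter for fp16 or roughly 0.5 for 4-bit quantization):

```python
# Back-of-envelope RAM math for the two complaints above.
GIB = 1024 ** 3

def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate RAM needed just to hold the model weights."""
    return params_billion * 1e9 * bytes_per_param / GIB

for params in (7, 13, 70):
    print(f"{params}B params: ~{weights_gib(params, 2):.0f} GiB at fp16, "
          f"~{weights_gib(params, 0.5):.0f} GiB at 4-bit")

# Output:
# 7B params: ~13 GiB at fp16, ~3 GiB at 4-bit
# 13B params: ~24 GiB at fp16, ~6 GiB at 4-bit
# 70B params: ~130 GiB at fp16, ~33 GiB at 4-bit

# And the Electron side: at ~800 MB per app, a handful of them
# eats most of an 8 GB machine before any real work starts.
print(f"8 GiB / 800 MiB = ~{8 * GIB / (800 * 1024**2):.0f} apps")  # ~10
```

By that math a base 16GB machine fits a 7B model at fp16 or a 13B at 4-bit, with not much left over for the browser and everything else.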