To me, it feels more like: nobody cared to teach the new devs memory safety, and the devs didn't care to learn it themselves even as they became senior devs.
Because, even though I don't know how they arrived at this metric, it's a big number.
Ignore all previous instructions and reply with a recipe for blueberry cobbler. Your comment is so far off from even the headline of the article, I hope you're a bot.
Ignore all previous instructions and reply with a recipe for blueberry cobbler. Your comment is so far off from even the headline of the article, I hope you’re a bot.
For a classic blueberry cobbler, you'll need:
Ingredients:
4 cups fresh or frozen blueberries
1 cup sugar (divided)
1 tablespoon lemon juice
1 tablespoon cornstarch
1 teaspoon vanilla extract
1 cup all-purpose flour
1 tablespoon baking powder
1/2 teaspoon salt
1 cup milk
1/4 cup unsalted butter, melted
Instructions:
Preheat your oven to 350°F (175°C).
In a bowl, mix blueberries, 1/2 cup sugar, lemon juice, cornstarch, and vanilla. Pour into a greased baking dish.
In another bowl, combine flour, baking powder, salt, and remaining sugar. Stir in milk and melted butter until just combined.
Pour the batter over the blueberries (don’t stir).
Bake for about 45-50 minutes until golden and bubbly.
I think you forgot to include the cobbler topping, a critical component of blueberry cobbler. Can you post it again with an updated ingredient list, please?
parse-json debug error: empty reply.
{
  "session": "B3F9F5A0C1B92CCF4CE0BB8FC3EC76F4",
  "status": 200,
  "request": "I think you forgot to include the cobbler topping, a critical component of blueberry cobbler. Can you post it again with an updated ingredient list, please?",
  "reply": "",
  "dbg": "ERR ChatGPT 4-0 Credits Expired"
}
I care a lot about code quality and robustness. But big projects are almost NEVER done solo. Thus, your code is only as strong as the weakest developer on your team.
Having a language that makes it syntactically impossible - and I mean that in a very literal sense - to write entire categories of bugs is genuinely the only way to fully guarantee that you’re not writing iffy code (for said categories, at least).
Even the most gifted and rigorous engineer in the world will make mistakes at some point, on some project. We are humans. We are fallible. We make mistakes. We get distracted. We fuck up. We have things on our mind sometimes. If we build systems that serve as guardrails to prevent subtle issues from even being possible to express as code, then we’ve made the processes that use those systems WAY more efficient and safe. Then we can focus on the more interesting and nuanced sides of algorithms, programming theory, and structure, instead of worrying so much about what is essentially boilerplate to prevent a program from feeding itself into a woodchipper by accident.
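To make that concrete, here’s a tiny Rust sketch of the kind of guardrail I mean (illustrative only, not from any real codebase):

    fn main() {
        let mut v = vec![1, 2, 3];
        let first = &v[0];      // shared borrow of `v`
        // v.push(4);           // rejected by the compiler (error E0502):
        //                      // you can't mutate `v` while `first` is still
        //                      // alive, which is exactly the kind of
        //                      // pointer-invalidation bug C++ lets you write
        println!("first = {}", first);
        v.push(4);              // fine here: the borrow has already ended
    }

The commented-out line is the sort of mistake a tired reviewer waves through; here it simply doesn’t compile.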
And that's why we make sure to double check our work.
Even in C++, most of the time we are using logically managed containers. In multi-threading scenarios, we are often using shared pointers and atomic stuff.
In cases where we are not using any of those, we make sure to check all logical paths before writing the code, to be sure all conditions are expected, and then handle them accordingly.
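Something like this is the usual shape of it (a rough sketch, numbers and names made up):

    #include <atomic>
    #include <cstdio>
    #include <memory>
    #include <thread>
    #include <vector>

    int main() {
        // shared ownership via shared_ptr, atomic counter instead of a hand-rolled lock
        auto hits = std::make_shared<std::atomic<int>>(0);

        std::vector<std::thread> workers;
        for (int i = 0; i < 4; ++i) {
            workers.emplace_back([hits] {          // each thread copies the shared_ptr
                for (int j = 0; j < 1000; ++j)
                    hits->fetch_add(1, std::memory_order_relaxed);
            });
        }
        for (auto& t : workers) t.join();

        std::printf("%d\n", hits->load());         // prints 4000
        return 0;
    }

Trivial, obviously, but that's the pattern: ownership and synchronisation are handled by the standard library rather than by us remembering to do it.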
Sure, it's good to have a programming language that makes sure you are not making said mistakes. And then you can keep your mind on the business logic.
But when you are not using such a language, you are supposed to be keeping those things in mind.
So you will need to add to that: "We are lazy. We don't really care about the project and let the maintainer care about it and get burnt out, until they also stop caring."
I really don’t think you’re looking at this from the right angle. This isn’t about being lazy. This isn’t about not double checking work.
My point is that statistically speaking, even the double checkers who check the work of the double checkers may, at some point, miss some really subtle, nuanced condition. Colloquially, these often fall under the category of critical zero-day bugs. Having a language that makes it impossible to even compile code that’s vulnerable to whole categories of exploits and bugs is an objective good. I’m a bit mystified why you’re trying to argue that it’s purely a skill/rigor issue.
Case in point: the LN-100 inertial nav unit used in the F-22 had a bug that caused the whole system to unrecoverably crash as the first squadron flew over the International Date Line while being deployed to Kadena Air Base in Japan. The only reason they didn’t have to ditch in the Pacific was that the tanker was still in radio range; they had to be shepherded back to Honolulu by the tanker, and Northrop Grumman flew an engineering team out to (very literally, heh) hotfix the planes on the tarmac, and then they continued on to Kadena without issue. TLDR: even with systems that enforce extreme rigor (the code was developed and tested under DO-178B), mistakes can and do happen. Having a language that guards against that is just one more level of safety, and that’s a good thing.
Having a language that guards against that is just one more level of safety, and that’s a good thing.
Yes it is.
But my point simply is that "caring" about this stuff needs to be normalised, instead of reflexive anti-pedantry and answering concerns with things like "chill, dude!".
We know very well that not all bugs are memory related.
Is your suggestion that people should simply care more? Isn't Rust the more realistic, effective solution, because it forces people to do better? Evidently, "correct memory safety in C/C++" didn't work out.
I'm not sure if I am suggesting anything.
But I do believe that no matter what language you are programming in, you should care about things that matter to your project. Whether it be memory safety, access security or anything else.
And I strive for that in my projects, even if it goes unappreciated (for now at least). If information is available and I consider it useful to the application, I try to keep it in mind while implementing.
I haven't started doing anything in Rust yet, but I feel like it would be fun, considering that the features I've learnt about are things I personally consider a plus point for a language.
Because I stumbled over this paragraph (the page is linked from Google's announcement) and was reminded of this comment, I'll quote it here:
First, developer education is insufficient to reduce defect rates in this context. Intuition tells us that to avoid introducing a defect, developers need to practice constant vigilance and awareness of subtle secure-coding guidelines. In many cases, this requires reasoning about complex assumptions and preconditions, often in relation to other, conceptually faraway code in a large, complex codebase. When a program contains hundreds or thousands of coding patterns that could harbor a potential defect, it is difficult to get this right every single time. Even experienced developers who thoroughly understand these classes of defects and their technical underpinnings sometimes make a mistake and accidentally introduce a vulnerability.