But this isn't the first time a tech exec has predicted the death of coding.
Jensen Huang says kids shouldn't learn to code; they should leave it up to AI.

At the recent World Government Summit in Dubai, Nvidia CEO Jensen Huang made a counterintuitive break with tech-leader wisdom, saying that programming is no longer a vital skill because of the AI revolution.
I mean, we aren't exactly teaching kids how to hand-calculate trig anymore. Sin, cos, and tan operations are pretty much exclusively done with a calculator, and you'd be hard-pressed to find anyone who graduated in the last 25 years who knows any other way to do it.
For a younger age range you might be right, but in general that's not true; the approximation via a Taylor series is definitely something we teach kids. We don't generally expect people to be able to actually calculate it at the speed of a calculator, sure, but at least it's tested whether they can derive the expansion.
I haven't graduated high school yet, and even I know how to calculate sin and cos with the Taylor series (Maclaurin expansion). I'm still in grade 11, and I assume they'll teach it next year when I take my calculus class? Do they not teach it anymore?
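For anyone curious, the Maclaurin expansion mentioned above is short enough to sketch in a few lines of Python (function name and term count are my own choices for illustration):

```python
import math

def maclaurin_sin(x, terms=10):
    """Approximate sin(x) with the Maclaurin series:
    sin(x) = x - x^3/3! + x^5/5! - x^7/7! + ..."""
    total = 0.0
    for n in range(terms):
        total += (-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
    return total

# sin(30 degrees) = sin(pi/6) = 0.5
print(round(maclaurin_sin(math.pi / 6), 6))  # 0.5
```

With ten terms the approximation is accurate to well beyond calculator precision for small angles, which is roughly what a calculator does under the hood anyway.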
Well, a lot of maths can be done with a calculator. They don't need to learn to actually understand the maths unless either they actually want to, or they're going into something like engineering.
This is objectively stupid. There are tonnes of things you learn in maths that are useful for everyday life even if you don’t do the actual calculations by hand.
In many engineering professions you really need to understand the underlying math to have a chance in hell to interpret the results correctly. Just because you get a result doesn't mean you get an answer.
Scientific calculators can do a ton of stuff, but they're all useless if you don't know anything about math. If you don't know anything about the subject, you can't formulate the right questions.
And that’s why people don’t understand that I’m not magic. Seriously, you should know how to do math and understand how it works. Just like how, as an engineer, I need to understand how stories work.
Well, he's put the writing on the wall for his own developers. So even if it isn't AI that writes their code, the quality may well go down when those who can easily do so leave for pastures new :P
Linus Torvalds' talk at Aalto University.
Specifically a segment where he talks about how hard it is to work with Nvidia when it comes to the Linux kernel.
I thought coding skill is mostly about logical thinking, problem solving, and idea implementation instead of merely writing code?
Even then, who's gonna code to improve the AI in a meaningful way if no one learns to code? What if the AI writes its own update badly and no one corrects it, and then the badly written AI writes an even worse version of itself? I think in biology we call that cancer.
Coding, like writing scientific papers or novels, is only about randomly generating strings.
See also, litigation, medical diagnoses, creating art that evokes an emotional reaction in its audience, etc.
It turns out that virtually all human advancement and achievement comes down to simply figuring out what the next most likely token is based on what's already been written.
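To make the (sarcastic) "next most likely token" point above concrete, here's a toy next-token predictor: count which word follows which in a tiny corpus, then always pick the most frequent follower. The corpus and names are made up purely for illustration:

```python
from collections import Counter, defaultdict

# Toy corpus; a real LLM trains on billions of tokens, not eleven words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigram frequencies: followers[a][b] = times b followed a.
followers = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    followers[a][b] += 1

def next_token(word):
    """Return the most frequent word seen after `word`."""
    return followers[word].most_common(1)[0][0]

print(next_token("the"))  # "cat" follows "the" most often in this corpus
```

The gap between this and an actual language model is enormous, which is rather the point of the sarcasm.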
It is, but you should note that the CEO of NVidia is a manager, and software developers haven't been able to sufficiently convey your point to managers for about 50 years, so we're certainly not going to get any better at it in the next few years.
Not with LLMs it won't. They're a dead end. In their rush for short-term profits, so-called AI companies have poisoned the well: the only way to "improve" an LLM is to make it larger, but most of the content on the internet is now produced by these fancy autocomplete engines. So not only is there no new, better content to train them on, but since they can't really generate anything they haven't been trained on, training on LLM-generated text will only propagate and amplify errors, like making photocopies of photocopies or JPEGs of JPEGs.
It's all a silly game of telephone now; a circular LLM centipede fed on its own excrement, distilling its own garbage to the point of maximum uselessness.
Even if AI were able to be trusted, you still need to know the material to know what you're even asking the AI for.
It's a ruler to guide the pencil, not the pencil drawing a straight line itself; you still have to know how to draw to use it in a way that fits what you want to do.
Good for him. I like Nvidia and use one, but I have the rest of his company to thank for that.
I think for me it was a combination of:
< Name of person I don't know > says < big unhinged sweeping generalization > for < reason that makes no sense to anyone in the field >
My first instinct is not to click stuff like this at all. I also think that anyone trying to preach what kids should or shouldn't do is automatically in the wrong by assuming they have any say in this without a degree in pedagogy.
And who will write the code for the ML/AI models? I mean, for junior developers this is going to be a better way to learn than "did you Google it?", and maybe get precise answers to your questions. But it sounds more to me like "maybe you should buy more of our silicon".
Sounds a bit like the "640kb is more than enough" one-liner. But let's see what it will bring.
"I have a foreboding of an America in my children's or grandchildren's time...when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide, almost without noticing, back into superstition and darkness..."
Carl Sagan,
Astrologist/Horoscopist from ancient times.
While large language models are impressive, they still seem to lack the ability to actually reason, which is quite important for a programmer. Another thing they lack is human-like intuition, which lets us seek solutions to problems with limited knowledge or without any existing solutions.
With the boom bringing a lot more money and attention to AI, the reasoning abilities will probably improve, but until they're good enough we'll need people who can actually understand code. Once they're good enough, we won't really need people like Jensen Huang either, since robots will do whatever he does, but better.
GPT-4 (the preview) still produces code with variables it never uses anywhere... and when I once asked about one of those variables, it was like, "Oh, you're right, let me rewrite the code to put variable X to use", then just added it in a nonsensical location to serve a nonsensical purpose.
Yeah, tell kids not to learn how to code so they can't understand what your products actually do, and you can claim plausible deniability that those products aren't sucking up all their data like a hoover.
Bullshit. Even if AI were to fully replace software developers (which I highly doubt), programming is still a very useful skill to learn just for the problem-solving skills.
I use ChatGPT for coding (millennial). You still need to know how to code, though, because 50% of the time it doesn't work properly. You need to explain the nature of your variables and the overall process you want to achieve. But I still save a good amount of time, because now I don't need to remember the specific syntax for a particular function, and it has saved me reading documentation because it can tell how some functions work from context.
Not learning how to code because of AI is like not learning math because there are calculators. Sure, you don't need to know the multiplication tables by heart, but you need to know what multiplication is and how it's used to solve real-world problems.
I use ChatGPT for coding (I can code myself, but it helps with a lot of stuff), and if I weren't able to code, I would wonder why nothing works. But because I know how to code, I know that ChatGPT is often just writing horrible code that does something completely different from what was asked. So I often think "screw this, I'll do it myself" after countless tries to get ChatGPT to fix it.
Isn't this basically "CEO of AI hardware company says that more people should use AI"? Not really news, since you wouldn't really expect him to say otherwise.
I use LLMs daily to code, but the more complex the issue I'm trying to solve, the more work I have to do to get them to actually produce what I need. I feel like at some point we will get to where UML failed… it will just be easier to write the code.
But I don’t like writing long LINQ queries or Angular templates or whatever, and it does that quite well (70% of the time it is 70% correct or so). So it takes over the part of coding I dislike.
So, sure, just being able to write code might be unnecessary, but that’s like 10% of my day.
Writing single functions just isn't the hard part of programming in the vast majority of programs; the hard part is managing a project in a maintainable, robust, and extensible way.
I think my take is: he might be right, in that by the time today's kids become adults we may have AGI, and we'll either be enslaved or have much less work to do (for better or worse).
But AI as it is now relies on input from humans. When left to take their own output as input, the models go full Alabama (sorry, Alabamites) with their output pretty quickly. Currently, they work as a tool in tandem with a human who knows what they're doing. If we don't make a leap from this current iteration of AI, then he'll be very, very wrong.