There is no "date" data type in COBOL. Dates are stored however the programmer wants, but usually numeric character strings
There's no "default" date, even if there were such a data type
Even if there were a default, 1875 would be a bizarre choice
That (obviously) doesn't mean Elon Musk is right. It just means that this explanation of it being some magical COBOL epoch value is wrong. What's more likely is that the Social Security database is very old and has a lot of iffy data in it.
My guess is that it contains everybody who has ever had a social security record, including all the duplicates, all the typos, and everything else. At some point there were probably hundreds of thousands of records that were transcribed from paper into a computer, and it was considered safer to keep the iffy data and make a plan to deal with it later, vs. remove someone from the database who should legitimately be there.
I would also imagine that the systems that take the records out of the DB probably have filters in place that remove the (known) bad records before they're used.
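To make the first point above concrete: a COBOL "date" is usually just a fixed-width digit field (for example, a `PIC 9(8)` item holding YYYYMMDD) with no date semantics attached. A minimal Python sketch of interpreting such a field (the field value here is made up for illustration):

```python
from datetime import date

def parse_yyyymmdd(field: str) -> date:
    """Interpret an 8-digit character field (COBOL-style YYYYMMDD) as a date."""
    return date(int(field[:4]), int(field[4:6]), int(field[6:8]))

print(parse_yyyymmdd("19380114"))  # 1938-01-14
```

The point being: the language stores digits, and any "meaning" (including defaults) is whatever the application code decides.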
I get this feeling too, but then again, can we blame them? With all the locked-down tech these days you really have to go out of your way to learn, and in most cases it works well enough. Whereas people growing up between, let's say, the 1970s and 2000s had to muck around with their tech to get it to work, learning the intricacies while using it.
Less of a generational problem, more of an educational one.
Selfish, badly educated grifters that got pushed into high offices can be of any age. Musk also didn't recognize SQL when he looked at it, which is arguably even more funny.
The actual payment system stops payments automatically at age 115 and requires manual verification to restart. The database that is being reported is not even a report of who is getting paid.
This is just dramatic, public evidence of the arrogance and incompetence of DOGE, from Musk on down to his racist younglings.
For a while, I thought they would at least be good at technology. This episode shows that even that is not true.
How he chose this elite group of chuckleheads is an eyebrow raiser. Other than racism, they seem to have no credentials at all. I mean, on brand for this administration I guess.
There are many people who were born in developing nations during times of war who do not know their exact age. They usually do have an idea of a range though.
Jesus fucking christ the interns who have neither seen nor heard of COBOL have also not encountered the concept of a sentinel value used as a fallback/default.
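For anyone unfamiliar with the term: a sentinel is a recognizable placeholder stored when the real value is unknown, which downstream code filters out before use. A minimal sketch (all names and values here are illustrative, not from any real system):

```python
# Hypothetical sentinel marking "birth date not known" in transcribed records.
UNKNOWN_DOB = "00000000"

records = [
    {"name": "A", "dob": "19520301"},
    {"name": "B", "dob": UNKNOWN_DOB},  # paper record with no legible DOB
]

# Downstream consumers drop sentinel records before computing anything.
known = [r for r in records if r["dob"] != UNKNOWN_DOB]
print(len(known))  # 1
```

A naive query that skips this filtering step sees the sentinel as if it were real data, which is exactly the failure mode being mocked above.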
Date/time types have long since been based on a 64-bit number, at least in Linux. However, the old 32-bit date/time types are still there so older programs won't break, and probably on embedded systems too. So it comes down to the apps: how many old apps or old embedded systems will still be around?
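The 32-bit limit mentioned above is the well-known "Y2038" problem: a signed 32-bit counter of seconds since 1970-01-01 UTC runs out in January 2038. A quick check of the rollover instant:

```python
from datetime import datetime, timedelta, timezone

# Largest value a signed 32-bit time_t can hold.
INT32_MAX = 2**31 - 1  # 2147483647 seconds

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
rollover = epoch + timedelta(seconds=INT32_MAX)
print(rollover)  # 2038-01-19 03:14:07+00:00
```

One second after that, a 32-bit `time_t` wraps negative, which is why old apps and embedded systems still carrying the 32-bit type are the worry.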
What do you expect? Most of the guys in "DOGE" weren't even alive on 9/11.
I'm a bit surprised that they still have something in COBOL; maintenance probably costs a fortune. Good luck finding young COBOL devs.
I'm ready to learn COBOL. I will take up the torch. If you know good places to start, let me know. Last time I looked into it, it seemed way more involved than running stuff like Python, Java, and C.
Conspiracy theory: They know this, but being able to make this claim to their followers, who know NOTHING about programming, with official records to show for it, is an easy, effective win for them. They can claim fraud to their gullible audience and now have records they can point to and say "LOOK! THEY'RE GIVING DEAD PEOPLE SOCIAL SECURITY MONEY!"
This is why you come up with a hypothesis before running an experiment and collecting data. Otherwise you can pick a pattern in the data to propose just the right hypothesis.
The issue isn't inherently age; it's time and experience: understanding the different coding patterns and paradigms that have changed over the years. Even someone who's been coding every day from ages 14 to 20 can't have the same knowledge and experience as someone who's been working with software since the '90s or earlier. Granted, there will always be brilliant people who, even lacking experience, are more talented and skillful than the majority, but that is uncommon.

I'm only in my late 20s, and I remember in college there was a huge diversity of skills, from "are you sure this career path is really a good idea for you?" to "holy hell, how did you do all of that in one hackathon?" But even those really smart folks aren't just going to inherently understand all the different ways to organize and structure code, all the conventions that exist, and, more importantly, why those methods and structures exist and the history that informed them. I'm not saying you need on-the-ground experience (although many people do, as many can't really internalize things without direct exposure), but there's literally not enough time in the handful of years that is childhood and the teenage years to absorb all that history.
Anyway, what I'm getting at is that, yes, I agree the problem isn't inherently about being teenagers, but I do think it's a valid criticism that it's kind of ridiculous to have such young folks leading this kind of project, given it's literally impossible for them to have the same amount of experience as software vets. It's also valid that young people are capable of seeing things in very new ways, since they aren't weighed down by all that history. But that's why diversity is useful, especially for such a monumental project as this.
I don't know how many teenage programmers you have interacted with recently, but they are generally just learning the basics, learning core concepts, experimenting, etc...
There is a huge gap between making small, sometimes very cool and creative even, projects and understanding a giant legacy codebase in a language that is not taught anymore. I mean, even university grads often have trouble learning legacy code, much less in COBOL.
You wouldn't say your average teenage cook could make a gourmet meal for a house of 50 people 😅 not a dis, just they haven't had the time to get to greybeard level yet
This is why, if they heavily modified the code in such a short time when they couldn't even understand it: it proves there was a previous data breach and they're just installing the pre-written patches…
the smoking gun that i can’t explain to anyone
How many teens you think can actually read and understand legacy languages like FORTRAN and COBOL? Let alone a complex codebase written in them?
I studied COBOL a bit in college, and it's not exactly hard to read short snippets if you understand other languages, but good luck wrapping your head around anything remotely complex and actually understanding what it is doing without someone who knows the language. Hell, 15-20 years on and multiple languages later, my eyes still cross trying to read and grok COBOL. The people supporting those old codebases get paid well for a reason...
I'm familiar with a dozen or so teenage romhackers. Assembly is surely harder to get the big picture of than COBOL, but they're making incredible changes to 30-year-old video games.
I think it makes sense that people who don't have actual experience in making projects in a specific language won't be aware of details such as the value 0 being the default in a certain kind of field in a certain language which makes it a good flag for "data unknown".
This is not a problem specific to teenage programmers. It is natural for just about everybody to not really know the ins and outs of a language and its best practices when they just learned it and haven't actually been using it in projects for a year or two at least.
What's specific to teenagers (and young coders in general) is that:
They're very unlikely to have programmed with COBOL for a year or two, mainly because people when they start tend to gravitate towards "cool" stuff, which COBOL hasn't been for 4 decades.
They haven't been doing software engineering for long enough to have realized the stuff I just explained above. In their near-peak Dunning-Kruger phase in the software engineering field, they really do think that learning to program in a given language is the same as having figured out how to use it properly.
The whole "COBOL's default date is 1875" thing is just a lie. COBOL doesn't even have a date type.
So the problem doesn't have anything to do with COBOL, someone just made it up
my brother taught me to code when i was 6, so at 19 i had 13 years of experience already. At 6 i was mostly doing simple stuff like qbasic, vb6, but still it adds up. I'm not saying I'm a great coder, not by a long shot, just that I was experienced as a teenager. I assume a lot of these teenagers are much better than i was.
I've been surprised multiple times by coworkers who don't know the significance of midnight January 1st 1970... We support an embedded Linux device, among other things...
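For anyone who missed that reference: midnight January 1st 1970 UTC is the Unix epoch, the zero point from which Unix timestamps count seconds, so a zeroed timestamp field decodes to exactly that instant:

```python
from datetime import datetime, timezone

# Unix time 0 is, by definition, 1970-01-01T00:00:00 UTC.
t0 = datetime.fromtimestamp(0, tz=timezone.utc)
print(t0.isoformat())  # 1970-01-01T00:00:00+00:00
```

This is also why uninitialized or defaulted timestamps on Unix-like systems so often surface as "January 1, 1970" in user-facing data.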
I dont even program and i could've told them it was probably a placeholder or default value lol "durrrrrr lot of people in this database were born at the exact same time on the same day in the same year that predates electronic databases, gotta be fraud!!1!1!11"
2016-2020 was the age of too stupid to break everything. Now we're staring down the barrel of "The files are in the computer?" But the entire US government is the computer.
Not only do many important government systems ultimately rely on or make heavy use of COBOL...
So do many older private companies.
Like banks. Account balances, transactions.
It's actually quite a serious problem that basically nobody who needs to take it seriously actually does.
Basically no one is taught COBOL anymore, but a huge amount of code that undergirds much of what we consider 'modernity' is written in COBOL, and all the older folks that actually know COBOL are retiring.
We're gonna hit a point where the COBOL parts of a system need to be altered or maintained, and... there just isn't anyone who actually knows how to do it.
Yeah, I've been tempted to try this route, but you're really pigeonholing yourself. Even if there's always work, I can't imagine only working with COBOL for the rest of my career.
Even worse, the places still using this are very heavy in process, with many undocumented dependencies among many undocumented workflows and business processes. Modernizing COBOL is not a coding problem: it's a mammoth project management, coordination, and paperwork project that also has a little bit of coding. And it's not like you can write clean code; you need to write essentially the same tangled mess of changes accumulated over decades, because there's no way of knowing everything that might break.
In my experience in the legacy world we have the isHighDate function which not only checks the constant, but also 5 other edge cases where the value isn't HIGH_DATE but should be treated as if it is.
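A hedged sketch of the pattern described above, in Python pseudocode: one helper that recognizes the canonical "high date" constant plus the extra edge-case values that accumulated over the years. Every name and value here is illustrative, not from any real system:

```python
# Canonical "no end date / open-ended" sentinel.
HIGH_DATE = "99991231"

# Edge cases that, in practice, also mean "treat as high date":
# legacy migrations, truncated fields, and plain bad data.
LEGACY_HIGH_DATES = {"99999999", "99991230", "00010101", "000000", ""}

def is_high_date(value: str) -> bool:
    """True if this field should be treated as an open-ended date."""
    return value == HIGH_DATE or value in LEGACY_HIGH_DATES

print(is_high_date("99991231"))  # True
print(is_high_date("20250101"))  # False
```

The point of centralizing the check is exactly that: callers never compare against the raw constant, because the constant alone is not the whole story.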
More specifically, they didn't find anyone receiving Social Security who was 150 years old, because they didn't prove anyone was receiving anything; that's not the purpose of that database.
It's because that explanation isn't correct. The real deal is that you just have entries without a death date, so if you run a query you get super old ages as a result.
Note that isn't a database of payments or even people eligible for them, just a listing of 'everyone' with a SSN. There is a separate master death index. In the old days, wild west kind of stuff people would disappear so the death date would never get entered. Modern days every morgue and funeral home has to legally notify SS when someone dies, there is a specific form and major hell to pay if you don't do it.
Social Security numbers were first issued in 1937. You would have needed someone to be over 110 in 1937 to have an age over 200 today. I think it's a combination of birthdays entered wrong plus no official death date.
Also a lot of people between 110 and 150, so I'm sure there is a larger answer.
However, Social Security cuts off at 115, and they supposedly found like 10 million people older than that. Considering there are only ~50m people on Social Security, and the database they were searching wasn't even about current recipients, most people would conclude that there is likely an error in data, rather than immediately jump to fraud. Of course, ketamine is a hell of a drug and Elon is not most people.
As someone who is working on a project of recreating an enterprise application in a modern tech stack, the legacy code is hard to understand too.
We have something similar in that a ClaimClosedDate is defaulted to 01/01/1900 and if it has that date it means it’s not closed whereas now that would be a nullable field.
Jay Kuo is worth following. I know people dislike Substack, but a lot of decent people publish there. Jay Kuo is good at breaking down the legal aspects of the mess we are seeing now.
I share this sentiment, and I admit I am not entirely aware of the Substack situation, tbh. The issue is that these people, often very good journalists, need a platform that generates revenue for them. I am not American, but I find that people like Kuo, Jennifer Rubin, Robert Reich, and Joyce Vance give valuable insights on American politics, and "sound bites" that can be shared. But they need our financial support to do this.
COBOL doesn't have a date type. And there were "people" in that "list" that weren't just 150 years old, and they varied in ages.
The real answer is that the list that they're saying is people getting social security, isn't the list of people getting paid, just lists of random ages in the database, which ultimately means nothing.
1875 has never been an epoch anywhere, on any system. 1970 has. 1900 has. 0000 has. But 1875? No, it hasn't. And nowhere in the COBOL spec does 1875 appear.
This is just propaganda. He already does enough wrong, you guys lying about it just makes everything else you say suspect.
A lot of people online are calling it a reference date, whatever that means. An epoch date doesn't even make sense, since there isn't really a date-time type. I can see reason to doubt, but that's not relevant.
However, it could be important to the app. Perhaps at some point they decided there needed to be a cutoff because anything older was bad data. Or perhaps, back in the days when storage was extremely expensive, it was important to save a byte for every row.
Even if the specific claim about 1875 is wrong, that doesn't change anything. The reality is that bad data exists, there doesn't seem to be any indication of it being paid out, and the claim of fraud is asinine.
I was intrigued by this article, as 1875 isn't any epoch I've heard of, and I didn't think COBOL really had a concept of an epoch (an epoch matters when you're counting seconds or milliseconds from zero; COBOL stores date/time info differently). I've been searching this morning and can only find the Wikipedia page mentioning that date, which is weird for an ISO standard that is 99% about date formatting.