“Whatever happened with the ozone layer panic, if scientists are so smart?”
We listened to the scientists, and the problem went away.
It didn’t go away; it just stopped getting worse at an alarming rate.
It’s the same as people using the example of the Y2K bug being a non-event. Yeah, because hundreds of billions of dollars were spent globally fixing it before it became an event.
No
Get that marble-brain Reddit-style bs outta here. If you wanna deny, you’re gonna have to come up with a reason that you could be right. Otherwise, we’re just gonna point and laugh at your dumbassery.
When you do things right, people won’t be sure you’ve done anything at all.
Y2K is similar. Most people will remember not much happening at all, precisely because lots of people worked hard to solve the problem and prevent disaster.
Was there ever really a threat to begin with? The whole thing sounds like Jewish space lasers to me.
Edit: Gotta love getting downvoted for asking a question.
You’re probably getting downvoted because you asked here instead of a search engine, many people think it’s common knowledge, and it was already answered in this thread.
Sometimes an innocent question looks like someone JAQing off.
Sounds like a great way to keep people from interacting at all.
Similar with Y2K — it was only a nothingburger because it was taken seriously, and funded well. But the narrative is sometimes, “yeah lol it was a dud.”
I can’t remember the name, but I think this is some kind of paradox.
Like, the preventative measures were so effective that they created a perception that there was no risk in the first place.
The question is, what will happen in 2038 when Y2K happens again due to a signed 32-bit integer overflow in Unix timestamps? People are already sounding the alarm, but who knows if all of the systems will be fixed before it hits.
It’s already been addressed in Linux, not sure about other OSes. They doubled the width of the time value from 32 to 64 bits, so now you can keep using it for roughly 292 billion years, about 20 times the current age of the universe. If you’re around then.
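For the curious, here’s a minimal C sketch of the 2038 wraparound, assuming the classic Unix convention of a signed 32-bit count of seconds since 1970-01-01 UTC. The wrap is computed through unsigned math because overflowing a signed int directly is undefined behavior in C:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Classic Unix time: signed 32-bit seconds since 1970-01-01 UTC. */
        int32_t t32 = INT32_MAX;  /* 2147483647 == 2038-01-19 03:14:07 UTC */

        /* One second later the counter wraps to the most negative value,
           which decodes as 1901-12-13 on two's-complement systems. */
        int32_t wrapped = (int32_t)((uint32_t)t32 + 1u);

        printf("last 32-bit second: %d\n", (int)t32);      /* 2147483647  */
        printf("one second later:   %d\n", (int)wrapped);  /* -2147483648 */

        /* The fix: a 64-bit count keeps working for ~292 billion years. */
        int64_t t64 = (int64_t)t32 + 1;
        printf("64-bit keeps going: %lld\n", (long long)t64);
        return 0;
    }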
Y2K specifically makes no sense though. Any reasonable way of storing a year would use a binary integer of some length (especially when you want to use as little memory as possible). The same goes for manipulations; they are faster, more memory-efficient, and easier to implement in binary. With an 8-bit signed integer counting from 1900, the concerning overflows would occur in 2028, not 2000. A base-10 representation would require at least 8 bits to store a two-digit number anyway. There is no advantage to a base-10 representation, and there never has been. For Y2K to have been anything more significant than a text formatting issue, a whole lot of programmers would have had to go out of their way to be really, really bad at their jobs. Also, usage of dates beyond 2000 would have increased gradually for decades leading up to it, so the idea it would be any sort of sudden catastrophe is absurd.
The issue wasn’t using those dates in data. The issue was the computer believing it was currently one of those dates, which is why it would hit all at once rather than gradually.
I’m going to assume you aren’t old enough to remember, but the “only two digits to represent the year” issue predates computers. Lots of paper forms just gave two digits. And a lot of early computer work was just digitising paper forms.
I remember paper forms having “19__” in the year field. Good times.
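To make the failure mode concrete, here’s a toy C sketch of the kind of arithmetic those two-digit records broke. The helper name is made up, but this pattern showed up everywhere in age, interest, and expiry calculations:

    #include <stdio.h>

    /* Two-digit year, exactly as it appeared on the paper forms:
       99 means 1999, so what does 00 mean? */
    static int age_from_two_digit_years(int current_yy, int birth_yy) {
        return current_yy - birth_yy;  /* silently assumes the 1900s */
    }

    int main(void) {
        /* Fine through the end of the century... */
        printf("age in '99, born '65: %d\n", age_from_two_digit_years(99, 65)); /* 34  */
        /* ...then the rollover: '00 reads as 1900, not 2000. */
        printf("age in '00, born '65: %d\n", age_from_two_digit_years(0, 65));  /* -65 */
        return 0;
    }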
With an 8-bit signed integer counting from 1900…
Some of the computers in question predate standardizing on 8 bits to the byte. You’ve got a whole post here of bad assumptions about how things worked.