The Y2K+38 Crisis

In 2038 we’ll likely be weaving tales for our grandkids about how we used to instant message with paper notes instead of our brainwaves and how, when we were really little, stereos were once considered nice furniture pieces. They may be especially interested because, if Richard Wilcox is right, all the really important computers just dialed back to 1901.
You couldn’t say Wilcox didn’t give us ample warning. This link is supposed to lead to a post he made in 2003, describing how our 32-bit world is destined to crash on a level a bit more catastrophic than the trumped-up worries of December 1999. Unfortunately, thanks to an unexpected surge of Reddit users recently discovering the post for themselves, EarthLink has pulled the plug for exceeding the monthly traffic allotment.

Here’s the cached copy.

Impending crises need at least a 30-year run-up—we knew there’d be an energy crisis for at least that long, right?—and it’s not too early to sound the alarms on this one. It may sound unlikely; we’re already Moore’s Law generations past 32-bit systems, and just think what we’ll be running in three decades.

I generally preferred words to math and programming, so talk of “signed integers” and “time_t” is lost on me, but I did gather from Wilcox’s post that at eight seconds past 3:14 a.m. on January 19, 2038, most computers in the world will think it’s actually a quarter to 9 p.m. on December 13, 1901.

That’s a big, big problem, and apparently a more complicated one than what we faced in 2000.

“So, if all goes normally,” writes Wilcox, “19-January-2038 will suddenly become 13-December-1901 in every time_t across the globe, and every date calculation based on this figure will go haywire. And it gets worse. Most of the support functions that use the time_t data type cannot handle negative time_t values at all. They simply fail and return an error code. Now, most ‘good’ C and C++ programmers know that they are supposed to write their programs in such a way that each function call is checked for an error return, so that the program will still behave nicely even when things don’t go as planned. But all too often, the simple, basic, everyday functions they call will ‘almost never’ return an error code, so an error condition simply isn’t checked for.”
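
For readers who do speak C, here is a minimal sketch (a hypothetical demo, not code from Wilcox’s post) that models both halves of that warning: the wrap from the last representable 32-bit second back to December 1901, and the error return that so often goes unchecked:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void)
    {
        /* How wide is time_t on this build? 32 bits means trouble in 2038. */
        printf("time_t here is %zu bits\n", sizeof(time_t) * 8);

        /* The largest value a signed 32-bit counter can hold... */
        time_t last = (time_t)INT32_MAX;    /* 2038-01-19 03:14:07 UTC */
        /* ...and the value it wraps around to one second later. */
        time_t wrapped = (time_t)INT32_MIN; /* 1901-12-13 20:45:52 UTC */

        printf("last 32-bit second: %s", asctime(gmtime(&last)));

        /* Wilcox's second point: conversion functions may simply fail on a
         * negative timestamp, so the return value must be checked. */
        struct tm *after = gmtime(&wrapped);
        if (after == NULL)
            printf("gmtime() rejected the negative timestamp\n");
        else
            printf("one second later:   %s", asctime(after));
        return 0;
    }

On a build whose time_t is still 32 bits, that second conversion is the one that goes haywire or fails outright, exactly as the quote describes.
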
Why won’t we be beyond 32-bit by then? Also included in the original post is an explanation of the expense of building new systems and how computer companies tend to rely on cheap, older technology when building new products. In this case, some technology in use in 2038 might date all the way back to the 1970s.

Hey babe, dust off Dad’s old 8-track so we can share some Earth Wind & Fire magic with the grandkids.

Comments
  • sampat

    Hi, isn’t it possible to change the definition of time_t to use a 64-bit type? That could solve the problem for a long time. (A sketch of exactly that appears at the end of this thread.)

    • Guest

      Weird… does your suggestion mean running part of the program on 32-bit and the rest on 64-bit? The whole program would need to be ported. But as already said… it’s the legacy code… billions and billions of lines…

      • Guest

        Oh My!
        billions and billions of lines….
        Oh My!

        Waaa.

        • Guest

          When Y2K was announced… nothing happened.

          I just set my computer time to 9 Jan 2039 for the heck of it, and guess what happened.

          “The certificate for the website has expired. Would you like to continue using it?”

          Wow, I saw the whole universe collapsing in front of my eyes as I pressed “yes,” and nothing else happened.

          People who write things like this are just trying to scare the entire industry so they can offer their services and make a buttload of money spending 35 seconds fixing the “biggest problem evahr”.

          • Guest

            Now picture someone waiting to receive a very important shipment on Jan 21, 2039.

            BOOM! The system never serves up the order.

            THAT’S what’s at stake here. Not whether your computer can still run your favorite website or not.

    • http://www.squidoo.com/slow-computer-speed-up Guest

      It is. In fact, on my Linux machine time_t is already 64 bits long. The article is a little incorrect.

      • http://www.squidoo.com/slow-computer-speed-up Guest

        Sorry, this was a reply to
        http://www.webpronews.com/topnews/2009/01/08/the-y2k38-crisis#comment-49131

        • Guest

          Oh, darn, I was correct the first time and wrong the second time :)

    • Brian

      time_t is 64-bit by default on Microsoft compilers (C++), which can cause problems in its own right if an int or long is used with legacy code. Currently we redefine it to 32-bit, and it becomes our kids’ problem. Well, they need to earn their inheritance…!
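
Brian’s truncation worry is easy to picture. A hypothetical sketch (plain C, using the Windows convention that long stays 32 bits even in 64-bit builds; the redefinition he describes is presumably the CRT’s _USE_32BIT_TIME_T macro, which is honored only in 32-bit builds):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);   /* 64-bit by default on modern MSVC */
        long narrowed = (long)now; /* the legacy habit: a time in a long */

        /* On Windows, long is 32 bits even in a 64-bit build, so past
         * 2038-01-19 these two values silently disagree. */
        printf("time_t value: %lld\n", (long long)now);
        printf("as a long:    %ld\n", narrowed);
        return 0;
    }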

    • http://www.inconcertcc.com ichramm

      I think we have a lot of time from here to 2038 to solve this problem, don’t we?

      Maybe you or I will be dead by that day.

      We went from 16 bits to 64 bits in less than ten years.

      Well, that’s all…

      Regards

      • Guest

        Well, ‘home PCs’ may have been 16-bit 10 years ago, but others were not…
        IBM 360 – 32-bit in 1965
        CDC Cyber – 60-bit in 1978
        DEC VAX – 32-bit in 1977 (Unix was ported to it in 1978… without updating the time_t definition)
        I could go on, but the point is, business upgrades take decades while the home market (as young as it is) grows very quickly.
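
As for sampat’s question at the top of this thread: yes, and here is a hedged sketch of what that switch eventually looked like on Linux. (It assumes a much newer glibc, 2.34 or later, and changes anything only on 32-bit builds, where time_t would otherwise stay 32 bits.)

    /* Defining these before any system header asks glibc for a 64-bit
     * time_t (and the 64-bit file offsets that option requires). */
    #define _FILE_OFFSET_BITS 64
    #define _TIME_BITS 64

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        /* 64 here means the 2038 rollover is simply out of reach. */
        printf("time_t is %zu bits\n", sizeof(time_t) * 8);
        return 0;
    }

The catch is the one this thread already names: every library that passes a time_t across its boundary has to be rebuilt the same way. The typedef is easy; the billions and billions of lines around it are not.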

  • Andy

    Or use a proper language like .Net, which goes up to Dec 31st, 9999…

    • http://www.squidoo.com/slow-computer-speed-up Guest

      .NET isn’t a language, and calling C# (which I assume you meant by .NET) a proper language is a highly questionable statement.

      • Guest

        So you doubt that C# is a proper language?! Then what is your profession? Just curious…

      • Guest

        C# isn’t the only language in .NET; there are multiple languages, but all of the libraries are derived from C++.

  • Guest

    rofl, it was a New Year’s joke ;)

  • Ando

    I’m a new coder, just 2 years out of college. It’s a pretty safe bet I’ll still be coding in 2038. But I’m not panicking. Nor am I ignoring the problem. Why?

    Because people tend to make a huge deal out of things like that. Have we really become so dependent on computers for our lives that we think the end of the world is coming just because our coding ancestors were short-sighted and made their date datatype in such a way as to cause problems years down the road?

    I mean, I code now on web pages with the full belief that in 5 years nothing I code will still be in use. Why wouldn’t they have believed the same thing? When languages like C and C++ were being created, do you think the creators really believed that by the year 2038 they’d still be used?

    Seriously, the world is not going to end. As for programs crashing – yeah, that might happen, but a couple of things make me not fear it as much as some. Firstly, it’s almost 30 full years away, and with all that time, I would imagine we’d find a fix for it as an industry, and even the most old-fashioned legacy sticklers will either have to suck it up and use it, or be ousted by angry stockholders who rightly accuse them of apathy in the face of disaster.

    Secondly, let’s say a large percentage of systems do go down. We’ll have problems. The IT job market will become a gold mine as people scramble to rebuild, and when the dust settles, a lot of people will be employed, smarter, and our systems will hopefully then be date-bulletproof, and the old legacy systems will be where they belong – in the *PAST*!

    We cling far too tightly to the past, just because it’s ‘easier that way’, rather than being willing to take a risk now and then and stay current. Maybe a little shakeup of the old ways is the best thing that could happen to some companies who are a little too set in their ways.

    I guess we’ll find out in 30 years, eh?

    • Bob Wilson

      I

    • Guest

      Well, Ando… like you said yourself, you are a two-year-old coder and you don’t have much experience in the field… When you say “people tend to make a huge deal out of things like that”… well, it’s true, and as a professional you should too… and yes, we are dependent on computers… not the guy who runs one at home, but businesses, hospitals, schools and so on.

      Another thing you said: “I code now on web pages with the full belief that in 5 years nothing I code will still be in use”… well, you are totally wrong… You think businesses will invest massive money in applications that will last only 5 years? If it were like that, a lot of businesses would go bankrupt… I have customers who still run applications developed 10 years ago, and they don’t need to invest more funds because the apps are doing the job. Another one runs an app in VB6, and a group of 40+ users works with it all day long… imagine… you were probably in diapers when VB6 was launched!

      So think before reacting… that’s the problem with your generation… You never take the time to think…

      I’m 42 and I had the privilege of living through the rise of computers, developers, the web… and I would never trade that for anything… Don’t you ever forget… experience is everything.

      • Ando

        Firstly, I was *not* in diapers when VB6 was released. My first experience with non-DOS-based programming was with VB6. Yes, I admit I was a teenager at the time, but don’t condescend to me. I cut my programming teeth on GW-BASIC before Windows 95 was even released, so don’t lecture me about not knowing anything pre-Java or any such.

        And by the way, I didn’t say I was a 2-year coder, I said I’m 2 years out of college. I started learning to write programs in 3rd grade, so don’t treat me like a baby.

        I am very concerned for society at large, honestly. No, I don’t believe we should cut computers back out or anything, but we rely on them so much that all it takes is one common glitch and the sky is falling.

        I do think before reacting. I think people are too quick to panic. Like people who freak out at the prospect that in 3,000 years a huge comet or meteor might hit Earth. Or that even longer away our sun might cool and expand and destroy the Earth. Or that in 30 years our computers might fritz out.

        My whole point is not that these things don’t matter, it’s that you’ve got time to think up a possible solution. And my further point is that when it comes to the computers, maybe I was a bit hasty with the whole 5 years thing, but surely 30 years is long enough to move on to better systems, much less 60. If we’re still fighting off the collapse of COBOL after 60 years, surely it’s time to let the poor thing go. So, it’s not reasonable to ask businesses to sink millions into upgrading every few years; I apologize for saying that. But I don’t apologize for asking for it once in a generation, or even every 30 years. Let it go.

    • Guest

      I’m sorry guy, but you are talking like a kid who had no contact with the business world when Y2K hit us with a problem caused by our coding ancestors in the 70s and 80s. It WAS a big deal when it hit… especially for medium-sized businesses like mine that still had an AS/400 at the heart of their financial center. What is even funnier is that I made the same noise you did in the days before Y2K. I think part of the problem is that when business people talk about doom on this scale they mean financial impact, while young guys like you think we mean planes falling out of the sky and dystopia.

      • Guest

        People have all kinds of misconceptions about computers crashing! Sure, for some it means their finances going to the wrong accounts, and for some, just planes falling from the skies. Maybe our best bet is to run simulators, place ourselves there, and see what we have to deal with now, before then. I believe “statistics is lies,” but most gadgets that we are so proud of today stand because of statistics and multiplexing! So let’s get real and face the hard facts. Don’t build crappy bridges just because you will be dead when they collapse!

  • jason

    Can’t this be tested now? Couldn’t we just set the system date to 2038 or whatever and watch the sparks fly?

    Just asking. I’m a programmer, but I didn’t actually read Wilcox’s article.
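
It can be probed without touching the system clock at all: hand the conversion functions a date past the boundary and see what comes back. A minimal hypothetical sketch in standard C (note that mktime works in local time, so the exact cutoff shifts a few hours with your timezone):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        /* Build the first "bad" second by hand: 2038-01-19 03:14:08. */
        struct tm when = {0};
        when.tm_year  = 2038 - 1900; /* tm_year counts from 1900 */
        when.tm_mon   = 0;           /* January */
        when.tm_mday  = 19;
        when.tm_hour  = 3;
        when.tm_min   = 14;
        when.tm_sec   = 8;
        when.tm_isdst = -1;          /* let the library decide on DST */

        time_t t = mktime(&when);
        if (t == (time_t)-1)
            printf("mktime() can't represent that date: the 2038 limit lives here\n");
        else
            printf("no sparks on this build: %s", ctime(&t));
        return 0;
    }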

  • Guest

    I work for a major company in one of the financial industries, and the truth is there is a lot of COBOL code still in use that was written in the ’70s. I can honestly see where some of the code one of my teams writes today could be in use in 30 years. Guess I know what to tell my grandchildren to study!

  • Guest

    Either Windows or C; either way, just patch the kernel so that time_t starts at 2010, or even 2030, instead of 1970. I have to think that starting at 1970 is just a kernel issue. We are all used to patches by now. Just patch the kernel of your OS, or your programming language.

  • Dusani

    Here is my prediction – you can quote me on this in 30 or so years.

    The problem won’t get fixed, because it doesn’t need to be – we’ve got other problems ahead of us. But in about 30 years, businesses will start to spring up offering some kind of middle-tier solution that maps legacy 32-bit code onto the latest n-bit platforms (64-bit? 128? more?). This will prove cheaper than reinventing the software from scratch on the newest platforms, so it’s the route most financial institutions will take – and whoever is making these middle-tier solutions will get LOOOAAAADDDEDDD.

    This way to go might sound too weird right now, but I’m sure my iPod Touch would have sounded weird to someone 30 years ago :)

  • Guest

    There is COleDateTime for Microsoft users, which uses a 64-bit time variable. I would assume that by 2038 we will be on to bigger and better things and may be into 128- or 256-bit computing. Using Moore’s Law and Kryder’s Law as a rough guide, computing power should be at least 10X what it is today, hard drives (or storage of some form) should be approaching the petabyte, and displays should be displaying in the neighborhood of 10 or 12 million pixels with 36-bit color.

    This problem will not affect home PCs. What it might affect is embedded systems using C/C++ that could possibly still be around in 30 years.

  • Guest

    On Windows-based systems, the 100-year window used to interpret two-digit years can be set. The default is 1980–2079.

    • Guest

      Default for which Windows? XP defaults to 1930–2029, I believe.

  • Guest

    Comments, comments, comments…

    yet nobody has said:
    “My project(s) are good.”

    “I will check my code; it will pass Y2K+38 by the end of the month.”

    “I vow to check this on any new code I make.”

    I say this only because last night I was coding in XNA (C#/.NET) and discovered that even after they put out the first 3 releases, they still hadn’t corrected a basic matrix method…

    http://forums.xna.com/forums/p/6674/35269.aspx

    So yeah maybe we should be talking about perfecting our own code….

    • Guest

      No – people should keep going on like they are now; then I can come out of retirement and charge big bucks to fix legacy applications, like the COBOL people did for Y2K.

      After that little hiccup in the stock market, this is my new retirement plan.

  • Guest

    I should be so smart as to be able to write code that works for 70 years!

  • Guest

    All this mongering is useless… are you all suggesting that (by the time it’s 2038) code written 60 years ago will still be in use?… That’s like saying a VCR will still be the homeowner’s choice for watching movies…

    • Guest

      I have to assume that your (inaccurate) analogy is facetious. We are currently using plenty of code from the 1970s. Even if we upgrade the hardware, the software won’t “automatically” start using larger numbers anyway.

      • Guest

        By 2030, retrodecorecompilers will be in use to transform any Windows program and its data from any bit size to any other bit size. I say Windows because there will be no other OS by 2025.

        I remember hearing in the late 1980s that programmers were going to be obsolete due to the introduction of drag-and-drop-style modular program thingies (like Access et al.) that anyone could use (including well-trained parrots), and we all know how that turned out!

        But true, disaster awaits those who don’t plan ahead and insist on using old software for sentimental reasons (or laziness and cheapness). I know one company relying on software written in 1998 by one guy who didn’t live to see 2004, currently supported by ONE person worldwide.

        Some people don’t get it. If your business relies on some software, keep up to date, or migrate if support and updates stop. Crunchy time cometh!

    • http://www.stonerscolony.com FaTe

      Well said. If we are indeed still reliant on technology from this period (or earlier) in 2038, then I think everyone should turn right and slap the person next to them.

  • Guest

    Doubtful that quantum computers will suffer from this issue. More likely the “kill all humans” scenario.

    • Guest

      Quantum computers don’t solve this type of problem at all. You still have bit-widths for quantum computers, hence data types and data sizes are still just as relevant.

      They just do the math quicker.

  • Borg drone 3 of 10

    …you will be assimilated

  • http://tyrannogenius.blogspot.com Guest

    To the pooh-poohers who say that by 2038 we will have advanced beyond the 32-bit Unix system, note this from the author:

    The problem with this kind of optimism is the same root problem behind most of the Year 2000 concerns that plagued the software industry in previous years: Legacy Code. Developing a new piece of software is an expensive and time-consuming process. It’s much easier to take an existing program that we know works, and code one or two new features into it, than it is to throw the earlier program out and write a new one from scratch.

  • JimL

    Now, time_t is 64-bit. If we recompile the application using a newer version of the compiler, that will push the end date out quite a bit.

  • http://www.ssrichardmontgomery.com ron

    This whole thing does not worry me in the least. Why?
    I will probably be DEAD by then… unless it affects the computer inside me keeping me alive AND sexually active, or there’s no point (grin).

  • http://andreasdevblog.blogspot.com Andreas Marschke

    What if I change the date settings on my Linux boxes to that particular date? Will they crash too, and if they do, how could I reset them?

  • Sarah Conner

    I say…

    Judgement Day is coming… prepare yourselves people!

    • http://www.diamondonnet.com/ Diamonds

      Not if we get hit by a comet in 2036, as predicted by scientists; then we are fine. Oh wait, we’re screwed…

  • Guest

    I was working with a cross-platform app worth 10k+ when a customer started complaining about crashes on startup. After trying to figure out the cause for a few days, the customer realized it was because he’d set the clock forward 100 years by accident. Sure enough, I tried the same and BOOM on startup. I’m going back about 8 years here, but the point is that even though you think it shouldn’t make a difference, it can.

  • Guest

    Is this why so much spam comes through with a 2038 date on it???

  • http://www.ryumaou.com/hoffman/netgeek/ Network Geek

    Well, in 28 or 29 years, Office Space can be updated for a new generation of frustrated cubicle farm gophers.

    We do tend to repeat history, don’t we?

  • Guest

    This is fine by me… since I suspect by 2038 the machines will have taken over and will be in the middle of their plot to kill us all. So once the 32-bit date bug hits, it will be our chance to strike and take them down!

  • http://www.franportal.com/ Franchise Opportunities

    And I thought the 32-bit IPv4 issue was big.