Y2K the new millennium? NOT.


snafu

Has anybody noticed that Y2K is not the new millennium? That's right, the year 2000 is the last year of the second millennium. The first year A.D. was 1, not 0, so the last year of any decimal grouping of years ends in a zero, e.g. 10, 100, 1000, 2000, etc.
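Here's a minimal C sketch of that arithmetic, assuming the usual no-year-zero counting (the function name is just for illustration):

    #include <stdio.h>

    /* With no year zero, the Nth millennium runs from year
       (N-1)*1000 + 1 through year N*1000 inclusive. */
    int millennium(int year) {
        return (year - 1) / 1000 + 1;  /* integer division */
    }

    int main(void) {
        printf("2000 is in millennium %d\n", millennium(2000));  /* prints 2 */
        printf("2001 is in millennium %d\n", millennium(2001));  /* prints 3 */
        return 0;
    }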

Please don't use the Y2K bug as an argument in favor of this error; it's only a problem because of lazy programmers who insisted on using the two-digit date format. It has nothing to do with the millennium change.
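For the curious, here's a minimal C sketch of that failure mode (the expiry scenario and names are invented for illustration): with only two digits stored, the year 2000 compares as earlier than 1999.

    #include <stdio.h>

    int main(void) {
        int yy_today  = 99;  /* 1999, stored as two digits */
        int yy_expiry = 0;   /* 2000, stored as two digits */

        /* The comparison comes out backwards: "00" sorts before "99". */
        if (yy_expiry < yy_today)
            printf("Expired!\n");      /* wrong: 2000 is after 1999 */
        else
            printf("Still valid.\n");
        return 0;
    }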

I would enjoy debating anybody with opposing viewpoints.

Cheers.

 
considering that we didn't start the calendar with year zero, 100 years after year one is year 101
the millennium really changes to the new one on Jan 1, 2001

which is why the movie was called 2001, not 2000

but most of the social impact comes from rolling the zeros over...

dZ
 
dz,

Exactly. Arthur C. Clarke would be proud of you.

The craze is really being driven by the media and business, both of whom should reasonably know better, but have an interest in pushing the issue.

Stand by for what'll happen later in 2000 - the new craze will be "The Real Millennium Rollover" and a whole new campaign to make money.

snafu
 
Please don't blame "lazy" programmers for the Y2K problem. I've been programming for a living since 1967. We had to set up an inventory system for the Marine Corps PXs (about the size of a Wal-Mart), and we had all of 8K to work with (your PC has 8 meg and up). Memory was so tight, we had to subtract one number from another and then check for a zero balance instead of comparing the numbers -- it cost EIGHT bytes less memory that way. Besides, we all thought the programs would be replaced in a few years, not still be running 30 years later. Management tried to "save" money by keeping the old programs around LONG after the bigger machines came in.
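For illustration only, here's a rough C sketch of the compare-versus-subtract logic; the variable names are made up, and on a modern compiler both forms generate identical code -- the eight-byte saving came from the instruction encodings on that particular old hardware.

    #include <stdio.h>

    int main(void) {
        int qty_on_hand = 12, qty_ordered = 12;  /* hypothetical inventory fields */

        /* Direct compare -- on the old machine, the costlier encoding. */
        if (qty_on_hand == qty_ordered)
            printf("match (compare)\n");

        /* Subtract, then check for a zero balance -- the same logic,
           but it reportedly encoded eight bytes smaller back then. */
        if (qty_on_hand - qty_ordered == 0)
            printf("match (subtract and test)\n");

        return 0;
    }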
 
Oatka, I respectfully recant my "lazy" comment. However, you'll probably admit that it was, at the very least, shortsighted of DOS/Windows programmers not to use 4-digit dates, especially in mission-critical payroll and financial systems.

I began programming in the '80s and noticed erratic results when I played with date calculations myself -- and I was just a dabbler. It just seems that the big software companies were asleep at the switch.

Or were they? There's a lot of money in the upgrades. ;)
 
Most of the upgrades are free, if I'm not mistaken -- at least from Microsoft, anyway.

While y2k is certainly being hyped quite a bit, there are potential problems. A local (to me) sewage treatment plant tested their y2k preparations and dumped a 1500' by 500' pool of crap on a public road. Hopefully, this time next year, y2k supplies, such as guns, will be cheap.
 
Actually, most of the upgrades have cost companies and all levels of government billions of dollars, collectively.

Many have used the bug as impetus to replace their otherwise adequately performing software with Y2K compliant code.
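For context, not every fix required a full rewrite: one common cheaper remediation was "date windowing," interpreting two-digit years relative to a pivot. A minimal C sketch (the pivot of 50 is an arbitrary choice for illustration):

    #include <stdio.h>

    /* Date windowing: two-digit years below the pivot are read as
       20xx, the rest as 19xx. The pivot of 50 is illustrative. */
    int expand_year(int yy) {
        return (yy < 50) ? 2000 + yy : 1900 + yy;
    }

    int main(void) {
        printf("%d\n", expand_year(99));  /* 1999 */
        printf("%d\n", expand_year(0));   /* 2000 */
        return 0;
    }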

My personal feeling is that while there will inevitably be some problems with date sensitive software, like payroll systems, there will not be widespread catastrophic problems - no deaths, etc.

And by the way, that sewage spill occurred during a Y2K test, but didn't result from a software failure. I think it was an engineer's error in not throwing a switch or something that caused the spill.
 
Snafu, you're correct about 2001, etc. But, like gun control, what do facts have to do with public excitement?

I read somewhere that back when Oatka started programming, the cost of a meg of memory was a million bucks, or ten million bucks--something like that. It's easy to see why they saved every bit of space they could; it made very good economic sense. With our 20/20 hindsight, and a meg of memory now costing around ten cents, it's a different story...
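As a back-of-the-envelope check (every figure below is a rough assumption, including the file size), here's a small C sketch of what trimming two bytes per date field was worth then versus now:

    #include <stdio.h>

    int main(void) {
        double records            = 1e6;          /* hypothetical master file */
        double bytes_saved        = 2.0 * records; /* 2 bytes per date field */
        double dollars_per_mb_old = 1e6;   /* "a million bucks" a meg */
        double dollars_per_mb_new = 0.10;  /* "around ten cents" a meg */

        printf("Then: ~$%.0f saved\n",
               bytes_saved / 1e6 * dollars_per_mb_old);  /* ~$2,000,000 */
        printf("Now:  ~$%.2f saved\n",
               bytes_saved / 1e6 * dollars_per_mb_new);  /* ~$0.20 */
        return 0;
    }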
 