New Year's Eve Computer Glitch Locks Out 3000 Hotel Guests

Guests at the Denver Tech Center Marriott Hotel were locked out of their rooms on New Year's Eve when a January 1 computer glitch disabled every key in the joint.

It was a busy New Year's Eve at the Denver Tech Center Marriott Hotel, where a crowd of revelers had gathered to celebrate the end of the old year and the beginning of the new. But things didn't really get hopping until just before the clock struck midnight, because that's when a computer glitch, apparently caused by the transition to the new year, smoked all the room keys, leaving the celebrating guests locked out of their rooms.

Predictably, the booze-fueled fun quickly turned to frustration and anger as the party-goers slowly came to realize that they weren't going to be able to throw up and pass out in the privacy of their own rooms. And throwing up and passing out there was, and some fisticuffs as well, which the police were called in to sort out. No serious injuries were reported.

Wasn't this supposed to happen about 12 years ago? It's not exactly hospitals going dark and planes falling out of the sky, but why would this sort of technological hiccup occur in 2012? Or, to put it another way, Y2K12?

The lockout ended at around 3 a.m., when the Marriott was finally able to bring its wayward computer under control. The hotel said all guests would be comped to make up for the inconvenience.

Source: MSNBC

My guess: The software was programmed to recognize the month and the day, but not the year, so it thought time had turned back to 364 days before the previous evening instead of one day after, and therefore concluded that none of the guests had checked in yet. Or something like that.
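
A minimal sketch of the failure mode this guess describes (hypothetical code, not the hotel's actual system): if the lock software compares only month and day, every December check-in suddenly looks like a future date the moment the clock rolls over to January 1.

from datetime import date

def guest_checked_in(checkin: date, now: date) -> bool:
    # BUG: the year is ignored, so just after midnight on Jan 1 a
    # Dec 30 check-in looks like it is ~11 months in the future.
    return (checkin.month, checkin.day) <= (now.month, now.day)

print(guest_checked_in(date(2011, 12, 30), date(2011, 12, 31)))  # True: key works
print(guest_checked_in(date(2011, 12, 30), date(2012, 1, 1)))    # False: locked out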

Dammit Mayans, why didn't you predict this! You could have saved 3000 people from being locked out!

Y2K twelve years late, how quaint.

imnotparanoid:
Dammit Mayans, why didn't you predict this! You could have saved 3000 people from being locked out!

Oh they did but they just didn't tell anyone.
Cause how would that be funny?

Y2K is like Duke Nukem Forever, in a sense.

It was several years past its deadline, and turned out to be a big disappointment.

You think this is abnormal? I've had those keys screw up before...

Hmmm... Don't think this is quite as drastic as the iPhone alarm problem last year!? Anyone else get affected by that? When all the alarms on the iPhone failed to go off for a couple of days? Meant loads of people were late for work on New Year's!!

That might be the greatest thing I've read all year.

...wait...

Damn, I feel sorry for that franchise. I worked the desk at a hotel through college, and I had to deal with a couple of sold out, drunken revelry kind of nights. This must have been chaos.

Andy Chalk:
Wasn't this supposed to happen about 12 years ago? It's not exactly hospitals going dark and planes falling out of the sky, but why would this sort of technological hiccup occur in 2012? Or, to put it another way, Y2K12?

I can explain why it might happen in 2012.

An easy fix for Y2K was to subtract a set number of years from the main system time (i.e. set the main clock back 12 years).
The user interface was then fixed so it added the years back when displaying/printing the date.

How many years were subtracted depended on the useful life left in the machine.

With the GFC (Global Financial Crisis), many of these systems were not replaced as originally planned.
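
A rough sketch of the workaround described above (hypothetical code; the 12-year offset is just the figure from the example): the system clock runs a fixed number of years behind, and the display layer adds them back.

from datetime import datetime

CLOCK_OFFSET_YEARS = 12  # years subtracted from the system clock by the Y2K fix

def display_date(clock_time: datetime) -> str:
    # The UI adds the offset back, so users never see the shifted date.
    corrected = clock_time.replace(year=clock_time.year + CLOCK_OFFSET_YEARS)
    return corrected.strftime("%d/%m/%Y")

# The clock thinks it is New Year's 2000; the screen shows 01/01/2012.
print(display_date(datetime(2000, 1, 1)))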

These kinds of stories always give me a big grin. Drunk people are silly.

TechNoFear:

Andy Chalk:
Wasn't this supposed to happen about 12 years ago? It's not exactly hospitals going dark and planes falling out of the sky, but why would this sort of technological hiccup occur in 2012? Or, to put it another way, Y2K12?

I can explain why it might happen in 2012.

An easy fix for Y2K was to subtract a set number of years from the main system time (i.e. set the main clock back 12 years).
The user interface was then fixed so it added the years back when displaying/printing the date.

How many years were subtracted depended on the useful life left in the machine.

With the GFC (Global Financial Crisis), many of these systems were not replaced as originally planned.

I think I remember some computers in the early aughties (specifically Windows) having a setting for interpreting the date. If the year was below 30 (the default which could be changed), it would interpret it as part of the 21st century.

So why would they turn it back 12 years instead of 30? Is it so the system doesn't glitch out when it encounters a file made in 1913?

BlackWidower:
I think I remember some computers in the early aughties (specifically Windows) having a setting for interpreting the date. If the year was below 30 (the default which could be changed), it would interpret it as part of the 21st century.

Windows 1 was released in 1985 and was just a User Interface on DOS.

I think you mean 'windowing', where you added code to determine the century based on the year (if the year <= 30 the century is 2000s, if > 30 it is 1900s, giving a 'window' of 100 years: 1931 to 2030).

I simplified for brevity, rather than going into the specifics of time calculations on industrial computer systems, RDCs and PLCs.
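
For concreteness, a hypothetical sketch of that windowing rule (with the pivot of 30 from the example above):

def expand_two_digit_year(yy: int, pivot: int = 30) -> int:
    # Map a two-digit year into the 100-year window 1931..2030.
    return 2000 + yy if yy <= pivot else 1900 + yy

print(expand_two_digit_year(12))  # 2012
print(expand_two_digit_year(85))  # 1985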

BlackWidower:
So why would they turn it back 12 years instead of 30? Is it so the system doesn't glitch out when it encounters a file made in 1913?

Ummm....because there were no computers to make files in 1913?

Industrial computer systems (IPCs) can also use UNIX time (as a 'standard' date/time format). UNIX time is the number of seconds since 01/01/1970, known as the 'epoch'. (UNIX time is very similar to UTC.)

The big problem (in the late 90s, getting ready for Y2K) was that when you tested two of exactly the same expensive industrial devices, one could be Y2K compliant and the other not. This was due to different batches of 'standard' computer chips used inside them (i.e. EPROMs).

Instead of replacing every chip (in every device), you changed the one chip that all the other chips got their time from, for simplicity called the 'system clock'.

When you took years off the system clock you, in effect, added years to the UNIX epoch (i.e. take 12 years off and it counts seconds starting at 01/01/1982 instead of 1970).

Generally the clock was not adjusted by more than was required to fix the problem without breaking anything else (i.e. so no file etc. got a negative UNIX time).

When I reverse engineered the data capture computers on GE's DASH series locomotives, I found those programmers had decided on adding 17 years, so I expect to see more stories of this type.
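
A hypothetical illustration of the epoch arithmetic described above (the 12-year offset is the example's figure, not any real system's): setting the clock back 12 years is the same as counting seconds from a 1982 epoch.

from datetime import datetime, timezone

UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
OFFSET_YEARS = 12

def reported_unix_time(real_time: datetime) -> int:
    # The clock was set back 12 years, so it reports seconds for a date
    # 12 years in the past -- effectively an epoch of 01/01/1982.
    clock_reading = real_time.replace(year=real_time.year - OFFSET_YEARS)
    return int((clock_reading - UNIX_EPOCH).total_seconds())

# New Year's 2012 is reported as New Year's 2000: still a valid,
# positive timestamp, just 12 years behind the real date.
print(reported_unix_time(datetime(2012, 1, 1, tzinfo=timezone.utc)))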

...so the Y2K bug was actually plausible. Not exactly knocking planes out of the sky, but it's a start.

After sobering up, those people are going to feel pretty stupid for starting a fight because they couldn't get into a Marriott room.

Mama bear: "Why the black eye, baby bear?"
Baby bear: "I couldn't get into my room in the Marriott so I got angry and punched a dapper penguin."
Mama bear: "RARRGHH!! THE MARRIOTT?"
*Mama bear eats Baby bear*

Remember people, every time you book into a Marriott, a bear eats their young. Don't book into a Marriott.

So what sort of mischief did the revellers at the Ramada Inn get up to?

I remember a few of my friends talking about being there. Apparently it was quite a party.

Sounds kind of like what should have happened with Y2K.

Steve the Pocket:
My guess: The software was programmed to recognize the month and the day, but not the year, so it thought time had turned back to 364 days before the previous evening instead of one day after, and therefore concluded that none of the guests had checked in yet. Or something like that.

I think that's a good guess. I worked at a hotel for 3 years and ALWAYS ALWAYS ALWAYS had these issues on New Year's, so much so that we left a note on each door telling the patrons to come down to the desk the next day to get a reissue if their cards failed.

Each hotel is operated somewhat independently by the franchisee. This wasn't a problem in other Marriott hotels (I know because I was in one on New Year's Eve). Sounds like somebody at that hotel screwed up huge and may have some answering to do to corporate HQ.

TechNoFear:

BlackWidower:
I think I remember some computers in the early aughties (specifically Windows) having a setting for interpreting the date. If the year was below 30 (the default which could be changed), it would interpret it as part of the 21st century.

Windows 1 was released in 1985 and was just a User Interface on DOS.

I think you mean 'windowing', where you added code to determine the century based on the year (if the year <= 30 the century is 2000s, if > 30 it is 1900s, giving a 'window' of 100 years: 1931 to 2030).

I simplified for brevity, rather than going into the specifics of time calculations on industrial computer systems, RDCs and PLCs.

BlackWidower:
So why would they turn it back 12 years instead of 30? Is it so the system doesn't glitch out when it encounters a file made in 1913?

Ummm....because there were no computers to make files in 1913?

Industrial computer systems (IPCs) can also use UNIX time (as a 'standard' date/time format). UNIX time is the number of seconds since 01/01/1970, known as the 'epoch'. (UNIX time is very similar to UTC.)

The big problem (in the late 90s, getting ready for Y2K) was that when you tested two of exactly the same expensive industrial devices, one could be Y2K compliant and the other not. This was due to different batches of 'standard' computer chips used inside them (i.e. EPROMs).

Instead of replacing every chip (in every device), you changed the one chip that all the other chips got their time from, for simplicity called the 'system clock'.

When you took years off the system clock you, in effect, added years to the UNIX epoch (i.e. take 12 years off and it counts seconds starting at 01/01/1982 instead of 1970).

Generally the clock was not adjusted by more than was required to fix the problem without breaking anything else (i.e. so no file etc. got a negative UNIX time).

When I reverse engineered the data capture computers on GE's DASH series locomotives, I found those programmers had decided on adding 17 years, so I expect to see more stories of this type.

Wow, you're pretty thick, aren't you? When I was referring to Windows, I literally meant Microsoft Windows. The Microsoft Windows operating system back in the early aughties, likely Windows 2000 or ME, or maybe even Windows 98, had the feature I was referring to. Now it probably is called windowing, but I wasn't talking about that.

And as for files made back in 1913: I know none existed; that was my point. It was sarcasm, apparently something that was lost on you. Why turn it back only 12 years when there were no files from 1913 or 1929? By 2030 they would have upgraded all their systems so it would be a non-issue, one would hope.

So you're saying the reason they only turned it back 12 years was because it operated on Unix time. There we go, that's good to know. That makes sense to me. However, I didn't know systems running on Unix time ever had to worry about the Y2K bug or anything similar until 2038.

Though it's a moot point now that we're switching to 64-bit...slowly...eventually...
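
For anyone curious where that 2038 figure comes from, a quick demo (standard Python; the limit applies to systems storing UNIX time in a signed 32-bit integer):

from datetime import datetime, timezone

MAX_INT32 = 2**31 - 1  # largest value a signed 32-bit seconds counter can hold
print(datetime.fromtimestamp(MAX_INT32, timezone.utc))
# 2038-01-19 03:14:07+00:00 -- one second later a 32-bit time_t wraps
# negative, landing back in December 1901.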

Why did it take them so long to upgrade their systems? That was only supposed to be a stop-gap measure, wasn't it? Didn't they know that!?

 
