Sunday March 24th - Saturday March 30th 2024 - Work trip to Las Vegas - many miles of walking but no ‘walks’

What a week this has been.  At work I’ve been one of the more vocal team members advocating for a cautious and measured journey into the cloud-hosted future.  For over a year it has been a rollercoaster: hearing that we were probably going to do a huge forklift-style cloud migration, then a few weeks later hearing we’d just stay at our current awful managed provider, and then a few weeks later that we would probably buy and host our own server farm.  This decision cycle repeated a number of times and I did my best to be as blasé as I could about it since it seemed like the process would go on forever.  Then, with surprising rapidity, the decision was made and a lot of money was spent buying our own equipment, which we are going to operate out of a Las Vegas-based data center that the West Coast half of the pre-merger company has been using for a few years.

This week the parts finally all shipped (there were a lot of delays from HP) and the trip to put it all in was finally on the calendar.  My co-worker and I would do everything except the networking equipment ourselves.  26 servers in 4 different groups, each connected with 2 fiber optic network cables, 1 copper network cable and 2 power cables.  And EVERY cable got labeled with at least a serial number on both ends, and all the fiber lines got a much more detailed label spelling out the cable’s purpose, starting point, ending point, etc.  That felt like 1 million cables to unwrap and 2 million tiny sticky labels to type out and print, all with a stiff air conditioning breeze at about 67 degrees.  Then they all got carefully snaked through tight cable channels with hair dryer hot air blowing in your face.

This was my first exposure to a really big-time data center.  I’ve been in numerous data centers that were room sized, sometimes really large room sized, but those were all relatively small sections of larger buildings and a break from the noise and the cold breeze was just a handful of steps away.  That didn’t really prep me for the reality of these airport-sized data center farms.  The one we were in had 6 ‘sectors’ and each sector is seriously about the size of a medium sized BJ’s or Sam’s Club warehouse store.  Maybe not Costco sized but pretty close.  And there are multiple of these giant conglomerations of sectors scattered around one part of Vegas.

The list of things that make this a fascinating but very challenging work environment is very long:
  • The vast size of the facility means that leaving the data center floor to go to the bathroom or have even a drop of water to drink is a 3-5 minute walk.
  • The hot/cold aisle system means that the fronts of all the server racks are in a very cool, carefully managed, temperature and humidity controlled environment.  It’s very noisy and we had only a very small folding table to work on.  Everything (racks, bins of cables, tools, supplies, our personal belongings etc) had to be secured each time we left the cage for more than a few minutes.  The rear of the racks exhausts into what is basically a horrifying metal chimney that is very cramped, hot and noisy.  And by hot I mean hot, probably close to 100 degrees.
  • Water and food can only be consumed or left in 2 break rooms or, if you have reserved one, your tiny windowless office which is a solid 5 minute brisk walk from your cage.
  • Items (big and small) that arrive at the receiving area have to be COMPLETELY unpacked and all cardboard disposed of.  Absolutely no cardboard (or foam etc) is allowed out of the shipping area.  You also plug everything in and briefly power it on (a smoke test) to reduce the chance of something unexpected happening in your cage.  Our project involved about 3 pallets of stuff to unpack plus quite a few smaller items.  We made many trips to the receiving area, and each trip meant borrowing a cart that had to be returned promptly.
  • The hot area in particular is so cramped that kneeling on the hard floor was the only practical way to stick your head into the hot racks and get to the wires you need to work on.  There is no flat counter or surface other than the floor, so good luck looking at or taking notes on your laptop as you install wires.  This gets old in about 3 minutes and by the end of the 2nd day my legs were cramping, back aching etc etc.  Thankfully, my coworker took the brunt of the hot zone cabling work.  He had very high expectations for a picture-perfect rack with pleasing, geometrically perfect rows of cables.  I literally don’t think I can do that the way he would want and I certainly don’t feel a sufficient passion for it to overcome my hatred for being in the hot box.  So I did almost all of the cable labeling, which, while not physically demanding, is very tedious.
From Monday to Thursday we had 10-13 hour days.  Honestly, as late as Thursday afternoon, it seemed like we just weren’t going to get it all done.  There were various snags each day.  No individual issue was huge, but this really should have been a 2 trip project (maybe 4 days each visit) and we felt pressured into doing it in 1 week.  Thankfully we both travelled on Sunday (at the start) and Saturday (at the end) or we wouldn’t have made it.  Friday I actually got up at 5 am to join a Zoom call with one of our professional services people to get a jump on the final long push to get everything done before we wrapped up at the end of the week.

It’s surprising to me how many people in IT, even in roles that seem similar to mine, use a completely different vocabulary to describe things.  Lots of words in IT take on new meanings, some of which are almost exclusive to a data center…
  • Machine or system means different things based on context.  If you are setting up 100 new laptops for users, you might call them machines or systems.  If it’s servers and there are a lot of them, they might also be called machines or systems.
  • Host usually refers to a machine (almost always a server class system) that runs or ‘hosts’ virtual machines or other forms of partitioned, containerized workloads separate from the host’s own processes.
  • Racks are usually 19 or 23 inches wide (even in metric countries I think!) and have either 2 (less common) or 4 posts to mount various rails, sliders and shelves on.  It’s basically like a very rigidly standardized closet with rows of mounting holes (in a very specific repeating pattern) at each corner (or in the center of the rack from front to back in the case of 2 post racks).  Most racks are marked and planned out in units of space called, strangely enough, ‘units’ or just U.  The smallest servers or network devices are 1U high.  Some really small or really high density systems cram 2 tiny units side by side into a 1U tall enclosure, but 1U is only about 1.75 inches vertically so calling that space limited is an understatement.  A 1U server is about the width and height of a large pizza box (but potentially about 50% longer than wide) and that particular class of server is often called a pizza box server.  Writing all this out it sounds very silly but that’s how industry slang probably sounds to everybody who isn’t in that industry.
  • Runs, or a run, are passageways (either physically channeled or just designated sections of empty space in a rack or between posts) where cables come together, so that a rack with 100 cables in it doesn’t have 100 cables spider webbing every which way.  Instead there are a number of ‘runs’ where (most often) the cables from each server come over horizontally to join either provided or improvised areas where multiple cables of the same type can ‘run’ neatly together to wherever they are going, usually vertically.  Typically most or all of the servers in a given rack connect to one or 2 network switches, hopefully in the same rack, but often several feet away vertically.  In IT speak a run can be a verb (‘run those cables along the left side’) or a noun (‘count the number of cables in the run’).
There are probably a dozen or more frequently used terms related to putting equipment into data centers.  One of my favorites is ‘rack and stack’, which encompasses potentially all the steps necessary to get a server from its shipping container into a specific vertical position (measured in U’s) in a given rack.  While the size, shape and repeating pattern of the mounting holes on the racks is very standardized, the hardware that goes into the rack (usually a rail kit goes in empty first, then a server is slid into it) varies tremendously from vendor to vendor.  This is a source of much frustration and sometimes pinched or even nicked fingers or wrists, since rack equipment invariably comes with absolutely no instructions.  So it’s either a long detour to try and find documentation online or, more likely, a few frustrating minutes with each of the first few examples of each kind of rail involved in the project.
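To make the U bookkeeping a little more concrete for the non-IT readers: a rack position is really just ‘which U the bottom of the box sits in’ plus ‘how many U tall the box is’.  Here’s a tiny toy Python sketch of that arithmetic (nothing we actually used on site, and all the names and sizes in it are made up):

    # Toy sketch of rack-unit arithmetic; device names and rack size are hypothetical.
    U_HEIGHT_INCHES = 1.75  # 1U is 1.75 inches of vertical rack space

    def place(devices, rack_height_u=42):
        """Give each (name, height_in_u) pair the next free U slot, counting up from U1 at the bottom."""
        layout, next_u = [], 1
        for name, height_u in devices:
            if next_u + height_u - 1 > rack_height_u:
                raise ValueError(f"{name} does not fit in a {rack_height_u}U rack")
            layout.append((name, next_u, next_u + height_u - 1))
            next_u += height_u
        return layout

    # Example: one 1U switch and two 2U servers in a single 42U rack
    for name, bottom, top in place([("switch-a", 1), ("server-01", 2), ("server-02", 2)]):
        print(f"{name}: U{bottom}-U{top} ({(top - bottom + 1) * U_HEIGHT_INCHES:.2f} inches tall)")

That’s the whole idea behind the labels on the side of the rack: every device’s home is just a U number.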

The best aspects of the week were working, dining and commuting with my San Diego co-worker and, as of a few weeks ago, manager.  We think very similarly and seem to understand each other very well.  Most of our meals were really excellent and covered a huge range of cuisines.  Just for lunches we had Chinese, German, a hot dog/taco shop, etc.  Dinners included a speakeasy-themed eatery (very fun AND good food), Mexican, and a few very ordinary fast food and hotel dinners for various reasons.

I’m on my flight home now.  Window seat, Comfort+ and no middle seat passenger so, for a 4+ hour flight, this is pretty great.  I will be so happy to get home.  I’m so pleased we got the enormous project done at the data center.  I can’t say I look forward to returning.  I will be back, probably several times over the next few years, but unless I’m just there to do a half day of light work, big data centers seem about as uncomfortable, unpleasant and inhuman as any indoor space could possibly be.  This is in spite of a fantastic staff, immaculately clean conditions and top notch maintenance.  The 2 things that most surprised me about the entire data center experience were 1) just how unpleasant and unrelenting the physical environment was and 2) how much style the data center has.  What could have been a barren gray box is actually a really visually striking building where everything from the lobby to the bathrooms to the endless hallways has a very intentional cool techno look.  It kills me that everybody was expressly and repeatedly told not to photograph even the most innocuous pieces of the site.  So I have absolutely no pictures to share except a few from the escalator at the Las Vegas airport.  I’m pretty sure you already know what a hotel room and a RAV4 look like, and those were pretty much the only things left that I could photograph.


