Replication with an EqualLogic SAN, Part 1

 

Behind every great virtualized infrastructure is a great SAN to serve everything up.  I’ve had the opportunity to work with the Dell/EqualLogic iSCSI array for a while now, taking advantage of all of the benefits that the iSCSI-based SAN array offers.  One feature that I haven’t been able to use is the built-in replication feature.  Why?  I only had one array, and I didn’t have an offsite location to replicate to.

I suppose the real “part 1” of my replication project was selling the idea to the Management Team.  When it came to protecting our data and the systems that help generate that data, it didn’t take long for them to realize it wasn’t a matter of what we could afford, but how much we could afford to lose.  Having a building less than a mile away burn to the ground also helped the proposal.  On to the fun part: figuring out how to make all of this stuff work.

Of the many forms of replication out there, the most obvious one for me to start with was native SAN-to-SAN replication.  Why?  Well, it’s built right into the EqualLogic PS arrays, with no additional components to purchase and no license keys or fees to unlock features.  Other solutions exist, but it was best for me to start with the one I already had.

For companies with multiple sites, replication using EqualLogic arrays seems pretty straightforward.  For a company with nothing more than a single site, a few more steps have to fall into place before replication can begin.

 

Decision:  Colocation or hosting provider

One of the first decisions that had to be made was whether we wanted our data replicated to a colocation facility (CoLo) with equipment that we owned and controlled, or to a hosting provider that could offer native PS array space and replication abilities.  Most hosting providers charge by metering the data replicated, in one form or another.  Accurately estimating your replication costs assumes you have a really good understanding of how much data will be replicated, which is difficult to know until you actually start replicating.  The pricing models of these hosting providers reminded me too much of a cab fare: never knowing what you are going to pay until you get the big bill at the end of the ride.  A CoLo with equipment that we owned fit our current and future objectives much better.  We wanted fixed costs, and the ability to eventually do some hosting of critical services at the CoLo (web, FTP, mail relay, etc.), so it was an easy decision for us.
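Just to illustrate the cab-fare problem, here is a quick back-of-the-envelope comparison.  Every rate and volume in it is a made-up placeholder, not a real quote:

```python
# Hypothetical comparison: metered hosting vs. a fixed-cost CoLo.
# All rates and volumes below are placeholders, not real quotes.

metered_rate_per_gb = 0.50     # $/GB replicated (hypothetical)
colo_fixed_monthly = 1200.00   # $/month rack space, power, circuit (hypothetical)

for monthly_gb in (500, 2000, 8000):   # guesses at monthly replication volume
    metered_cost = monthly_gb * metered_rate_per_gb
    print(f"{monthly_gb:>5} GB/month: metered ${metered_cost:>8,.2f} vs. colo ${colo_fixed_monthly:,.2f}")
```

The point isn’t the specific numbers; it’s that the metered column is a function of something you can’t know in advance, while the CoLo column isn’t.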

Our decision was to go with a CoLo facility located in the Westin Building in downtown Seattle.  Commonly known as the Seattle Internet Exchange (SIX), this is an impressive facility, not only in its physical infrastructure, but in how it provides peered interconnects directly from one ISP to another.  Our ISP uses this facility, so it worked out well to have our CoLo there as well.

 

Decision:  Bandwidth

Bandwidth requirements for our replication were, and still are, unknown, but I knew our bonded T1s probably weren’t going to be enough, so I started exploring options for higher-speed access.  The first thing to check was whether we qualified for Metro-E or “Ethernet over Copper” (award winner for the dumbest name ever).  Metro-E removes T-carrier lines and their proprietary signaling from the equation, providing internet access or point-to-point connections at Layer 2 instead of Layer 3.  We were not close enough to the carrier’s central office to get adequate bandwidth, and even if we were, it probably wouldn’t scale up to our future needs.

Enter QMOE, or Qwest Metro Optical Ethernet.  This solution feeds Layer 2 Ethernet to our building via fiber, offering high bandwidth and low latency that can be scaled easily.

Our first foray using QMOE is running a 30 Mbps point-to-point feed to our CoLo, uplinked to the Internet.  If we need more later, there is no need to add or change equipment.  Just have them turn up the dial, and they bill you accordingly.
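For a rough sense of what 30 Mbps buys, here is a quick sketch.  The nightly delta sizes and the 80% efficiency figure are assumptions, not measurements:

```python
# Back-of-the-envelope: how long would a nightly replication delta take
# on a 30 Mbps circuit?  Delta sizes and efficiency are assumptions.

link_mbps = 30                   # provisioned circuit speed
efficiency = 0.80                # assume ~80% usable after protocol overhead
usable_bytes_per_sec = link_mbps * 1_000_000 / 8 * efficiency

for delta_gb in (10, 50, 150):   # hypothetical nightly change volumes
    seconds = delta_gb * 1_000_000_000 / usable_bytes_per_sec
    print(f"{delta_gb:>4} GB delta: ~{seconds / 3600:.1f} hours")
```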

 

Decision:  Topology

Topology planning has been interesting, to say the least.  The best decision here depends on the use case and, let’s not forget, what’s left in the budget.

Two options immediately presented themselves:

1.  Replication data from our internal SAN would be routed (Layer 3) to the SAN at the CoLo.

2.  Replication data from our internal SAN would travel by way of a VLAN (Layer 2) to the SAN at the CoLo.

If my only need was to send replication data to the CoLo, I could take advantage of that Layer 2 connection and send the replication data directly, without it being routed.  This would mean bypassing any routers/firewalls in place and running to the CoLo on its own VLAN.

The QMOE network is built on Cisco equipment, so in order to extend any VLANs from the CoLo to the primary facility, you must have Cisco switches that support their VLAN Trunking Protocol (VTP).  I don’t have the proper equipment for that right now.

In my case, here is a very simplified illustration of how the two topologies would look:

Routed Topology

[Image: routed topology diagram]

 

Topology using VLANs

[Image: VLAN topology diagram]

Routing the traffic may introduce more overhead and reduce effective throughput.  This is where a WAN optimization solution could come into play.  These solutions (Silver Peak, Riverbed, etc.) appear to be extremely good at improving effective throughput across many types of WAN connections, though they must sit at the correct spot in the path to the destination.  The units are often priced on bandwidth speed, and while they are very effective, they are also quite an investment.  They work at Layer 3, and must sit in between the source and a router at both ends of the communication path; something that wouldn’t exist on a Metro-E circuit where VLANs were used to transmit replicated data.
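A quick sketch of why routed latency matters: a single TCP stream tops out at roughly its window size divided by the round-trip time, no matter how fast the circuit is.  The window size and RTT figures below are illustrative assumptions:

```python
# A single TCP stream is capped at window_size / round_trip_time,
# regardless of the circuit's raw speed.  Figures are assumptions.

window_bytes = 64 * 1024     # classic 64 KB TCP window (no window scaling)

for rtt_ms in (2, 10, 40):   # hypothetical RTTs: local VLAN, metro, routed WAN
    throughput_mbps = (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000
    print(f"RTT {rtt_ms:>3} ms: max ~{throughput_mbps:.1f} Mbps per TCP stream")
```

This is the gap WAN optimizers exploit; keeping latency low in the first place narrows it.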

The result is that, for right now, I have chosen a routed arrangement with no WAN optimization.  This does not differ much from a traditional WAN circuit, other than that my latencies should be much better.  If our needs are not sufficiently met, the next step would be to invest in a couple of Cisco switches, then send replication data over its own VLAN to the CoLo, similar to the illustration above.

 

The equipment

My original SAN array is an EqualLogic PS5000e connected to a couple of Dell PowerConnect 5424 switches.  My new equipment closely mirrors this, but is slightly better: an EqualLogic PS6000e and two PowerConnect 6224 switches.  Since the new gear will scale a bit better, I’ve decided to swap the existing array and switches for the new equipment.

 

Some Lessons learned so far

If you are changing ISPs, and your old ISP has authoritative control of your DNS zone files, make sure your new ISP has the zone file EXACTLY the way you need it.  Then confirm it one more time.  Spelling errors and omissions in DNS zone files don’t work out very well, especially when you factor in the time it takes for corrections to propagate through the net.  (Usually up to 72 hours, but it can feel like a lifetime when your customers can’t get to your website.)
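One way to do that confirmation is to script it.  Below is a minimal sketch that compares a few records on the old and new nameservers before cutover; it assumes the third-party dnspython package, and the server addresses and record names are placeholders:

```python
# Compare a handful of DNS records on the old and new nameservers
# before cutover.  Requires dnspython (pip install dnspython).
# Server IPs and hostnames below are placeholders.

import dns.resolver

OLD_NS = "198.51.100.1"   # placeholder: old ISP's nameserver
NEW_NS = "203.0.113.1"    # placeholder: new ISP's nameserver

def lookup(server, name, rtype):
    """Query one specific nameserver and return its answers, sorted."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [server]
    try:
        return sorted(str(rr) for rr in resolver.resolve(name, rtype))
    except Exception as exc:
        return [f"error: {exc}"]

for name, rtype in [("example.com", "MX"), ("www.example.com", "A")]:
    old, new = lookup(OLD_NS, name, rtype), lookup(NEW_NS, name, rtype)
    print(f"{'OK' if old == new else 'MISMATCH'}: {name} {rtype} old={old} new={new}")
```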

If you are going to go with a QMOE or Metro-E circuit, be mindful that you might have to force the external interface on your outermost equipment (in our case the firewall/router, but it could be a managed switch as well) to negotiate at 100 Mbps full duplex.  Auto-negotiation apparently doesn’t work too well on many Metro-E implementations, and can cause fragmentation that will reduce your effective throughput by quite a bit.  This is exactly what we saw.  Fortunately, it was an easy fix.
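If you suspect the circuit is fragmenting, one rough way to check is to ping across it with the don’t-fragment bit set at increasing packet sizes and see where things start failing.  A minimal sketch, assuming the Linux ping flags and a placeholder target on the far side:

```python
# Probe for fragmentation: ping with the don't-fragment bit set at
# increasing payload sizes.  Uses Linux ping flags; the target is a
# placeholder for a device on the far side of the circuit.

import subprocess

TARGET = "192.0.2.10"  # placeholder

for payload in (1272, 1372, 1472):  # payloads approaching a 1500-byte MTU
    result = subprocess.run(
        ["ping", "-c", "1", "-M", "do", "-s", str(payload), TARGET],
        capture_output=True, text=True,
    )
    status = "ok" if result.returncode == 0 else "dropped/needs fragmentation"
    print(f"{payload + 28:>5}-byte packet: {status}")  # +28 = ICMP + IP headers
```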

 

Stay tuned for what’s next…

Using OneNote in IT

 

It’s hard to believe that, as an IT administrator, one of my favorite applications is one of the least technical.  Microsoft created an absolutely stellar application when they created OneNote.  If you haven’t used it, you should.

Most IT administrators have high expectations of themselves.  Somehow we expect to remember pretty much everything: deployment planning, research, application-specific installation steps and issues, information gathered for troubleshooting, documentation of as-built installations.  You might have information that you work with every day and think, “how could I ever forget that?” (you will), along with that obscure, required setting on your old phone system that hasn’t been looked at in years.

The problem is that nobody can remember everything. 

After years of using my share of spiral binders, backs of printouts, and Post-It notes to gather and manage systems and technologies, I’ve realized a few things.

1.  I can’t read my own writing.

2.  I never wrote enough down for the information to be valuable.

3.  What I can’t fit on one physical page, I squeeze onto another page that makes no sense at all.

4.  The more I had to do, the more I tried (and failed) to figure out a way to file it.

5.  These notes eventually became meaningless, even though I knew I kept them for a reason.  I just couldn’t remember why.

Do you want to make a huge change in how you work?   Read on.

OneNote was first adopted by our Sales team several years ago, and while I knew what it was, I never bothered to use it for real IT projects until late in 2007, when a colleague of mine (thanks, Glenn, if you are reading) suggested that it was working well for him and his IT needs.  Ever since then, I’ve wondered how I ever worked without it.

If you aren’t familiar with OneNote, there isn’t much to understand.  It’s an electronic notebook.

[Image: the OneNote interface]

It’s arranged just as you’d expect a real notebook to be.  The left side represents notebooks, the tabs across the top represent sections or earmarks, and the right side represents the pages in a notebook.  It’s that easy.  Just like its physical counterpart, its free-form formatting allows you to place objects anywhere on a page (goodbye, MS Word).

What has become clear since my experiment with OneNote began is how well it tackles every single need I have in gathering information and mining that data after the fact.  Here are some examples.

Long-term projects and research

What better time to try out a new way of working than on one of the biggest projects I’ve had to tackle in years, right?  Virtualizing my infrastructure was a huge undertaking, and I had what seemed like an infinite amount of information to learn in a very short period of time, across all sorts of subject matter.  In a notebook called “Virtualization” I had sections that narrowed things down to ESX, the SAN array, blades, switchgear, UPS, etc.  Each of those sections had pages (at least a few dozen for the ESX section, as there was a lot to tackle) covering the specific topics I needed to learn about or keep for reference.  Links, screen captures, etc.  I dumped everything in there, including my deployment steps before, during, and after.
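To make the notebook/section/page hierarchy concrete, here is a toy model of roughly how mine ended up organized (an abbreviated approximation, not my actual contents):

```python
# Toy model of the OneNote hierarchy: notebooks hold sections,
# sections hold free-form pages.  An abbreviated approximation.

notebooks = {
    "Virtualization": {                                              # notebook
        "ESX":       ["Deployment steps", "Networking", "Storage"],  # section: pages
        "SAN array": ["Firmware notes", "Replication"],
    },
    "Apps": {
        "E2007 Migration": ["Cutover plan", "Postfix"],
    },
}

for notebook, sections in notebooks.items():
    for section, pages in sections.items():
        print(f"{notebook} > {section}: {', '.join(pages)}")
```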

 

Procedures

Our Linux code-compiling machines have very specific package installations and settings that need to be in place before deployment.  OneNote works great for this.  The no-brainer checkboxes offer nice clarity.

[Image: package-installation checklist]

If you maintain different flavors of Unix or various distributions of Linux, you know how much the syntax can vary.  OneNote helps keep your sanity.  With so many Windows products going the way of PowerShell, you’d better have your command-line syntax down for that too.

This has also worked well with backend installations.  My installations of VMware, SharePoint, Exchange, etc. have all been documented this way.  It takes just a bit longer, but is invaluable later on.  Below is a capture of part of my cutover plan from Exchange 2003 to Exchange 2007.

[Image: Exchange 2003 to Exchange 2007 cutover plan excerpt]

Migrations and post-migration outstanding issues

After big migrations, you have to be on your toes to address issues that are difficult to predict.  OneNote has allowed me to use a simple ISSUE/FIX approach.  So, in an “Apps” notebook, under an “E2007 Migration” section, I might have a page called “Postfix” that looks something like this.

[Image: ISSUE/FIX page example]

You can label these pages “Outstanding issues” or, as I did for my ESX 3.5 to vSphere migration, “Postfix” pages.

[Image: “Postfix” pages from the vSphere migration]

As-builts

Those in the engineering/architectural world are quite familiar with as-built drawings: drawings that reflect how things were really built.  Many times in IT, deployment plans and documentation never go further than the day you deploy.  OneNote allows for an easy way to turn that deployment plan into a living copy, or as-built configuration, of the product you just deployed.  Configurations are as dynamic as the technologies that power them.  It’s best to know what sort of monster you created, and how to recreate it if you need to.

 

Daily issues (fire fighting)

Emergencies, impediments, fires, or whatever you’d like to call them come up all the time.  I’ve found OneNote to be most helpful in two specific areas on this type of task.  I use it as a quick way to gather data on an issue that I can look at later (copying and pasting screenshots and URLs into OneNote), and for comparing the current state of a system against past configurations.  Both help me solve problems more quickly.

Searching text in bitmapped screen captures

One of the really interesting things about OneNote is that you can paste a screen capture of, say, a dialog box into a notebook, and when you later search for a keyword, the results will include those bitmaps!  Below is one of the search results OneNote pulled up when I searched for “KDC.”  This was a screen capture sitting in OneNote.  Neat.

[Image: search result matching text inside a pasted screen capture]
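OneNote does this indexing natively, but just to illustrate the idea, here is a rough approximation of searching for text inside screenshots using OCR.  It assumes the third-party pytesseract and Pillow packages (plus a local Tesseract install), and the folder name is a placeholder:

```python
# Approximate OneNote's bitmap search with OCR: scan every PNG in a
# folder and report which ones contain a keyword.  Assumes pytesseract
# and Pillow are installed; the folder name is a placeholder.

from pathlib import Path

import pytesseract     # pip install pytesseract (needs Tesseract itself)
from PIL import Image  # pip install Pillow

def search_screenshots(folder, keyword):
    """OCR each PNG in the folder and print those containing the keyword."""
    for png in sorted(Path(folder).glob("*.png")):
        text = pytesseract.image_to_string(Image.open(png))
        if keyword.lower() in text.lower():
            print(f"match: {png.name}")

search_screenshots("screenshots", "KDC")  # placeholder folder
```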

 

Goodbye Browser Bookmarks

How much time have you spent organizing your web browser bookmarks or favorites, only to never look at them again, or to wonder why you bookmarked them in the first place?  It’s an exercise in futility.  No more!  Toss them all away.  Paste those links into the applicable locations in OneNote (wherever the subject matter fits), add a brief little description on top, and you can always find them later by searching.

 

Summary

I won’t ever go without OneNote on projects large or small again.  It sits right next to my email as my most-used application.  OneNote users tend to be a loyal bunch, and after a few years of using it, I can see why.  At about $80 retail, you can’t go wrong.  And, lucky for you, it will be included in all versions of Office 2010.

Additional Links

New features coming in OneNote 2010
http://blogs.msdn.com/descapa/archive/2009/07/15/overview-of-onenote-2010-what-s-new-for-you.aspx

Using OneNote with SharePoint
http://blogs.msdn.com/mcsnoiwb/archive/2008/12/03/onenote-and-sharepoint-the-basics.aspx 

Interesting tips and tricks with OneNote
http://blogs.msdn.com/onenotetips/