Solarize your Home Lab, and your Home

A notorious trait of vSphere Home Labs is that they start out simple and modest, then evolve into something resembling a small Data Center. As the Home Lab grows in size and sophistication, elements such as power, cooling, and noise eventually become a problem. IT folks are typically technology geeks at heart, so the first logical step in addressing a problem introduced by one technology is to… well, tackle it with another technology. This post isn’t necessarily about my Home Lab, but about how I’ve chosen to power the home where the lab runs: with a residential solar system. A few have asked me to provide some information on the setup, so here it is.

My interest in solar goes back as far as I can remember. As a young boy I watched my father build four 4’x8′ panels filled with copper tubing to supplement the natural gas furnace providing hot water. It turns out that wasn’t his first adventure with solar. Growing up on the plains of the Midwest during the heart of the Great Depression in the 1930s, he cobbled together a crude solar system so his family could have hot water for an outdoor shower. I marveled at his ingenuity.

Basics of a modern residential solar system
Residential solar systems typically consist of a collection of panels housing Photovoltaic (PV) cells, connected in series, generating DC current. Each panel has a wattage rating. My panels, 20 in total, are made by Itek Energy and rated at 280 Watts each. Multiplied by 20, this gives the potential to generate 5.6kW in direct sunlight with optimal orientation. PV solar is inherently DC, so this needs to be converted to AC via an inverter. Converting DC to AC, or vice versa, usually carries some efficiency cost. Considering that most electronic devices are DC and have a transformer of their own, this is a humorous reminder that Thomas Edison and Nikola Tesla are still battling it out after all these years.
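For a quick back-of-the-envelope check, the arithmetic above can be sketched in a few lines of Python. The panel count, panel rating, and observed ~5.2kW peak are figures from this post; nothing here is vendor data.

```python
# Array DC rating from the figures in the post.
PANEL_WATTS = 280   # Itek panel rating
PANEL_COUNT = 20

dc_rating_w = PANEL_WATTS * PANEL_COUNT
print(f"DC rating: {dc_rating_w / 1000:.1f} kW")  # DC rating: 5.6 kW

# Observed AC peak vs. DC rating gives a rough feel for
# inverter and atmospheric losses.
observed_peak_w = 5200
print(f"Peak AC / DC rating: {observed_peak_w / dc_rating_w:.0%}")  # 93%
```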

Typically solar panels are mounted on a generally south-facing side to collect as much direct sunlight as possible. Ideally, the panels would always be perpendicular to the sun’s rays. With fixed mounting on a roof, this just isn’t going to happen, but fixed mounting in non-ideal situations can still yield good results. For instance, even though my roof has a far from perfect orientation, the results are impressive. An azimuth of 180 degrees would be true South, and is considered ideal in the northern hemisphere. My azimuth is 250 degrees (on a 6:12 pitch roof), meaning that it faces 70 degrees westward of ideal. However, my 5.6kW solar system peaks out at around 5.2kW, and catches more afternoon light than morning light, which is often an advantage in areas with morning fog or marine air. This less than perfect orientation is estimated to cause only a 10% reduction in the total production output of the panels over the course of a year. The 400 Watt shortage from its rated maximum is the result of loss from the inverter converting to AC, as well as some loss to atmospheric conditions.

Sizing of residential solar systems is often the result of the following design requirements and constraints:

  1.) How many panels can fit on a roof with good orientation?
  2.) What is your average electricity usage per day (in kWh)?
  3.) What State incentives would make for the ideal size of a system?

The good news is that sizing a system and estimating its capabilities is far more sophisticated than guesswork. Any qualified installer will be able to run the numbers for your arrangement and give you full ROI estimates. The National Renewable Energy Laboratory (NREL) has a site that allows you to plug in all of your variables, and also factors in local weather data to provide a detailed analysis of a proposed environment.
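As a rough illustration of the kind of estimate such tools produce, here is a simplified version of the standard sizing formula: array size × average peak sun hours × a derate factor for system losses. The sun-hour and derate figures below are assumptions for illustration only; a real tool substitutes actual local weather data.

```python
# Simplified production estimate: kWh/day ≈ kW(DC) × peak sun hours × derate.
system_kw_dc = 5.6       # array size from the post
peak_sun_hours = 3.8     # ASSUMED yearly average for a cloudy climate
derate = 0.80            # ASSUMED combined inverter/wiring/soiling losses

kwh_per_day = system_kw_dc * peak_sun_hours * derate
kwh_per_year = kwh_per_day * 365
print(f"~{kwh_per_day:.1f} kWh/day, ~{kwh_per_year:.0f} kWh/year")
```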

Grid tied versus Battery backed
Many, if not most residential solar installations these days are grid tied systems. This means that the solar supplements your power from the grid: the needs of the home consume the power from the panels, and if the panels generate an overabundance of power, it is fed back into the grid, effectively billing your power provider. This is called "net metering" and provides an experience that is seamless to the consumer. One would want to be a bit careful not to oversize grid tied systems, because some power providers may have caps on net metering and how much they pay you for electricity generated.
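The settlement math behind net metering is simple. Here is a minimal sketch, with made-up consumption, export, and rate figures, assuming the utility credits exported kWh at the same retail rate it charges.

```python
# Net metering: the meter effectively runs backwards for exported energy.
rate_per_kwh = 0.10    # assumed retail rate, $/kWh
consumed_kwh = 900     # assumed grid draw for the month
exported_kwh = 350     # assumed surplus solar fed back to the grid

net_kwh = consumed_kwh - exported_kwh
bill = net_kwh * rate_per_kwh
print(f"Net usage: {net_kwh} kWh, bill: ${bill:.2f}")  # Net usage: 550 kWh, bill: $55.00
```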

A residential solar system may also be battery backed. The benefit of this, of course, is full independence from the grid. However, this introduces capital and operational costs not associated with grid tied systems. The system may have to be sized larger to ensure adequate power on those days when the panels can’t generate as much electricity as you hoped for. Battery backed systems may or may not be eligible for some of the subsidies in your area. Grid tied systems avoid the need for this infrastructure, and in many ways the grid can be thought of as the battery backup for your home when your solar isn’t generating enough electricity.

How to know how well it is working
Thanks to modern technology, monitoring solutions can give you full visibility into the performance of your solar panels. My system uses an eGauge Systems data logger. Since most of my career in IT has involved interpreting graphs and performance data in an attempt to understand systems better, monitoring the system has been one of the more entertaining aspects of the process. One can easily see, via the Web interface, how much load is being drawn by activities in your home, and how much power is being generated by the solar. The eGauge solution offers quick and easy access to monitoring of the environment via mobile devices or web browser. Entering all of your variables will also help it determine how much money you are saving for any given period of time. As the image below shows, it is easy to see how much load the home is consuming (red fill), how much the solar system is generating (green fill), and how it is either offsetting the load or feeding power back into the grid.

Below is a view of a 6 hour window of time. The data is extremely granular; collected and rendered once per second.


The view below is for a 24 hour period. As you can see from the figures, a sunny day in May produces over 35kWh.


The image below is a view over a one week period. You can certainly see the influence of cloudy days. As one changes the time period, the UI automatically calculates what you are saving (excluding State production incentives).


In case you are curious, my 6 node vSphere home lab is on 24×7, and consumes between 250 and 300 Watts (6.5kWh per day), so that is some of what contributes to the continuous line of red, even when there isn’t much going on in the house.
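That figure checks out with simple arithmetic: a steady draw in watts, times 24 hours, divided by 1000, gives kWh per day.

```python
# Home lab draw from the post: 250-300 W, running 24x7.
avg_watts = 275                       # midpoint of the stated range
kwh_per_day = avg_watts * 24 / 1000   # watts × hours → kWh
print(f"{kwh_per_day:.1f} kWh/day")   # 6.6 kWh/day
```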

Economics of Solar
It is an understatement to say that the economics of residential solar vary widely. Geographic location, roof orientation, roof pitch, surface area, weather patterns, federal incentives, state incentives, and electricity rates all play a part in the equation of economic viability. Let’s not forget that much like recycling, or buying a hybrid vehicle, some people do it for emotional reasons as well. In other words, it might make them feel good, regardless of whether or not it is a silly financial decision. That is not what was driving me, but it would be naive to overlook that this influences people. Incentives typically fall into three categories.

  • Federal incentives. This currently is a 30% rebate at the end of the year on your up-front cost of the entire system.
  • State Incentives. Some States include some form of a production incentive program. This means that for every kWh of energy produced (whether you use it or not), you may receive a payment for the amount produced. This can be at a pre-negotiated rate that is quite lucrative. Production incentives in the State of Washington can go as high as 54 cents per kWh, but may have limited terms. State incentives may also include waiving sales tax on equipment produced in the state.
  • Power provider incentives. This comes in the form of net metering: you effectively charge the power company for every kWh that you produce but do not use, often at a rate equal to what they charge you for power (e.g. 10 cents per kWh).

Realistically, the State and power provider incentives are heavily tied to each other, as power companies are heavily regulated State entities.

Usually it is State incentives, or high power rates in a State, that make solar economically viable. These incentives can give an investment like this a very reasonable break-even period. If there are no State incentives and you have dirt cheap power from the grid, then it becomes a much tougher sell. This is often where battery backed systems with cheaper Chinese manufactured panels come into play. It is a rapidly changing industry, and depends heavily on legislation in each State. Is solar right for you? It depends on many of the conditions stated above. It’s really best to check with a local installer who can help you determine whether it is. I used Northwest Wind & Solar to help work through that process, as well as the installation of the system.
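To make the break-even idea concrete, here is a hedged sketch of the arithmetic. The 30% federal figure and the Washington production-incentive ceiling come from this post; the system cost, annual production, and retail rate are placeholder assumptions, so treat the result as illustrative, not a quote.

```python
system_cost = 22000            # ASSUMED installed price, $
federal_incentive = 0.30       # 30% federal rebate (from the post)
annual_production_kwh = 6000   # ASSUMED yearly output
retail_rate = 0.10             # ASSUMED net-metering rate, $/kWh
production_incentive = 0.54    # WA production incentive ceiling, $/kWh

net_cost = system_cost * (1 - federal_incentive)
annual_return = annual_production_kwh * (retail_rate + production_incentive)
print(f"Break-even: ~{net_cost / annual_return:.1f} years")
```

Run with the same assumed numbers but no production incentive, the annual return drops to $600 and the break-even stretches past 25 years, which is exactly the "tougher sell" scenario.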

Observations in production
Now that things have been up and running for a while, there are a few noteworthy observations worth sharing:

  • The actual performance of solar varies widely. Diffused sunlight, or just daylight, will certainly generate power, but it may be only 10% to 20% of the panel’s potential. This is one of the reasons why power generated can fluctuate so widely.
  • Solar requires a lot of surface area. This was no surprise to me because of my past experience buying small, deck-of-cards sized panels from Radio Shack in my youth. Each of my 20 Itek panels measures 3’x5′ and produces 280W in theoretically ideal conditions. Depending on your average daily consumption of energy, you might need between 15 and 40 panels just to accommodate energy utilization rates. Because of this need for a large surface area, incorporating solar into objects such as vehicles is gimmickry at best (yes, I’m talking to you Toyota) and plays into emotions more than it provides any practical benefit.
  • Monitoring of your power is pretty powerful. Aside from the cool factor of the software that allows you to see how much energy is generated, you also quickly see the realities of some items in your household. Filling a house full of LEDs might reduce your energy consumption and make you feel good along the way, but a few extra loads of laundry in the dryer, or being a bit trigger happy with the A/C unit in your home, will quickly offset those savings.
  • Often a crystal clear sunny day does not yield the highest wattage of power generation. The highest peak output comes on partly sunny days; I suspect reflection off nearby clouds briefly adds to the direct sunlight hitting the panels. For me, those partly sunny days may peak the power generation of my system at 5.25kW, while what would be thought of as a crystal clear blue sky day will often top out at only about 4.6kW.

Determining whether or not to invest in residential solar is really no different than making a smart design decision in the Data Center. Use data, not emotions, to drive the decision making, then follow that up with real data analysis to determine its success. This approach helps avoid the "trust us, it’s great!" attitude found all too often in the IT industry and beyond.

SSDs in the workplace. Trust, but verify

Most of my interests, duties, and responsibilities surround infrastructure.  Virtualization, storage, networking, and the architecture of it all.  Ask me about the latest video card out there, and not only will I probably not know about it, I might not care.  Eventually problems crop up on the desktop though, and the hands have to get dirty.  This happened to me after some issues arose over recently purchased SSD drives.

The Development Team wanted SSDs to see if they could make their high end workstations compile code faster.  Sounded reasonable to me, but the capacity and price point just hadn’t been there until recently.  When it was decided to move forward on the experiment, my shortlist of recommended drives was very short.  I specifically recommended the Crucial M4 line of SSDs.  There are a number of great SSD drives out there, but the M4 has a good reputation, and also sits in my workstation, my laptop, and my NAS in my home lab.  I was quite familiar with the performance numbers they were capable of.

It didn’t take long to learn that, through a series of gaffes, what was ultimately ordered, delivered, and installed on those Developer workstations were not the Crucial M4 SSDs that have such a good reputation, but the Crucial V4 drives.  The complaints were quite startling.  Code compile times increased significantly; in fact, more than doubling over their spinning disk counterparts.  When you have cries to bring back the 7200RPM SATA drives, there must be a problem.  It was time to jump into the fray to see what was up.  The first step was to simply verify that the disks were returning expected results.

The testing
I used the venerable Iometer for testing, and set it to the following conditions.

  • Single Worker, running for 5 minutes
  • 80000000 Sectors
  • 64 Outstanding I/Os
  • 8KB block size
  • 35% write / 65% read
  • 60% random / 40% sequential

Each test was run three times, then averaged.  Just like putting a car on a Dyno to measure horsepower, the absolute numbers generated were not of tremendous interest to me; environmental conditions can affect them too much.  I was looking at how these performance numbers related to each other.
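For what it’s worth, Iometer applies the read/write split and the random/sequential split independently, so the settings above break the workload into four access patterns; a few lines of Python make the mix explicit.

```python
# Breakdown of the Iometer access mix used in the tests above.
write_pct, read_pct = 0.35, 0.65
random_pct, seq_pct = 0.60, 0.40

mix = {
    "random read": read_pct * random_pct,
    "sequential read": read_pct * seq_pct,
    "random write": write_pct * random_pct,
    "sequential write": write_pct * seq_pct,
}
for name, share in mix.items():
    print(f"{name}: {share:.0%}")
# random read: 39%, sequential read: 26%,
# random write: 21%, sequential write: 14%
```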

For the sake of clarity, I’ve simplified the list of test systems to the following:

  • PC1 = New Dell Precision M6600 laptop.  Includes Crucial M4 SSD and 7200 RPM SATA drive
  • PC2 = Older Dell Precision T3400 workstation.  Includes Crucial V4 SSD and 7200  RPM SATA drive
  • PC3 = New Dell Precision T5500 workstation.  Includes Crucial V4 SSD and 7200 RPM SATA drive
  • VM = A VM in a vSphere 5.0 cluster against an EqualLogic PS6100 array with 7200 RPM SATA drives

I also tested under different settings (block sizes, etc.), but the results were pretty consistent.  Something was terribly wrong with the Crucial V4 SSDs. Or, they were just something terrible.

The results
Here are the results.

For the first two charts, the higher the number, the better.



For the next two charts, the lower the number, the better



You might be thinking this is an unfair test because it compares different systems.  That was deliberate, to show it wasn’t just one system demonstrating the miserable performance results of the V4.  So, just to pacify curiosity, here are some results of the same tests on a single system that had the V4, then had it swapped out with the M4.

For the blue numbers, the higher the better.  For the red numbers, the lower the better.


If one looks at the specs of the M4 and V4, there is nothing too unexpected.  Sure, one is SATA II while the other is SATA III.  But the results speak for themselves; this was not an issue of bus speed.  The performance of the M4 drives was very good; exactly as expected.  The performance of the V4 drives was terrible – far worse than anything I’ve seen out of an SSD.  This isn’t a one-off “bad drive” situation either, as there is a bag full of them that perform the same way.  They’ve been tested in brand new workstations, and workstations a few years old.  Again, the same result across the board for all of them.  Surely the V4 is not the only questionable SSD out there.  I’m sure there are some pretty hideous ones lining store shelves everywhere.  I’m sharing this experience to show the disparity between SSDs so that others can make smart purchasing decisions.

As for the comparison of code compile times, I’ll be saving that for another post.

It was interesting to see how SSDs were so quickly dismissed internally before any real testing had been performed to verify they were actually working okay. Speculation from the group even included them not being a serious option in the workplace.  This false impression was concerning to me, as I knew how much flash is changing the enterprise storage industry.  Sure, SSDs can be weird at times, but the jury is in; SSDs change the game for the better.  Perhaps the disinterest in testing was simply due to this subject not being their area of focus, or they had other things to think about.  Whatever the case, it was certainly a lesson for me in how quickly results can be misinterpreted. 

So if anything, this says to me a few things about SSDs.

  • Check the specific model number, and make sure that it matches the model you desire. 
  • Stick with makes and models that you know.
  • If you’ve never used a particular brand and model of SSD, test it first.  I’m tempted to say, test it no matter what.
  • Stay far, far away from the “value” SSDs out there.  It almost appears like solid state thievery.  I can only imagine the number of folks who have underperforming drives like the V4, and wonder what all the fuss about SSDs is.  At least with a bad spinning disk, you could tear it apart and make the world’s strongest refrigerator magnet.  Bad SSDs aren’t even good as a paper weight.

– Pete

Using OneNote in IT


It’s hard to believe that as an IT administrator, one of my favorite applications I use is one of the least technical.  Microsoft created an absolutely stellar application when they created OneNote.  If you haven’t used it, you should.

Most IT Administrators have high expectations of themselves.  Somehow we expect to remember pretty much everything.  Deployment planning, research, application specific installation steps and issues.  Information gathering for troubleshooting, and documenting as-built installations.  You might have information that you work with every day, and think “how could I ever forget that?” (you will), along with that obscure, required setting on your old phone system that hasn’t been looked at in years.

The problem is that nobody can remember everything. 

After years of using my share of spiral binders, backs of print outs, and Post-It notes to gather and manage systems and technologies, I’ve realized a few things.  1.)  I can’t read my own writing.  2.)  I never wrote enough down for the information to be valuable.  3.)  What I couldn’t fit on one physical page, I squeezed in on another page that made no sense at all.  4.)  The more I had to do, the more I tried (and failed) to figure out a way to file it all.  5.)  These notes eventually became meaningless, even though I knew I kept them for a reason.  I just couldn’t remember why.

Do you want to make a huge change in how you work?   Read on.

OneNote was first adopted by our Sales team several years ago, and while I knew what it was, I never bothered to use it for real IT projects until late in 2007, when a colleague of mine (thanks Glenn if you are reading) suggested that it was working well for him and his IT needs.  Ever since then, I wonder how I ever worked without it.

If you aren’t familiar with OneNote, there isn’t too much to understand.  It’s an electronic Notebook. 


It’s arranged just as you’d expect a real notebook to be.  The left side represents notebooks, the top row of tabs represents sections or earmarks, and the right side represents the pages in a notebook.  It’s that easy.  Just like its physical counterpart, its free-form formatting allows you to place objects anywhere on a page (goodbye MS Word).

What my experiment with OneNote has shown is how well it tackles every single need I have in gathering information, and in mining that data after the fact.  Here are some examples.

Long term projects and Research

What better time to try out a new way of working than on one of the biggest projects I’ve had to tackle in years, right?  Virtualizing my infrastructure was a huge undertaking, and I had what seemed like an infinite amount of information to learn in a very short period of time, across all different types of subject matters.  In a Notebook called “Virtualization” I had sections that narrowed subject matters down to things like ESX, SAN array, Blades, switchgear, UPS, etc.  Each one of those sections had pages (at least a few dozen for the ESX section, as there was a lot to tackle) on specific subjects I needed to learn about, or to keep for reference.  Links, screen captures, etc.  I dumped everything in there, including my deployment steps before, during, and after.



Our Linux code compiling machines have very specific package installations and settings that need to be set before deployment.  OneNote works great for this.  The no-brainer checkboxes offer nice clarity.


If you maintain different flavors of Unix or various distributions of Linux, you know how much the syntax can vary.  OneNote helps keep your sanity.  With so many Windows products going the way of Powershell, you’d better have your command line syntax down for that too.

This has also worked well with backend installations.  My Installations of VMware, SharePoint, Exchange, etc. have all been documented this way.  It takes just a bit longer, but is invaluable later on.  Below is a capture of part of my cutover plan from Exchange 2003 to Exchange 2007.


Migrations and Post migration outstanding issues

After big migrations, you have to be on your toes to address issues that are difficult to predict.  OneNote has allowed me to use a simple ISSUE/FIX approach.  So, in an “Apps” notebook, under an “E2007 Migration” section, I might have a page called “Postfix” and it might look something like this.


You can label these pages “Outstanding issues” or as I did for my ESX 3.5 to vSphere migration, “Postfix” pages.



Those in the Engineering/Architectural world are quite familiar with as-built drawings.  Those are drawings that reflect how things were really built.  Many times in IT, deployment plans and documentation never go further than the day you deploy.  OneNote allows for an easy way to turn that deployment plan into a living copy, or as-built configuration, of the product you just deployed.  Configurations are as dynamic as the technologies that power them.  It’s best to know what sort of monster you created, and how to recreate it if you need to.


Daily issues (fire fighting)

Emergencies, impediments, fires, or whatever you’d like to call them, come up all the time.  I’ve found OneNote to be most helpful in two specific areas on this type of task.  I use it as a quick way to gather data on an issue that I can look at later (copying and pasting screenshot and URLs into OneNote), and for comparing the current state of a system against past configurations.  Both ways help me solve the problems more quickly.

Searching text in bitmapped screen captures

One of the really interesting things about OneNote is that you can paste a screen capture of, say, a dialog box into the notebook, and when searching later for a keyword, it will include those bitmaps in the search results!  Below is one of the search results OneNote pulled up when I searched for “KDC”.  This was a screen capture sitting in OneNote.  Neat.



Goodbye Browser Bookmarks

How many times have you spent time organizing your web browser bookmarks or favorites, only to never look at them again, or to forget why you bookmarked them?  It’s an exercise in futility.  No more!  Toss them all away.  Paste those links into the applicable locations in OneNote (where the subject matter fits), enter a brief little description on top, and you can always find them later when searching.



I won’t ever go without using OneNote for projects large or small again.  It is right next to my email as my most used application.  OneNote users tend to be a loyal bunch, and after a few years of using it, I can see why.  At about $80 retail, you can’t go wrong.  And, lucky for you, it will be included in all versions of Office 2010.

Additional Links

New features coming in OneNote 2010

Using OneNote with SharePoint 

Interesting tips and tricks with OneNote

An introduction of sorts…

There are thousands of great blogs out there, with extremely smart people contributing all sorts of great information. This may not be one of them. Let me explain.

Every IT Administrator on a staff of one or two knows that your strength isn’t in knowing every nuance of one particular thing, but rather in the breadth of knowledge needed to make everything work together. The dramatic shift of gears that has to occur in my job on a daily basis is not unique, but no less surprising. From deploying a new Virtualized Infrastructure one day, to figuring out why some SQL buried in our CRM doesn’t work the next, to getting all of our *nix systems to play nicely with our Windows systems. It never ends. I used to think that a lack of absolute expertise in one specific thing was a hindrance. Now I see it as a strength.

I’ve gotten to stand on the shoulders of many, and would be foolish to think I’ve been able to accomplish everything on my own. This includes mentors, colleagues, solution providers, Management teams who trusted my opinion, and those unsung heroes who figured out some registry entry that needed to be changed, and chose to write about it, so that I could get some sleep.

So this is to all of those men and women who have the capability of setting up multiple VLANs, but find themselves fixing the photocopier because… well, nobody else can.