Server Storage I/O Network Benchmark Winter Olympic Games
It is time for the 2014 Winter Olympic games in Sochi, Russia, where competitors, including some athletes, come together in what has become a mix of sporting and entertainment activities.
Games of inches and seconds, performance and skill
Some of these activities, including real Olympic game events, are heavier on sports appeal, some with artistic and others pure entertainment with a mix of beauty, brawn and maybe even a beast or two. Then there are those events that have been around since the last ice age, while others arrived in the post global warming era.
Hence some have been around longer than others, showing a mix of old and new in terms of the sports and athletes, not to mention the technology and their outfits.
I mean how about some of the new snowboarding and things on skis being done; can you imagine if they brought in as a new "X" sport roller derby on the short track speed skating oval sponsored by Red Bull or Bud Light? Wait, that sounds like the Red Bull Crashed Ice event (check it out if you are not familiar with it): think motocross, hockey and downhill on ice. How about getting some of the South African long distance sprinters to learn how to speed skate; talk about moving some gold metal, as in medals, back to the African continent! On the other hand, the current powers that be would lodge protests, change the benchmark or rules to stay in power. Hmm, sound familiar with IT?
Ok, enough of the fun stuff (for now), let's get back on track here (catch that pun?).
Metrics that matter, winners and losers
Since these are the Olympics, let's also remember that there are still awards for individual and team winners (along with second and third place). After all, if all Olympians were winners, there would be no losers, and if no losers, how could there be a winner?
Who or what decides the winners vs. losers involves metrics that matter, something that also applies to server, storage I/O and networking hardware, software and services.
In the case of the Olympics, some of the sports or events are based on speed or how fast (e.g. time) something is done, or how much is accumulated or done in that amount of time, while in other events the metrics that matter may be more of a mystery, based on judging that may be subjective.
The technologies to record times, scores, movements and other things that go into scoring have certainly improved, as has the ability of fans to engage and vote their choice, or opposition, via social media venues from Twitter to Facebook among others.
What about server storage I/O networking benchmarks
There could easily be an Information Technology (IT) or data infrastructure benchmarking Olympics with events such as fastest server (physical, virtual or cloud, individual or consortium team), storage, I/O and networking across hardware, software or services. Of course there would be different approaches favored by the various teams, with disputes, protests and other things sometimes seen during Olympic games. One of the challenges, however, is what would be the metrics that matter, particularly to the various marketing groups of each organization or their joint consortium?
Just like with sports, which of the various industry trade groups or consortiums would be the ruling party or voice for a particular event, specifying the competition criteria, scoring and other things? What happens when there is a breakaway group that launches its own competing approach, yet when it comes time for the IT benchmarking Olympics, which of the various bodies does the Olympic committee defer to? In case you are not familiar, in sports there are various groups and sub-groups who can decide the participants for various sports, perhaps independent of an overall group; sound like IT?
Let the games begin
So then the fun starts. However, which of the events are relevant to your needs or interests? Sure, some are fun or entertaining while others are not practical. Some you can do yourself, while others are just fun to watch, for both the thrill of victory and the agony of defeat.
This is similar to IT industry benchmarking and specmanship competitions, some of which are more relevant than others; then there are those that are simply entertaining.
Likewise some benchmarks or workload claims can be reproduced to confirm the results or claims, while others remain more like the results of figure skating judges.
Hence some of the benchmark games are more entertaining; however, for those who are not aware or informed, they may turn out to be misinformation or lead to poor decision-making.
Consequently, benchmarks and metrics that matter are those that most closely align with what your environment is or will be doing.
If your environment is going to be running a particular simulation or script, then so be it; otherwise, look for comparisons that are reflective of your actual workload.
On the other hand, if you can't find something that is applicable, then look at tools and results that have meaning along with relevance, not to mention provide clarity and are repeatable. Being repeatable means that you can get access to the tools, scripts or scenario (preferably free) to run in your own environment.
There is a long list of benchmarks and workload simulation tools, as well as traces available, some for free, some for fee, that apply to components, subsystems or complete application systems from server, storage I/O networking applications and hardware. These include those for email such as Microsoft Exchange related, SQL databases, LoginVSI for VDI, VMmark for VMware, and Hadoop and HDFS related for big data, among many others (see more here).
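If nothing off the shelf fits, even a quick do-it-yourself sanity check can be informative. Here is a minimal Python sketch (the helper name `measure_read_iops` and the scratch file setup are my own illustrative assumptions, not any particular tool mentioned above) that times a burst of small random reads. Keep in mind that on a small file like this the operating system cache will serve most of the reads, so treat the number as a loop timing exercise rather than raw device IOPS; that caveat is itself an example of why context matters.

```python
import os
import random
import tempfile
import time

def measure_read_iops(path, io_size=4096, num_ios=1000):
    """Time a burst of random reads against a file and return reads/sec.

    Illustrative only: a real benchmark would bypass caches, vary the
    read/write mix and queue depth, and run long enough to steady-state.
    """
    file_size = os.path.getsize(path)
    max_offset = file_size - io_size
    with open(path, "rb") as f:
        start = time.perf_counter()
        for _ in range(num_ios):
            f.seek(random.randint(0, max_offset))
            f.read(io_size)
        elapsed = time.perf_counter() - start
    return num_ios / elapsed

# Create a small scratch file of random data to exercise the loop.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(4 * 1024 * 1024))  # 4 MiB scratch file
    scratch = tmp.name

iops = measure_read_iops(scratch)
print(f"~{iops:,.0f} reads/sec (likely cache-served, not raw device IOPS)")
os.remove(scratch)
```

The point is not the number it prints, but that you can see exactly what was measured and repeat it in your own environment, which is the transparency test any published benchmark should also pass.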
Apples to Apples vs. Apple pie vs. Orange Jello
Something else that matters are apples to apples vs. apples to oranges or worse, apple pie to orange Jello.
This means knowing or gaining insight into the pieces, as well as how they behave under different conditions, along with the entire system, to establish a baseline (e.g. normal) vs. abnormal.
Hence it's the winter server storage I/O networking benchmark games, with the first event having been earlier this week when team Brocade took on Cisco. Here is a link to a post by Tony Bourke (@tbourke) that provides some interesting perspectives and interactions, along with a link here to the Brocade sponsored report done by Evaluator Group.
In this match-up, Team Brocade (with HP servers, Brocade switches and an unnamed 16GFC SSD storage system) takes on Team Cisco and their UCS (also with an unnamed 16GFC SSD system that I wonder if Cisco even knows whose it was). Ironic that it was almost six years to the day that there was a similar winter benchmark wonder event when NetApp submitted an SPC result for EMC (read more about that cold day here).
The Brocade FC (using HP servers and somebody's SSD storage) vs. Cisco FCoE using UCS (and somebody else's storage) comparison is actually quite entertaining; granted, it can also be educational on what to do or not do, and what to focus on or include, among other things. The report also raises many questions that seem more like wondering why somebody won a figure skating event vs. the winner of a men's or women's hockey game.
Closing thoughts (for now)
So here's my last point and perspective: let's have a side of context with them IOPS, TPS, bandwidth and other metrics that matter.
Take metrics and benchmarks with a grain of salt; however, look for transparency in how they are produced and the information provided, and most important, whether they matter or are relevant to your environment or are simply entertaining.
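To illustrate that side of context: an IOPS number by itself says little without the I/O size (never mind read/write mix, random vs. sequential and latency). A quick back-of-the-envelope sketch, where the helper `bandwidth_mbps` is hypothetical and for illustration only:

```python
def bandwidth_mbps(iops, io_size_bytes):
    """Bandwidth in MB/sec implied by an IOPS rate at a given I/O size."""
    return iops * io_size_bytes / 1_000_000

# The same 10,000 IOPS headline means very different throughput
# depending on the I/O size behind it:
small = bandwidth_mbps(10_000, 4 * 1024)    # 4 KiB I/Os  -> ~41 MB/sec
large = bandwidth_mbps(10_000, 64 * 1024)   # 64 KiB I/Os -> ~655 MB/sec
print(f"4 KiB: {small:.1f} MB/sec, 64 KiB: {large:.1f} MB/sec")
```

A sixteen-fold difference in bandwidth from the very same IOPS claim, which is why a benchmark result without its workload parameters is more entertainment than information.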
Let's see what the next event in the ongoing server storage I/O networking benchmark 2014 Winter Olympic games will be.
Some more reading:
SPC and Storage Benchmarking Games
Moving Beyond the Benchmark Brouhaha
More storage and IO metrics that matter
Its US Census time, What about IT Data Centers?
March Metrics and Measuring Social Media (keep in mind that March Madness is just around the corner)
PUE, Are you Managing Power, Energy or Productivity?
How many IOPS can a HDD, HHDD or SSD do?
Part II: How many IOPS can a HDD, HHDD or SSD do with VMware?
You can also take part in the ongoing or re-emerging FC vs. FCoE hype and FUD events by casting your vote here and seeing the results.
Disclosure: I used to work for Evaluator Group after working for a company called Inrange, which competed with, then got absorbed (via CNT and McData) into Brocade, who has been a client, as has Cisco. I also do performance and functionality testing, audits, validation and proof of concept services in my own as well as in client labs using various industry standard available tools and techniques. Otoh, not sure that I even need to disclose anything, however it's easy enough to do, so why not ;).
Ok, nuff said (for now)
Cheers gs