<div><span class="gmail_quote">On 11/5/07, <b class="gmail_sendername">David Krainess</b> <<a href="mailto:davidkrainess@yahoo.com">davidkrainess@yahoo.com</a>> wrote:</span>
<blockquote class="gmail_quote" style="PADDING-LEFT: 1ex; MARGIN: 0px 0px 0px 0.8ex; BORDER-LEFT: #ccc 1px solid"><br>All kidding aside, why...some of the "Data Center"<br>world is moving towards virtualization to maximize
<br>hardware use and consolidate servers. I feel I have 4<br>computers utilizing 5% to 20% of the resources on<br>each. It seems like a waste.</blockquote>

Except that those data center servers aren't trying to decode 1920x1080 MPEG-2 on the fly! On my frontend, I routinely see 50%+ CPU utilization when watching HD (more if it's watching and commercial-flagging at the same time).
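
(If anyone wants to put numbers on that for their own box, here's a rough Python sketch -- not anything from Myth itself, just something along these lines. It assumes the psutil package is installed and that the playback process is named "mythfrontend"; adjust to taste.)

import psutil

# Find the frontend process(es). "mythfrontend" is an assumption --
# change it if your playback process is named differently.
procs = [p for p in psutil.process_iter(['name'])
         if p.info['name'] == 'mythfrontend']

# Prime the per-process counters so the first reading isn't meaningless.
for p in procs:
    p.cpu_percent(None)

# Sample overall and frontend CPU usage once a second for ~30 seconds
# while HD playback is running.
for _ in range(30):
    total = psutil.cpu_percent(interval=1)
    fe = sum(p.cpu_percent(None) for p in procs if p.is_running())
    print("total CPU: %5.1f%%   mythfrontend: %5.1f%%" % (total, fe))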

Also, as a developer who's had to actually *use* virtualized servers, I'll tell you they tend to fall over under load. Some people swear by them, but in my experience they're rarely pulled off that well -- and certainly not on consumer hardware. The good ones run on seriously beefy servers that cost way more than 5 Myth boxen (especially if those boxen are diskless).