This is a little late...
Fairlight
fairlite at fairlite.com
Tue Oct 26 14:07:38 PDT 2010
On Tue, Oct 26, 2010 at 01:30:31PM -0400, John Esak may or may not have
proven themselves an utter git by pronouncing:
>
> Anyway, my bud at Tim's place just tells me that he has finished a new
> gaming machine in his basement. It's got the new super Core i-whatever it
> is... Even more amazing DDR memory... With SATA3 drives, and I am pretty
> sure he said USB-3 as well? Haven't even checked to see if that is a real
> thing, or did I just fantasize it? The thing that pisses me off is the
> SATA3 and the newer, more ridiculous chip. All in the space of what for me
> was six months, not even a year.
Probably the i7 chip. Those are drool-worthy, seriously. Didn't know
SATA3 was out. USB3 is real, and very new in terms of actual hardware
implementation. From the Wiki entry:
USB 3.0
The USB 3.0 Promoter Group announced on November 17, 2008, that version 3.0
of the specification had been completed and had made the transition to the
USB Implementers Forum (USB-IF), the managing body of USB
specifications.[65] This move effectively opened the specification to
hardware developers for implementation in future products. The first
certified USB 3.0 consumer products were announced January 5, 2010, at the
Las Vegas Consumer Electronics Show (CES), including two motherboards by
ASUS and Gigabyte Technology.[66][67]
> It is all true to form to that syndrome I've described before where all the
> punk kids are around the Executive Board table of the various huge hi-tech
> companies... And someone pipes up... "Okay, Esak just bought the latest CPU
> and peripheral stuff... We can release the newer series now!" They all
> applaud for having screwed me yet again... :-)
Just think of it like cars. The second you take it over the curb of the
dealership lot, it's depreciated several grand in value. A computer, by the
time it makes it to market, is already halfway to obsolete, since the next
version is already in the pipeline. Take Intel CPUs. They release on
roughly 11-month cycles. Every other cycle they make a huge performance
increase; in the cycle between, they make modest performance gains but
shrink the die size so the chips run cooler and more power-efficient. At
least that's how it was through the E8400 that I have, which is a cooler
45nm chip. But the i7 started off at a 45nm die as well. I'm not sure if
there's a smaller size now or if they're just standardising on that from
here on out.
The reality is, unless you have money just falling out of your pockets,
your average system life is 3-8 years, with a planned five-year lifetime
being typical. That's essentially five generations of chips you'll be
passing up, unless they're socket-compatible. Graphics chips change even
faster than that. I got an 8800, and a few months later the 9800 was out
(although in reality the 9800 had -exactly- the same performance specs
for pure 3D rendering; it just added HDTV capabilities). Then, under a
year later, the GTX 2xx series was out, and now they're up to the GTX 4xx
series. And these are -not- cheap. The GTX 295 was still $600+ when I
went to update my video card, so I went with a $340 GTX 275 Co-op card
with a GTS 250 (which is a rebranded 9800GTX) as a dedicated PhysX
processor and extra RAM. I just could not afford to keep up with the
Joneses, much as I'd have liked to.
> Actually, with audio (and probably video) it is critical, even if a new
> series of CPUs, motherboards, etc., are brand new and working, to find out
> whether *all* of the drivers for your most favorite hardware are upgraded
> for it yet. You usually end up sitting around with the thing in 32-bit mode
> rather than 64, and using only this scheme instead of that one, until all
> the third-party driver writers get their act together. (Hey, doesn't "driver
> writers" have a nice flow to it, sort of like m*ther f*ckers... Um, er, well
> you get the idea what I think about them :-) The ever-spiraling-up
> technology causing us all to re-buy every design cycle is going to destroy
> various good things in our culture. It probably keeps us moving ahead nicely
> into the future... But those things we are losing... Some of them are
> important.)
Speaking of sound...
Tellya something... Onboard integrated sound has made great leaps in the
last decade. My DP35DP mainboard has -great- integrated sound, including
full Dolby 7.1 support if I had the speaker setup to support it. BUT...
You pay for the integrated sound in terms of extra CPU usage at the kernel
level. Next rig, dedicated sound card if I can afford it. Some things run
a little close to the red line on CPU usage, and that little extra that
sound takes can cause some stutters and hiccups during "busy" times with
lots of sound, especially at higher quality sample rates, etc.
And your implication about the term "driver writers" would apply well to
the team at ATI that designs that -horrid- Catalyst driver system. That
-used- to be one process. Nowadays, last time I used it, it was THREE
processes, and it's horribly bloated. Dumping ATI is one of the best
things I've done in terms of hardware, specifically because the -software-
used to drive their hardware is so awful, and they also drop support for
operating systems half a decade before Microsoft does. Meanwhile, NVidia
drivers still support all the way back to Win95. That says something
important to me.
mark->