Personal thoughts…

Tech, Hockey, and random thoughts…

PhysX add-in card pricing announced

Xbit Labs is reporting that AGEIA has set pricing on its PhysX add-in cards.
The PhysX chip will be manufactured with 0.13 micron technology at TSMC, and the 182 sq. mm die will host 125 million transistors.

Apparently it will be PCI only initially, priced between $250-300 US, with retail availability in Q4 '05.

Frankly, I don't see it selling at all at such a price point. Price it at $100 and you'll get the hardcore gaming enthusiasts. Get it integrated into high end motherboards and you'll get a solid installed base.
Sell it at $250 and only the rich that must have the best at all times will even glance at it.

Active cooling, auxiliary power, 125M transistors on a .13u process… clearly these aren't cheap for them to manufacture.
They're too expensive and far too complex to integrate onto graphics cards or motherboards, which rules out building a quick installed base that would give developers reason to utilize them.
Without a solid installed base, software developers aren't going to code for a physics processing unit; without a decent selection of games supporting it, consumers aren't going to buy one… it's an unfortunate catch-22.

It doesn't help them any that in 2006 the market will be flooded with dual-core processors, and developers will be looking towards multithreading. We'll be buying dual cores for the price of today's single cores. And what's one of the easiest elements of a game to multithread? Physics calculations.
That extra core will be begging to be utilized, and physics is one of the easiest elements to code for thread-level parallelism.
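
To illustrate why physics splits so naturally across cores, here's a toy Python sketch (entirely my own example, nothing like a real engine's scheduler): each particle updates independently, so the work divides cleanly into one chunk per core.

```python
from concurrent.futures import ThreadPoolExecutor

def step_chunk(chunk, dt):
    # Simple Euler integration under constant gravity for one chunk
    # of (x, y, vx, vy) particles; each particle is independent.
    g = -9.81
    out = []
    for x, y, vx, vy in chunk:
        vy += g * dt
        out.append((x + vx * dt, y + vy * dt, vx, vy))
    return out

def step_world(particles, dt, cores=2):
    # Split the particle list into one chunk per core and update the
    # chunks in parallel -- the "free" second core does half the work.
    n = max(1, len(particles) // cores)
    chunks = [particles[i:i + n] for i in range(0, len(particles), n)]
    with ThreadPoolExecutor(max_workers=cores) as pool:
        results = pool.map(step_chunk, chunks, [dt] * len(chunks))
    return [p for chunk in results for p in chunk]
```

Because no particle reads another's state during a step, there's no locking at all, which is exactly what makes physics the low-hanging fruit for thread-level parallelism.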

I wish them luck, but to be blunt, AGEIA hasn't a prayer of selling many of these, and they'll be awfully hard pressed to convince many software developers to actively support, endorse, and develop for it.


May 21, 2005 Posted by | Misc. Computer Hardware

Console Examination: The replies…

To everyone that emailed me about the spelling/grammar in my Console Examination blogs:

Yes, I'm aware the grammar and spelling in my Console Examination 1/2 blogs weren't exactly very good.
I'll blame that on those pieces being written on the spur of the moment, after studying the consoles' revealed specifications and coming to what conclusions I could based on what we know.

On the plus side, I didn't get any real technical criticisms, which was good, and the majority seemed to think highly of them 🙂
(Besides, it's the first indication that more than 2 or 3 people are reading this blog, which is nice.)

On a side note- evidently I'm not the only one disgusted by GameSpot/IGN's "technical analysis" of the XBox360/PS3.
A nice example of why game sites should NOT attempt to be hardware sites as well.

I can't say I care much for IGN, but I think very highly of GameSpot. They have some exceptional writers, but game reviewers shouldn't be trying to delve into the hardware if they can do little more than parrot Microsoft's and Sony's PR lines. I'm all for their efforts to provide their audience with a serious look at the hardware capabilities, and I applaud them for it, but if they clearly don't have the knowledge to do more than regurgitate PR claims, then they shouldn't present their articles as the "final word".

Granted, they're employed writing about video games… obviously I'm not.
Still, I'd like to think I'm at least reasonably knowledgeable; I've had a few articles published online and in (a) magazine, so to the average layman I presumably know something at least.

For what amounts to little more than a broad overview without any real technical depth, I'd argue it's a whole lot more informative for most than what IGN/GameSpot's professionals managed.

May 19, 2005 Posted by | Console Examination

Console examination #2

I've already taken a look at the Cell vs XBox 360 PPC combo, so I figure it's time to examine the graphics side of things.

First the wrapup: the PS3's Cell should be a decent amount more powerful on paper, but it will be incredibly difficult to program for (though the XBox 360 isn't exactly easy to program for itself).
I suspect the PS3's performance advantages will take much longer to come to fruition than the XBox's advantages over the PS2 did, and it's possible, though unlikely, that the Cell will never be sufficiently adopted to truly take full advantage of its power.

On the whole I'd expect the XBox 360's processor to be comparable to an Athlon64 X2 4200+ (2x 2.2GHz/512KB Athlon64); on paper at least, the Cell should offer greater performance than the Athlon64 X2 4800+ (2x 2.4GHz/1MB Athlon64), though programming difficulties will partially nullify that.

Compared to X86 desktop processors, we should see high end desktop chips in the Athlon64 X2 line, available before the Xbox360's launch, that can match the respective consoles' processors.

As a side note- with only one truly general purpose processing core, the Cell will be blown away by the XBox 360 and modern X86 processors on general purpose code, but that doesn't really matter on a console.

Now onto the GPUs:
XBox360: Based off the R600 with unified pixel/vertex shaders, and 12 pipelines with 4 ALUs per pipe @ 500MHz, it should offer shader performance a decent margin above that of the X850XT PE.
Yet with only 8 Raster Ops (ROPs), its pixel throughput is only very marginally better than current mainstream PC graphics cards and well below the X850XT PE. The GF 6600GT, in fact, can offer stencil fillrate comparable to the XBox GPU, and nVidia's high end cards are nearly twice as powerful in that respect.

PS3: nVidia RSX, based on the G70 core.
It would appear, then, that the nVidia RSX is similar to two GF6800 Ultras in SLI in many respects.
Based on the currently available data (performance specifications, 300M transistors on an 8-layer 90nm process, 550MHz core clockspeed), I'd say it's most likely that the RSX has 24 pixel rendering pipelines. The number of Raster Ops is hard to guess at, but I'm leaning towards saying they are not decoupled from the pixel pipelines.
If the above is accurate, the PS3's GPU should offer vastly more pixel/texture fillrate than the XBox360's graphics core. Unlike the Xbox360's, the PS3's GPU still uses dedicated pixel/vertex shaders.
It would seem the RSX should offer slightly more geometry throughput than the regular GeForce 6800 per clock/pipe. On the whole you're probably looking at shader performance somewhere around 20% greater than that of a pair of GeForce 6800 Ultras in SLI.

Comparison:
The pixel/texel fillrate advantages of the PS3 should be partially mitigated by the XBox360's 10MB of EDRAM (512bit memory controller), which could be used as a cache for the graphics processor. The Xbox360's graphics core can access the 512MB of system memory via UMA (Unified Memory Architecture) with a bandwidth of 22.4GB/s.
The PS3's graphics core has 256MB of dedicated memory and can render directly to the 256MB of system memory (25.6GB/s bandwidth), in a similar fashion to the TurboCache we've seen on nVidia's desktop GeForce 6200TC graphics cards.
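
Those bandwidth figures are just bus width times effective clock rate. A quick sanity check in Python; note the bus widths and clocks below are my own assumptions, picked because they reproduce the quoted numbers, not confirmed specs:

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    # Peak bandwidth = bytes moved per transfer * transfers per second.
    return (bus_width_bits / 8) * effective_clock_ghz

# Assumed configurations (my guesses, chosen to match the quoted figures):
xbox360_uma = bandwidth_gb_s(128, 1.4)  # 128-bit bus, 700MHz DDR -> 22.4 GB/s
ps3_system  = bandwidth_gb_s(64, 3.2)   # 64-bit bus, 3.2GHz effective -> 25.6 GB/s
```

Either way you slice it, the two consoles' system memory bandwidths land within about 15% of each other; the real differentiator is the 360's EDRAM pool.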

On the whole it would seem the PS3 should offer a solid advantage in terms of pixel/texel output at the 480i/720P resolutions I suspect will be the norm for next gen consoles, but I'd suggest the XBox 360 should offer comparable shader/geometry performance.

On the whole, I'd expect the XBox360's graphics processor to offer performance similar to a pair of 6800 Ultras in SLI, while the PS3's graphics processor should be beyond that by a small margin.

Compared to PC graphics cards, I'd say we should see high end cards comparable to the PS3's GPU at roughly the same time the XBox360 launches. Upper midrange cards should offer marginally less than that level of performance a few months after the PS3 launches, at which time high end PC graphics cards should be a fair margin beyond it.

May 17, 2005 Posted by | Console Examination

Console examination #1

The Xbox360 is an odd dichotomy between technically impressive and underwhelming.

3x PowerPC cores at 3.2GHz, each with SMT, ensure it's a heavily threaded architecture.
But each core is a heavily cut down version of the G5: none of the 3 cores is capable of OOE (Out of Order Execution), and each is only dual issue, so IPC isn't going to be very good at all. The G5 in comparison is a 5-way OOO design (4 instructions + 1 branch, more specifically), currently peaking at 2x 2.7GHz.
Microsoft isn't going to openly state that, but a little judicious examination of the specifications and IBM's comments reveals much.
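
To put rough numbers on that issue-width gap, here's a back-of-the-envelope peak-throughput calculation (best case only; in practice an in-order dual-issue core will sustain far less of its peak than an OOO design, which is the whole point above):

```python
def peak_ginstr_per_sec(issue_width, clock_ghz):
    # Theoretical ceiling: every issue slot filled every single cycle.
    return issue_width * clock_ghz

xenon_core = peak_ginstr_per_sec(2, 3.2)  # one Xbox360 core: 6.4 G instr/s peak
g5_core    = peak_ginstr_per_sec(5, 2.7)  # G5 (4 + 1 branch): 13.5 G instr/s peak
```

The clockspeed advantage doesn't come close to making up for the narrower issue width, and without OOE the 360's cores will also idle through stalls the G5 could schedule around.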

The graphics processor is a similar story… based off the R600 with unified pixel/vertex shaders, and 12 pipelines with 4 ALUs per pipe @ 500MHz, it should offer shader performance a decent margin above that of the X850XT PE.
Yet with only 8 Raster Ops (ROPs), its pixel throughput is only very marginally better than current mainstream PC graphics cards and well below the X850XT PE. The GF 6600GT, in fact, can offer stencil fillrate comparable to the XBox GPU, and nVidia's high end cards are nearly twice as powerful in that respect.
With Microsoft pushing High Definition TV's so heavily it seems counterproductive to design the graphics core for ~4000 MP/s fill rate.
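
That ~4000 MP/s figure follows directly from ROP count times core clock; one pixel per ROP per cycle. A quick Python check, with the X850XT PE (16 ROPs @ 540MHz) for comparison:

```python
def pixel_fillrate_mp_s(rops, core_clock_mhz):
    # Peak pixel fill rate: one pixel written per ROP per clock.
    return rops * core_clock_mhz

xbox360_gpu = pixel_fillrate_mp_s(8, 500)   # -> 4000 MP/s, the figure above
x850xt_pe   = pixel_fillrate_mp_s(16, 540)  # -> 8640 MP/s, more than double
```

More than double the raw pixel throughput in a current PC card is exactly why the 8-ROP choice looks so odd for a console pitched on HiDef output.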

I'm betting that the HiDef angle is primarily marketing, and the actual games, while HiDef capable, probably won't look much better than current non-HiDef XBox visuals. Most Xbox 360 games will probably still be aimed primarily at those without HiDefinition capable TVs.

Bizarre; in some respects it looks quite nice for a late '05 launch… in others it seems decidedly weak for a console that should be expected to last 5-6 yrs.
Long term, we shall see how it works out for them.

An interesting route to take. It certainly won't be very programmer friendly, and I wonder what the underlying OS is.
The Xbox used a Windows CE derivative on its Coppermine Celeron/PIII derivative; with the Xbox360 going PPC, it certainly won't be any variant of Windows.

With Sony's Cell processor in the Playstation 3 being a massively parallel processor, with many small, relatively fixed purpose cores buffered by one larger, more powerful core, it'll be heading the route of many, many threads, with each thread/core relatively limited in its capabilities and comparatively slow. The PS3 will certainly be an extremely difficult console to program for, get good performance out of, and optimize code for. Still, under the right conditions it could potentially be vastly superior to the XBox.
Unfortunately I don't see the Cell doing well; ironically, it's a little too good in some respects… it's too far ahead of its time. With the Cell built on a 90nm process, each of its many cores has had to be dumbed down and stripped of most of its general purpose nature in order to get a reasonably cost effective die size.
Too much, too soon.
Long term, I think Sony has the right idea… but not until lithography tools can handle a 32nm process at the very least. At such a process size the massively parallel route of the Cell may work out very nicely, as each individual core could be similar or close in complexity to modern single-threaded processors, and wouldn't need to be so excessively dumbed down as they are in the Cell.
(The Cell implemented in the PS3 appears to be 1x 3.2GHz PPC + 7x SPE @ 3.2GHz)
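
As a loose analogy for that arrangement (plain Python, nothing like actual Cell code; real SPE kernels are hand-written SIMD programs running out of their own local store), the 1 PPE + 7 SPE split amounts to a general-purpose coordinator farming stream chunks out to fixed-purpose workers:

```python
from concurrent.futures import ThreadPoolExecutor

NUM_SPES = 7  # the PS3's Cell: 1 general-purpose PPE + 7 SPEs

def spe_kernel(chunk):
    # Stand-in for a small fixed-purpose SPE program: a streaming
    # sum-of-squares over the chunk of data handed to this "SPE".
    return sum(x * x for x in chunk)

def ppe_dispatch(data):
    # The PPE splits the stream into per-SPE chunks, farms them out,
    # and combines the partial results.
    n = max(1, len(data) // NUM_SPES)
    chunks = [data[i:i + n] for i in range(0, len(data), n)]
    with ThreadPoolExecutor(max_workers=NUM_SPES) as pool:
        return sum(pool.map(spe_kernel, chunks))
```

It works beautifully when the problem is a clean data-parallel stream like this one; the difficulty I'm describing above is that most game code isn't, and forcing it into that shape is the programmer's burden.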

Nintendo's Revolution, of course, we know little about beyond the fact that its processor is some sort of PPC derivative, paired with ATi graphics.

May 15, 2005 Posted by | Console Examination

Bullet absorbing spider webs…

The secret to the strength of dragline silk, which probably could absorb the impact of a bullet if it could be woven into clothing, is that its amino acids join into tight crystals, which make the silk's protein fibers stiff and strong.

Damn… now THAT is something humans need to learn how to harness.

(From an article on spider silk, or more specifically that of the Black Widow Spider… http://animal.discovery.com/news/briefs/20050502/spider.html)

This post brought to you by the internet, always an amazing resource for fascinating information. 🙂

May 3, 2005 Posted by | History/Nature