well...
I got an upgrade: an 8600 GT with 512 MB, 2 GB of RAM, and well...
On Win XP, high LOD, full anti-aliasing... 30+ fps...

First of all, this topic is in NO way meant to insult either Microsoft or Bungie.
And I'm sorry if I hurt anyone.
But I felt I had to.
Oh, and I wrote the quoted post below.
[quote]Before you go, please take a quick look at the following technical details of the Xbox, but I'll summarize it here: a 32-bit 733 MHz Pentium III processor, a graphics card almost identical to the NVIDIA GeForce 3, 64 MB of RAM, and so on.
If you already get my point, you can skip the technical stuff right now...
Quote:
--------------------------------------------------------------------------------
Technical specifications
CPU: 32-bit 733 MHz Pentium III Coppermine-based Mobile Celeron in Micro-PGA2 package. 180 nm process.
SSE floating point SIMD. 4 single-precision floating point numbers per clock cycle.
MMX integer SIMD.
133 MHz 64-bit GTL+ front side bus to GPU.
32 KB L1 cache. 128 KB on-die L2 "Advanced Transfer Cache".
Shared memory subsystem
64 MB DDR SDRAM at 200 MHz; 6.4 GB/s
Supplied by Hynix or Samsung depending on manufacture date and location.
Graphics processing unit (GPU) and system chipset: 233 MHz "NV2A" ASIC. Co-developed by Microsoft and NVIDIA.
4 pixel pipelines with 2 texture units each
932 megapixels/second (233 MHz x 4 pipelines), 1,864 megatexels/second (932 MP x 2 texture units) (peak)
115 million vertices/second, 125 million particles/second (peak)
Peak triangle performance: 29,125,000 32-pixel triangles/sec, raw or with 2 textures and lighting.
485,416 triangles per frame at 60 fps
970,833 triangles per frame at 30 fps
4 textures per pass, texture compression, full scene anti-aliasing (NV Quincunx, supersampling, multisampling)
Bilinear, trilinear, and anisotropic texture filtering
Similar to the GeForce 3 and GeForce 4 PC GPUs.
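The peak figures above all follow from the 233 MHz clock and 4 pixel pipelines; here's a quick sanity check of that arithmetic, sketched in Python (all numbers taken from the spec list, not measured):

```python
# Sanity-check the quoted NV2A peak numbers from the spec list above.
clock_hz = 233_000_000      # 233 MHz GPU clock
pipelines = 4               # pixel pipelines
texture_units = 2           # texture units per pipeline

megapixels = clock_hz * pipelines // 1_000_000   # 932 MP/s peak fill rate
megatexels = megapixels * texture_units          # 1864 MT/s peak texel rate

# "32-pixel triangles": divide peak pixel rate by 32 pixels per triangle.
tri_per_sec = clock_hz * pipelines // 32         # 29,125,000 triangles/sec

print(megapixels, megatexels, tri_per_sec)
print(tri_per_sec // 60, tri_per_sec // 30)      # per-frame budget at 60 / 30 fps
```

Run it and you get exactly the quoted 485,416 triangles per frame at 60 fps and 970,833 at 30 fps, so the list is internally consistent.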
Storage media
2x