31 July 2009

War of Graphics [ETC]


http://www.overclock3d.net/gfx/articles/2008/05/30141328120s.jpg

I've been asked pretty often recently which company makes the BEST graphics cards. Fanboys are getting angry at their rivals, and users get confused when they try to find a decent suggestion. Well, I'm here to make things clear.

It has been a decade of debate. Should I go for ATi? Should I get nVidia? "Shut up, your nVidia sucks my balls!" And so on. I guess that can get pretty annoying eventually.

As we know, nVidia always comes out with higher-priced graphics cards that promise high performance. Then ATi came up with some very interesting advertising, promising even higher-than-nVidia performance at a much lower price. So yes, as you would expect, people started going for ATi. I mean, who would want to spend $500 on a graphics card when you can get the same thing from ATi for a $200 tribute?

Then people started criticizing nVidia for being so money-hungry. They started saying stuff like "Ooooh look, you get GDDR5 memory on ATi cards, while nVidia's top product has only GDDR3!!! And it's fucking expensive!!! Fuck you nVidia!!!!"

Stop it.

http://www.firingsquad.com/hardware/nvidia_geforce_gtx_295_early_performance/images/01.jpg
nVidia's top model - GeForce GTX295

= The GDDR"X" =

For those who don't know anything about either side: the two use DIFFERENT memory systems. You CANNOT compare the GDDR5 number ATi publishes with the GDDR3 number from nVidia on its own. The cards use different bus widths (ATi pairs a narrower bus with faster memory; nVidia pairs a wider bus with slower memory), but the effective bandwidth ends up in the same ballpark. The 5 and the 3 are mostly marketing, there to make the products look interesting and vacuum up customers!
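
A rough way to see this: peak memory bandwidth is bus width times effective data rate, and the GDDR generation mostly changes how many transfers happen per memory clock. Here's a minimal Python sketch; the bus widths and clocks below are round illustrative numbers I'm assuming for a "narrow bus + GDDR5" card and a "wide bus + GDDR3" card, not official specs from either vendor.

```python
# Peak memory bandwidth = (bus width in bytes) * (effective data rate in GT/s).
# GDDR3 transfers data twice per memory clock; GDDR5 effectively four times.

def bandwidth_gb_s(bus_width_bits, memory_clock_mhz, transfers_per_clock):
    """Theoretical peak memory bandwidth in GB/s."""
    data_rate_gt_s = memory_clock_mhz * transfers_per_clock / 1000.0
    return (bus_width_bits / 8) * data_rate_gt_s

# Assumed, illustrative configurations:
narrow_gddr5 = bandwidth_gb_s(256, 900, 4)   # ~115 GB/s
wide_gddr3   = bandwidth_gb_s(512, 1100, 2)  # ~141 GB/s

print(f"narrow bus + GDDR5: {narrow_gddr5:.1f} GB/s")
print(f"wide bus + GDDR3:   {wide_gddr3:.1f} GB/s")
```

Different routes, same ballpark: the big "5" on the box tells you nothing until you multiply it against the bus width.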

= The "CORE CLOCK" =

Again, they use DIFFERENT architectures. Different, you get it? Stop complaining about the lower core clock printed on nVidia's boxes and judge the card only when you run it! If nVidia were really that much slower than ATi, how come the two deliver roughly equal fps?
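
If you want a number that's a bit less misleading than core clock alone, you can estimate theoretical shader throughput from shader count, shader clock and operations issued per clock. The two configurations below are rough assumptions for illustration, not exact retail specs, and even this paper figure still doesn't predict real-world fps.

```python
# Rough theoretical shader throughput in GFLOPS:
#   shader count * shader clock (GHz) * ops issued per shader per clock.
# Architectures differ wildly in shader count, shader clock and ops per clock,
# which is why comparing the core clock printed on the box is meaningless.

def theoretical_gflops(shader_count, shader_clock_mhz, ops_per_clock):
    return shader_count * (shader_clock_mhz / 1000.0) * ops_per_clock

few_fast_shaders  = theoretical_gflops(128, 1836, 3)  # ~705 GFLOPS (assumed config)
many_slow_shaders = theoretical_gflops(800, 850, 2)   # ~1360 GFLOPS (assumed config)

print(few_fast_shaders, many_slow_shaders)
```

Notice that the paper numbers can be lopsided while the benchmarks end up close, which is exactly why you judge the card by running it, not by reading the box.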

http://techtickerblog.com/wp-content/uploads/2009/04/ati-hd4890.jpg
ATi's top model - HD4890

= The "DirectX and GAMES" =

Look, there are games that run perfectly on ATi (like S.T.A.L.K.E.R.), and there are games that run best on nVidia. DirectX version support seems to be another major factor when it comes to choosing a card. ATi has been promoting its DirectX 10.1 compatibility for a while, while nVidia was left behind with plain DirectX 10.

Let me tell you one thing. IT DOESN'T MATTER. DirectX 10 and 10.1 share the same architecture. There might be a slight improvement in 10.1; however, it's nothing like the jump from DirectX 9 to 10, or from DirectX 10 to 11, which is expected soon.

So guys, if you're uncertain about your dream card, go look at the benchmark results. They give you the most accurate information for making a decision, because they show the real power of both nVidia and ATi actually doing the work. Stop posting questions on forums, because that way you'll just get swarmed by fanboys. And those fanboys are BIASED!

24 July 2009

XFX GTS250 512MB DDR3 Core Edition [VGA]

http://www.labamanta.lv/cache/product_images/big/20090403/xfx_gf_gts250_pci-e_512mb_ddr3_2xdvi_tv_gs-250x-ydfc.jpg

Alright! Let's start this very first post with my first hardware review. Today we're taking a close look at the XFX GeForce GTS250 512MB DDR3 Core Edition, which arrived fresh... last month, when I decided to upgrade my computer.

Even though I was upset at first, thinking I should have bought the 1GB version of this card instead, I found this one surprisingly great. The card can run almost every game without a drop in framerate, except for one game; you know which game I'm talking about: the spec killer, Crysis.

Now let's have a quick look at the GTS250 specification sheet.

- GeForce GTS250 specifications (original nVidia clocks) -

  • series: GeForce 200
  • chipset: G92+
  • interface: PCI-Express 2.0 x16
  • core clock: 738MHz
  • shader clock: 1836MHz
  • memory clock: 1100MHz
  • memory bandwidth: 70.4GB/sec (quick check below)
  • max power consumption: 150W
  • memory bus: 256-bit
  • DirectX compatible: DirectX 10
  • OpenGL compatible: OpenGL 2.1
  • shader version: 4.0
  • SLI: 2-way and 3-way SLI supported
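
As a quick sanity check, the bandwidth figure on that sheet follows straight from the memory clock and bus width listed above (GDDR3 moves data twice per memory clock):

```python
# Peak bandwidth from the spec sheet's own numbers.
memory_clock_mhz = 1100      # from the sheet
bus_width_bits = 256         # from the sheet
transfers_per_clock = 2      # GDDR3 is double data rate

bandwidth_gb_s = (bus_width_bits / 8) * memory_clock_mhz * transfers_per_clock / 1000.0
print(f"{bandwidth_gb_s:.1f} GB/sec")  # 70.4 GB/sec, matching the sheet
```
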
Surprisingly, this mid-range card has all the features of those higher-end video cards. It supports DirectX 10, which requires Windows Vista installed on your system, and it runs perfectly alongside my Core 2 Duo E4500. Who needs a $400 video card?

It produces a satisfying framerate of around 40-60fps in almost every game I tried, with Vsync enabled. At 30fps you're already playing smoothly, so there's no need to explain what 60fps feels like. Someone said, "oh, my GTX295 takes me to 120+fps, dude!" But does that really matter? You can hardly tell the difference between 60fps and 120fps while you play. Moreover, many games these days lock their maximum framerate at 60, so why would you need another 60?
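
To put some numbers on that: what you actually feel is frame time, and the jump from 60fps to 120fps only shaves a few milliseconds off each frame. A tiny sketch:

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):4.1f} ms per frame")

# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms:
# going from 60 to 120 saves about 8 ms per frame, a far smaller jump than
# 30 to 60, and with Vsync on a 60Hz monitor you never see it anyway.
```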

Alright, if someone is going to tell me they want to play Crysis at maximum settings without any drop in framerate, then I can't help you. Crysis is actually a very good game which, however, is hungry for your money. It will laugh at every graphics card thrown at it unless you have two GTX295s running in Quad-SLI. Whatever. I'm actually tired of people caught up in the hype. Many are wasting their money upgrading their computers JUST to play Crysis. I'm not talking about those who love overclocking as a hobby. But seriously, if you want to upgrade your computer just to play one game, think twice.

That said, this card is recommended for gaming. If you're looking for a card built for professional-grade output rather than games, go buy something from the Quadro series instead.

http://techwiki.hardwarecanucks.com/productimages/100590_XFX_GTS_250_GS250XYDFC2.jpg
http://i41.tinypic.com/mx29e9.jpg

Now let's look at the other specifications. The card is 9 inches long and weighs around 2 or 3 pounds. Don't worry, it's not going to tear your motherboard down. As we know, XFX usually uses the same standard heatsink nVidia does; this one, however, looks a bit different from the rest of the series. The fan is still located on the right side, but this time without an XFX sticker. The design looks pretty nice as always. The front side of the card is fully covered by the plastic shell you see, while the back is left open. One little complaint would be the SLI connector, which is left open as well; it could become a problem down the road if you let that part get dusty, you know.

The card has two DVI outputs and one HDTV output, with the adapter included in the box. It also requires at least a 450W power supply with one 6-pin connector.

Here are some of my benchmark results.
  • Resident Evil 5 Benchmark - maximum settings, C16xQ anti-aliasing, Vsync enabled
    result: 50fps on average; 60-70 during cutscenes, 40-60 during gameplay
  • Street Fighter IV Benchmark - maximum settings, no anti-aliasing, Vsync enabled
    result: 60fps, always :)
  • Mass Effect (FRAPS) - maximum settings, film grain disabled, anisotropic filtering on
    result: 75fps on average, with slight drops due to CPU usage

No, I'm not including 3DMark Vantage, because that program is another spec killer. If you just want to play games, entertain yourself, and be happy with it, forget all those professional benchmark programs; they're for higher-end computers.