
  • Which one?

    Looking to maybe buy another PC...

    Should I buy a Widow WGMI-2X5820A Core i7 Gaming System, a SYX Crossfire X58 Gamer, or something you know of that's better and in the same price range?

    I would love something like this one, but I just can't go that high $$$-wise...

    BTW, the second one above says the vid cards are in Crossfire mode... what's the major difference?

  • #2
    Quoth VenomX
    BTW, the second one above says the vid cards are in Crossfire mode... what's the major difference?
    The only difference between SLI and Crossfire is the brand name. SLI is nVidia and Crossfire is ATI. They both work the same way: two video cards, each processing half the image going to your screen, giving you better performance in graphically demanding situations (like high-end gaming).
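    If it helps to picture it, here's a rough toy sketch in Python (threads standing in for the two cards, and a made-up render function; just the idea, not how the drivers actually do it):

        # Toy simulation of split-frame rendering: two "cards" (threads here)
        # each shade half of the frame, then the halves get stitched together.
        from concurrent.futures import ThreadPoolExecutor

        WIDTH, HEIGHT = 8, 8  # a tiny "frame" so the output stays readable

        def render_rows(start, stop):
            # Stand-in for real shading work: every pixel gets a gradient value.
            return [[(x + y) % 10 for x in range(WIDTH)] for y in range(start, stop)]

        with ThreadPoolExecutor(max_workers=2) as pool:
            top = pool.submit(render_rows, 0, HEIGHT // 2)          # "card 0"
            bottom = pool.submit(render_rows, HEIGHT // 2, HEIGHT)  # "card 1"
            frame = top.result() + bottom.result()  # composite the two halves

        for row in frame:
            print(row)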

    As to which one to buy, I'd prefer to go nVidia over ATI, but I'm biased.



    Eric the Grey
    In memory of Dena - Don't Drink and Drive



    • #3
      Quoth Eric the Grey
      As to which one to buy, I'd prefer to go nVidia over ATI, but I'm biased.
      You and a good chunk of the industry. Take a look at your games when they load. They'll usually have a "Best on <insert chipset>" label or something similar, which indicates what they're optimized for. It's usually NVidia.

      I'm in the NVidia camp myself. I went with ATi for a while, but kept having problems. Switched to NVidia and it's been smooth sailing since.
      I AM the evil bastard!
      A+ Certified IT Technician



      • #4
        Good grief.

        Guess the $1100 Core i7 8GB machine they have at Best Buy is a good deal then.



        • #5
          Quoth lordlundar
          You and a good chunk of the industry. Take a look at your games when they load. They'll usually have a "Best on <insert chipset>" label or something similar, which indicates what they're optimized for. It's usually NVidia.

          I'm in the NVidia camp myself. I went with ATi for a while, but kept having problems. Switched to NVidia and it's been smooth sailing since.
          Likewise. I had one Radeon card that refused to keep its settings on reboot. I would boot up to 800x600 every single time and have to change the resolution (with great difficulty, because the control panel was too big for me to find the "OK" button), and then put my icons all back where they belonged (I'm anal about my desktop icons being where I want them... mumble grumble moan bitch).

          Anyway, since then, I avoid ATI whenever possible. Luckily the one I have now hasn't given me any problems.

          My next computer upgrade will have the capability to do SLI.


          Eric the Grey
          In memory of Dena - Don't Drink and Drive



          • #6
            Quoth lordlundar
            Take a look at your games when they load. They'll usually have a "Best on <insert chipset>" label or something similar, which indicates what they're optimized for. It's usually NVidia.
            Doesn't that just mean NVIDIA paid them to put that on there? :P



            • #7
              Quoth Flying Grype
              Doesn't that just mean NVIDIA paid them to put that on there? :P
              It is a marketing deal, yes, but usually how it works is that the component maker approaches the game developer/publisher: the developer optimizes the game for their chipset and slaps the label on it, and in return gets a cut of the profits.

              So yeah, it's an advertising gimmick, but there is real optimization in place as well. It's kind of a deal-breaker if NVidia finds out that a game with their logo on it works better on an equivalent ATi chipset.
              I AM the evil bastard!
              A+ Certified IT Technician



              • #8
                You do have to remember that sometimes, official Nvidia demos have been hacked to work correctly on ATI cards of the same generation - and have run better there.

                This was particularly true of the GeForce FX demos (Dawn and Dusk). The GF-FX was a dog.



                • #9
                  Quoth Flying Grype
                  Doesn't that just mean NVIDIA paid them to put that on there? :P
                  I was always under the impression that Nvidia had extra command sets to do slightly funkier stuff, like HDR when it first came out.

                  It's been a while since I seriously compared the two manufacturers, but ATI always used to do better on pure DirectX tests, whereas benchmarking via gameplay generally meant Nvidia scored higher, as they handled the 'own brand' stuff better.

                  As to SLI vs. Crossfire, Crossfire used to be better as it didn't need any special setup; it just looked at the raw picture and chopped it up to split between the GPUs. No idea what the state of play is now, as I tend to get a low top-end card (260/280) and not care about the minuscule amount of eye candy I'm missing out on.
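                  Rough toy sketch of that difference as I understand it (Python, everything made up for illustration): SLI's usual mode back then was alternate-frame rendering, one whole frame per card, while the Crossfire behaviour described above chops each frame in two.

                      # Which card (0 or 1) handles what, in each mode.
                      def afr_schedule(num_frames):
                          # Alternate-frame rendering: whole frames alternate between cards.
                          return {frame: frame % 2 for frame in range(num_frames)}

                      def sfr_schedule(num_frames):
                          # Split-frame rendering: every frame is chopped in two, one half per card.
                          return {frame: {"top": 0, "bottom": 1} for frame in range(num_frames)}

                      print(afr_schedule(4))  # {0: 0, 1: 1, 2: 0, 3: 1}
                      print(sfr_schedule(2))  # {0: {'top': 0, 'bottom': 1}, 1: {'top': 0, 'bottom': 1}}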
                  Lady, people aren't chocolates. D'you know what they are mostly? Bastards. Bastard-coated bastards with bastard filling. Dr Cox - Scrubs



                  • #10
                    This is why I got into building my own rigs: prebuilt stuff always seems to be overpriced and never has quite what I want in it for the money I'm paying. I like the i7 setup you linked; I have one myself, although I run two Nvidia GTX 280s for video cards. I have three monitors, though, so unfortunately I can't run them in SLI. Like I said, I like the specs on that first link, but then I'm biased toward Intel and Nvidia as well.



                    • #11
                      There was a time when Nvidia automagically replaced shaders from various games with faster but less accurate ones. Needless to say, the hardware-review community and some of the game developers were Not Happy about that, since it artificially inflated the in-game scores while potentially giving the players - the paying customers - lower quality graphics.
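                      Purely illustrative sketch of that mechanism in Python (made-up names and shaders, not anyone's actual driver code): the driver fingerprints an incoming shader and, on a match, silently hands back a cheaper version.

                          import hashlib

                          # Hypothetical lookup table a driver might carry:
                          # fingerprint of a game's original shader -> cheaper replacement.
                          REPLACEMENTS = {}

                          def fingerprint(src):
                              return hashlib.sha256(src.encode()).hexdigest()

                          def register_replacement(original_src, fast_src):
                              REPLACEMENTS[fingerprint(original_src)] = fast_src

                          def compile_shader(src):
                              # A real driver would compile to GPU code; here we just
                              # pick which source text "wins" before compilation.
                              return REPLACEMENTS.get(fingerprint(src), src)

                          register_replacement(
                              "color = pow(base, vec3(2.2));",  # accurate gamma curve
                              "color = base * base;",           # cheaper approximation
                          )
                          print(compile_shader("color = pow(base, vec3(2.2));"))  # -> the cheap one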

                      For much the same reason, Futuremark has some extremely tough rules on what kinds of optimisation can be applied by graphics drivers to their benchmarks.

                      That's not to say that ATI have always been totally blameless, but they learned their lesson early on (around the DX8 era, I think) and have played pretty straight since. If you want over-aggressive optimisations from an ATI driver, you have to turn on the "Advanced Catalyst AI" option yourself.

