The Amazing Spider-Man game is absolutely great … but it seems someone didn’t spend enough time on quality assurance after the game was released. It stutters on AMD graphics cards with recent drivers. I’m not entirely sure whether it’s actually AMD’s fault for not adding an optimized profile for the game, but you can easily fix it by fooling the game into thinking you have an Nvidia graphics card! Check out detailed instructions here: http://steamcommunity.com/app/212580/discussions/0/828925849355197850/#c828925849477117777
The problem with this workaround is that each time you have to delete the fake DLL file from the game folder before running the launcher and add it back afterwards, so I hope AMD or Beenox is going to fix this in a driver update or patch!
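The delete-and-restore dance could be scripted instead of done by hand. Here is a rough Python sketch; the DLL name and the launcher command are placeholders I made up, not the actual ones from the forum thread:

```python
import os
import shutil
import subprocess

FAKE_DLL = "d3d9.dll"        # hypothetical name of the vendor-spoofing DLL
BACKUP = FAKE_DLL + ".bak"

def run_with_spoof_dll(game_dir, launcher_cmd=None):
    """Temporarily hide the fake DLL so the launcher starts cleanly,
    then restore it so the game itself sees an 'Nvidia' card."""
    dll = os.path.join(game_dir, FAKE_DLL)
    bak = os.path.join(game_dir, BACKUP)
    shutil.move(dll, bak)                 # hide the DLL from the launcher
    try:
        if launcher_cmd:
            subprocess.run(launcher_cmd, cwd=game_dir)  # run the launcher
    finally:
        shutil.move(bak, dll)             # put the DLL back afterwards

# Usage (paths are examples only):
# run_with_spoof_dll(r"C:\Games\SpiderMan", ["Launcher.exe"])
```

One script to run instead of two manual file operations per game session.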
Also, the developers didn’t add any proper anti-aliasing, or it simply fails to work, especially in indoor parts of the game. On a console that’s not so important since you sit far away from your TV set, but PC monitors are usually relatively close to your eyes. Turning on morphological anti-aliasing (http://sites.amd.com/us/game/technology/Pages/morphological-aa.aspx) helps a bit, but not much. I guess you can really fix it by using down-sampling anti-aliasing tools like this SSAA-Tool: http://www.tommti-systems.de/start.html
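The idea behind such down-sampling (super-sampling) tools is simple: render the image at a higher resolution than your screen, then average it back down, which smooths out jagged edges. A toy sketch in plain Python with grayscale pixel values and a 2× factor, just to show the principle:

```python
def downsample_2x(img):
    """Average each 2x2 block of a high-resolution image (a list of
    rows of grayscale values) down to one pixel -- the core idea of
    super-sampling anti-aliasing."""
    out = []
    for y in range(0, len(img), 2):
        row = []
        for x in range(0, len(img[0]), 2):
            avg = (img[y][x] + img[y][x + 1]
                   + img[y + 1][x] + img[y + 1][x + 1]) / 4.0
            row.append(avg)
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution ...
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]

# ... becomes a softened edge at the target resolution
print(downsample_2x(hi_res))   # [[0.0, 255.0], [127.5, 255.0]]
```

The 127.5 value is the smoothed transition pixel that plain rendering at native resolution would never produce.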
First of all I want to mention that I’m a huge AMD fan! My last Intel CPU was a Pentium 120 MHz. Later I had Athlons of various generations, and now a Phenom II 920 running @ 3.2 GHz. My last Nvidia graphics card was a GeForce 6800 that I killed with overclocking. Later I had ATI cards like the X850 and 4850, and now an AMD 6870 with 1 GB. I love my AMD system as it really gives a lot of bang for the buck, meaning it’s good value for the money.
The video above shows Rage with the latest AMD 11.10 Catalyst WHQL drivers. This is now the 4th driver AMD has released that is supposed to fix problems with id Software’s Rage. The previous version, 11.9, had texture-streaming problems that AMD later fixed in the 11.10 preview 2 & 3 drivers. The first preview driver made Rage run faster, but it had the texture popping shown in the video below. The 11.10 final version totally broke it, and now we have Smurf-like blue textures!
Nvidia had this problem too (just to be fair), but they managed to fix it in a driver update. AMD totally failed to do that and even made it worse with the blue texture bug! It even looks to me like this might be some kind of revenge by AMD, because id Software and Bethesda blamed AMD for the initial texture-popping problem. Maybe it’s even because id Software cooperated with Intel on making Rage run on their Sandy Bridge CPU-integrated HD 3000 GPU. AMD looks bad here… and they should change that! It should be like in this commercial: left and right should be the same image. Currently, with AMD vs. Intel/Nvidia on Rage, it’s not the same picture, even if the price is lower on the AMD side.
AMD, you don’t want me to like the “AMD SUCKS” page on Facebook, do you? It’s only one click away at http://www.facebook.com/pages/AMD-SUCKS/301628902631
If you want to solve this problem manually there is a community-driven solution for the blue texture problem: http://forums.bethsoft.com/index.php?/topic/1249144-ati-1110-final-drivers-fix/
Edit: The Catalyst 11.11 performance driver finally fixes all the problems!
Mafia II is an awesome game, there is no doubt about that, but it supports only Nvidia hardware for the physics. That sucks big time! OpenCL is a hardware acceleration standard supported by both Nvidia and ATI. Awesome… but 2K Games didn’t implement it. Instead they went for Nvidia-only PhysX support. I’m not sure how much Nvidia paid them, but that’s really stupid. Mafia II even comes bundled with 4xx Nvidia cards, so it’s clear there is some kind of deal: http://news.bigdownload.com/2010/08/28/get-mafia-ii-free-with-new-nvidia-graphics-card-purchase/
1. Such “It’s meant to be played with Nvidia” logos didn’t make me more loyal to Nvidia when I had Nvidia graphics cards.
2. Now that I’ve switched to ATI (and won’t ever go back to Nvidia, BTW), these logos that come up when starting the game really piss me off.
3. OpenGL/OpenCL, DirectX/DirectCompute are standards that are supposed to make life easier for gamers, game publishers and hardware manufacturers. Nvidia-only-PhysX is the opposite.
On the right you see the experience for ATI owners with PhysX off (running it on the CPU results in ~10 fps, depending on the machine).
So in general, Mafia II is really worth playing even for ATI users, but game publishers should stop putting those “It’s meant to be played with Nvidia” logos in their games, because they might just annoy too many people who have “the other” graphics card. Review@Gamespot:
More about the “It’s meant to be played” program: http://www.nzone.com/object/nzone_twimtbp_gameslist.html
So this is OpenCL … “the way to go” FTW:
Other comments on this topic: http://ve3d.ign.com/articles/news/53989/Mafia-II-PC-PhysX-Trailer
At work I have a 24″ screen, and I’m considering getting one for home, maybe a HannsG or Samsung. The problem: 1920×1080 is ~60% more pixels than 1280×1024 (what I have now), and this higher resolution needs more VRAM. I have only 512 MB, and running that resolution with my current graphics card would result in lower fps. So soon it will be time for a new graphics card (I have an XFX 4830 right now). What would fit my needs is either an overclocked 5770 like the PowerColor HD5770 PCS+ VORTEX or a standard 5850… but somehow I “smell” ATI releasing the 6000 series soon. Nvidia’s 4xx series is cheaper and more powerful than comparable ATI cards, so ATI/AMD has to come up with something new. There is some speculation going on at Tom’s Hardware:
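A quick back-of-the-envelope check on that pixel math:

```python
# Pixel counts behind the "~60% more" estimate
old = 1280 * 1024          # 1,310,720 pixels
new = 1920 * 1080          # 2,073,600 pixels
print(f"{new / old - 1:.0%} more pixels")      # -> 58% more pixels

# Rough cost of a single 32-bit frame buffer at the new resolution
# (4 bytes per pixel; actual VRAM use is far higher with textures,
# multiple buffers, and anti-aliasing)
print(f"{new * 4 / 2**20:.1f} MiB per frame")  # -> 7.9 MiB per frame
```

So ~58% more pixels per frame, which is where the pressure on a 512 MB card comes from once textures and render targets scale up too.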
Let’s see how fast this is going to happen… and yea, with 1GB VRAM I will be able to play GTA IV finally with 100% quality 😉
EDIT (Oct 3rd 2010) : Now the AMD Radeon HD 6770 and 6750 Video Card Specs Leaked!
EDIT (Oct 22nd 2010): The new 6000 generation is finally out the door!
This is what they look like (the picture at the top is a 5770, not a 6xxx, BTW)
Finally I ordered the parts for my new PC! A very reasonable configuration – best value IMHO – based on the AMD Dragon platform. I’m going to put them in the case of an old Medion P4 that I got from a colleague. Will post more soon!
Info about the AMD Dragon platform: http://game.amd.com/us-en/landings/dragon.aspx
BTW: I ordered at HPM Computer, a recommendation from the PCGH.de forum. Info: http://www.hpm-computer.de/
I want to get an AMD Phenom II system soon, so I checked some graphics card benchmarks to find a card for the new machine that offers both performance and value. It seems the ATI Radeon HD 4830 is a good choice in the 100€ segment (http://www.hpm-computer.de/product_info.php?info=p9565_Sapphire-Radeon-HD-4830-Dual-Slot-512MB.html).
Usually when you check graphics card benchmarks you don’t see any old hardware. I spotted a very nice benchmark at pcgh.de where they tested six generations of graphics cards and… voilà! The one I currently have (ATI X850) is still on the list. LOL 😀 I can really confirm those frame rates in Call of Duty 4: Modern Warfare! It’s great! COD4 has a very efficient engine that runs perfectly on older ATI cards like the X850.
Full article (German): http://www.pcgameshardware.de/aid,674689/Test/Benchmark/Sechs_Grafikkarten-Generationen_im_Benchmark-Test/?page=1
There are many great things that can make your PC faster… but wait. There is already a second powerful processor in your machine that you might not be aware of: the GPU on your graphics card. ATI & Nvidia are heading in the right direction, and there are so many things the GPU can do. I’m encoding quite a lot of videos lately. Just check the chart above: lower is better, and notice the price of the AMD platform. This is what I call value. Physics or artificial intelligence in games: why do they have to use one of the cores of a multi-core CPU? The GPU can provide the computing power for that.
Q: Why are reviewers seeing so little GPU processing during transcoding?
A: The ATI Video Converter uses the GPU for only a portion of the video encoding. Specifically, the GPU currently offloads only the motion estimation portion of the encoding pipeline which is the most compute intensive part of the encoder. The GPU is particularly well-suited for this task. Given that a fixed workload is being offloaded to the GPU the load on the GPU may be relatively low or high based on the specific model of GPU.
Seems that ATI’s Stream got a bit of a bad reputation because it uses only a little GPU processing during transcoding, but I guess those problems will be solved soon:
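The “motion estimation” step AMD mentions above is, at its core, a brute-force block-matching search: for each block of the current frame, try many candidate positions in the reference frame and keep the one with the lowest sum of absolute differences (SAD). Every candidate is independent, which is exactly why it maps so well onto a GPU. A toy sketch in Python on tiny 4×4 frames (nothing like a real encoder, just the principle):

```python
def sad(block_a, block_b):
    """Sum of absolute differences -- the inner loop of motion estimation."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def best_motion_vector(ref, cur, bx, by, size=2, search=1):
    """Brute-force search: where in the reference frame does the block
    at (bx, by) of the current frame match best? Each candidate SAD is
    independent, so a GPU can evaluate them all in parallel."""
    def block(img, x, y):
        return [row[x:x + size] for row in img[y:y + size]]

    target = block(cur, bx, by)
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = bx + dx, by + dy
            if 0 <= x and 0 <= y and x + size <= len(ref[0]) and y + size <= len(ref):
                cost = sad(block(ref, x, y), target)
                if best is None or cost < best[0]:
                    best = (cost, (dx, dy))
    return best[1]

ref = [[0, 0, 0, 0],     # reference frame: bright block at (1, 1)
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
cur = [[9, 9, 0, 0],     # current frame: the block has moved to (0, 0)
       [9, 9, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]

# The block at (0, 0) of the current frame came from (1, 1) in the reference
print(best_motion_vector(ref, cur, 0, 0))   # -> (1, 1)
```

Scale this up to 16×16 macroblocks over a ±32-pixel search window on full HD frames, and it becomes clear why AMD calls it the most compute-intensive part of the encoder.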