How did you cram a full-size GPU into such a small case? PCIe risers?
Thanks for the answer at least. :)
Can you control colors in the BIOS? Some GNU/Linux fans like me would like to know about this.
I know this is late, but an FX core really is about 1/2 of an Intel core in practice. Of course reality is more complex, with Hyper-Threading on Intel and the shared-module integer cores on AMD, but yeah.
Ryzen is looking great though; sadly some stupid kinks seem to hurt gaming performance, even though single-core performance is still great...
I guess somebody now needs to test it with Arma3.
I know that the PS4 and Xbone have APUs, but they're the most powerful APUs AMD has made. The PS4's is an 8-core 1.6 GHz chip with a slightly downgraded integrated R9 270 (basically a plain R9 265), but the GPU is heavily bottlenecked by the CPU, making it perform more like a GTX 750 Ti. The Xbone's runs at 1.75 GHz per core, with a GPU somewhere between an R7 250 and an R7 260X. They're weak, but not THAT weak.
OK, I messed up in my last statement. But the fact that the PS4 doesn't run DX 11 is still true. Look it up.
It says this: “We use DX11.1, there are some optimizations in it (constant buffer offsets, dynamic buffers as SRVs) that we got in to the API that improves CPU performance in our rendering when one runs with DX11.1. This will be in BF4.”
"This means that PC and Xbox One versions of Battlefield 4 would be more optimized than PlayStation 4."
Why would the DX 11.1 version be any more optimized than the PS4 one? Because the PS4 doesn't use DX 11 with BF4. Plus everyone in the thread was bashing the PS4 BECAUSE it DIDN'T have DX 11. That still proves my point. Plus even DICE and OP mentioned the PS4 had no DX 11.
Plus the DX 11 on Xbox One, and PSSL and OpenGL on PS4, are different from the DX 11 and OpenGL on PC. They're more like the simplified version of DX 12 that's coming for GPUs that aren't the newest ones out there, like Nvidia's 1st-generation Maxwell, Kepler, and below, or many of AMD's GCN GPUs (I think). They still push to the metal of the consoles' hardware. DX 11 and OpenGL 4 don't push to the metal of PC hardware, on the other hand. But that's why DX 12 and Vulkan exist. ;)
Except for some games. Look up Digital Foundry's "Can a budget gaming PC match console?" on YouTube. You'll see that while the i3 (featured later in the video) can match (and at times beat) the consoles, the Pentium chokes on a few games, like Metal Gear Solid 5, AC: Unity, Ryse, and a few others, and the PS4 runs those better. Why? The Pentium has 2 threads, while the i3 has 4 and the PS4 has 8, so with so few threads the Pentium chokes on a few games. The i3 hits a perfect budget balance, though, because it has 4 somewhat strong threads, not too many weak threads or too few strong ones. Not saying he picked a terrible CPU; it'll fit his purposes (playing MC and Kerbal) perfectly. But it ain't the console killer or "beast" everyone keeps making it out to be.
Proof? DICE doesn't own DX 11, and DX 11 has never been ported anywhere outside the Xbone and PC. -_-
Don't lie to me about what you said, you just said this in an earlier comment:
"Nope, Battlefield 4 on PS4 used DX11 when it was meant to use PSSL."
Yes, the Nvidia GPU in the PS3 was powerful in its own right; I was trying to say it was weaker than the 360's. Sorry for phrasing it so badly, as if it were weak on its own rather than just weak compared to the 360's GPU.
Anyways, I said you were misinformed because PlayStation systems don't use Direct X of any kind; Direct X is MS-only. The reason people mentioned DX 11 when it comes to the PS4 is that its GPU is DX 11 capable, not that the PS4 actually uses DX 11. The Orbis OS (PS4's OS) is based on FreeBSD 9, a Unix-like OS. No Unix-like or UNIX OS in the history of the world has ever used Direct X, unless it used Wine, a compatibility layer that lets Unix-like and UNIX OSes (Linux, FreeBSD, Mac OS X, etc.) pretend to be Windows so they can run Windows programs. The PS4 either uses a form of OpenGL or its own APIs. No Direct X is involved.
No, they don't. Historically, devs focused more on consoles. Remember the Watch_Dogs and Witcher 3 downgrade situations, where max graphics on the PC version were downgraded to look more like console graphics (yes, Witcher 3's higher settings are still better than console, but not by much)? And remember games like NFS: Rivals and AC: Unity that were either capped at 30 FPS (NFS: Rivals) or just ran badly in general (AC: Unity; heck, high-end systems have trouble getting 60 FPS in that stupid game)? Consoles traditionally got the better version because they were easier to develop for; heck, an equivalent gaming PC in 2005 or '06, when the PS3 and Xbox 360 came out, was way more expensive, and over time it was beaten by the optimization of the 360 and PS3. It just so happens that this time around an i3 + 750 Ti beats the PS4, without even using Vulkan, just plain old DX 11.
As for your ramblings about the APIs on the PS4: first off, the PS4 DOESN'T use DX 11. Direct X is ONLY for Microsoft's platforms, which are Windows and Xbox. It's not on Wii U, PS3, PS4, Linux, Mac, or any other NON-Microsoft platform. That itself shows how misinformed you are. Also, hasn't Naughty Dog said that Last of Us Remastered doesn't use half of the CPU in the PS4? One other point: Cell also had to process part of the graphics for the game, because the Nvidia GPU in the PS3 was too weak to handle the game on its own.
Edit: Didn't know Crysis 3's AI was bad, but well AIs are usually terrible anyways. :P
Multiplatform devs aren't targeting the 750 Ti either; they're targeting the Xbox One due to its popularity in the States and because it's the lowest common denominator of gaming performance right now. And what makes you think the PS4 has some "hidden power"? This isn't like previous generations, where the consoles were PowerPC-based or built on some other weird architecture; it's based on x86_64, which is going to get its full potential reached by stuff like Vulkan and DX 12 in the future. And the consoles are already being programmed to the metal, like those APIs will allow on PCs. Yet in almost every game, a PS4 (which is being pushed to the metal) is being outperformed or matched by the 750 Ti + i3, and those advanced APIs haven't even launched yet; PCs are still usually using DX 11, DX 9, OpenGL 3, or OpenGL 4. Plus I think The Order: 1886 and Crysis 3 look very similar texture-wise. Yes, The Order has slightly better character graphics and slightly better effects, but Crysis 3 makes up for it with bigger environments, better AI, better water effects (I don't even know if The Order has water), slightly better shadows, and better vegetation. Yes, on paper the PS4's GPU is better, with 256-bit memory, more cores, more TFLOPS, but it is HEAVILY BOTTLENECKED by the mobile low-power 8-core 1.6 GHz Jaguar CPU. How many times do I need to tell you that?
I know; in fact I hate games like Unity, but I wanted to mention that the PS4 isn't so powerful at times, even if the game was a broken mess. As for The Order, it still ran at a lower resolution and at 24 or 30 FPS. Plus, in Crysis 3 the 750 Ti can achieve 23-25 FPS at 1080p on Ultra, and probably 30 FPS at 900p, though the 900p number is a bit of an educated guess; I have yet to get a 750 Ti and Crysis 3 myself to prove it. And Crysis 3 is just as pretty (if not prettier) than The Order: 1886. That said, Crysis 3 was also bigger than The Order: 1886. My point is, the 750 Ti can rival the PS4, especially with an Athlon X4 860K (which is basically an A10-7850K without the integrated graphics), an FX-6300, or an i3.
"The only reason the 750 Ti beat the PS4 is because developers are lazy." If that were true, then why does the 750 Ti, which is way worse spec-wise, perform as well as (or better than) the PS4? The PS4's GPU is more like an R7 265 or 270, which are superior to the 750 Ti, yet the 750 Ti performs better than the PS4. You also have to know that the IPC (instructions per clock) in the PS4 is as bad as Intel Atoms and 8-core ARM CPUs, because it's based on a mobile CPU series called Jaguar. Its IPC is probably worse than the FX chips', too. It's a bad CPU. It's more like the PS4 is being bottlenecked by its horrible CPU. Heck, that's kinda why the PS4 ran AC: Unity horribly.
The card will be ok then.
Don't expect to do a lot of gaming on the card, though. It's mainly good for getting you an actual GPU to use on Linux, not really for gaming. Just warning you about that. It'll probably do about as well as integrated Intel graphics usually does, but in most cases that's OK enough.
Also, get the proprietary drivers; the OSS driver is pretty terrible.
It's more than that; in fact, there are quite a few games where the Pentium suffered because it's dual-core: Ryse: Son of Rome, Battlefield 4 multiplayer, Crysis 3 (slightly), Assassin's Creed Unity, Far Cry 4, and Dragon Age: Inquisition. An i3 (it's dual-core, but it has HT) or an Athlon 860K would've done better.
Talk about censorship.
Plus, can't an Athlon 860K compete against the PS4's CPU? It has better multi-core performance than the i3. And I've seen videos (especially Digital Foundry's comparison videos) where a 750 Ti + i3-4130 nearly always beat a PS4 on equivalent settings. Plus this 750 Ti is pre-overclocked, making the times the PS4 caught up moot.
I know the Pentium is terrible and overrated, though.
Man, that's cheap; hopefully that'll help you with your GPU situation. Just make sure you delete any proprietary AMD drivers and use the PPA to get the Nvidia drivers.
It's not free in the same way; only the Technical Preview is free, and the only other way to get it "free" is with an existing license of Win 7 or Win 8/8.1.
The EVGA one.
Maybe due to upgradability. :P
Well, that's stupid.
Go into a terminal and type these when you get an Nvidia GPU:
$ sudo add-apt-repository ppa:xorg-edgers/ppa -y
$ sudo apt-get update
$ sudo apt-get install nvidia-current
There, that's how you get drivers for Nvidia GPUs. Be sure to purge any proprietary AMD drivers as well. Plus, what happened that made you get a ticket? :/
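In case it helps, here's a rough sketch of how I'd purge the old AMD driver first; the package names are from memory for Ubuntu 14.04-era repos, so double-check them on your system before running anything:

```shell
# Remove AMD's proprietary fglrx driver (if it's installed) before
# installing the Nvidia one, so the two don't fight over X config.
# Check first: `dpkg -l | grep fglrx` shows what's actually there.
sudo apt-get purge "fglrx*"
sudo apt-get autoremove
# Let X regenerate a clean config on the next boot (only if fglrx wrote one).
sudo rm -f /etc/X11/xorg.conf
```

Then reboot and run the three commands above to pull in the Nvidia driver from the PPA.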
I know, but for short term. Getting an Intel will save me some cost when I probably want to upgrade to a higher end CPU someday. Plus older titles. :P
True. I'd say an R7 250 would be more accurate, though.
I just use VMs to goof around in alternate OSes that aren't Ubuntu or Windows. But I'll keep this in mind.
Well, there was one video where someone got over 50 FPS on high (I think) with an overclocked Pentium G3258 and a 750 Ti FTW. That said, I'm probably going with the i3 idea: I can upgrade to a Broadwell i5 a while into the future, it'll prevent bottlenecking in PS2 (when I get Windows or use Wine; even my Ivy Bridge dual-core laptop i5 bottlenecks its iGPU to 40 FPS), and I can hold out until I can get a GTX 1060 or whatever the Pascal equivalent to a GTX 960 will be. I don't have great needs, and little-better-than-PS4 performance will be fine in most cases for me; my main reason for PC gaming is the freedom and the unique titles. Might reconsider if the GTX 960 with an 860K can achieve 1080p@60FPS on PS4-ish settings, though.
They're terrible at gaming, and the proprietary drivers, while better for games, are unstable as heck. Plus AMD has a bad OpenGL implementation. That's why I'm going Nvidia.
My intentions are gaming, but I might do some VMs and programming as well, which are more reasons for me to get an i3. Plus, less-threaded stuff like Planetside 2 and Minecraft.
Everybody, it seems like the Athlon idea is better, but I've got another question: will the Athlon bottleneck the GTX 960? If it does, I'll go the 750 Ti route due to the upgradability of Intel, and get a better GPU later, like when Pascal comes along or when I simply have enough cash. Plus, I'm not just going to be gaming. I might do some programming and stuff like that. :P
Edit: Plus, I looked up GTA V on an Athlon with a GTX 960, and it ran the game about the same as an overclocked 750 Ti with a Pentium.
EDIT2: Plus Minecraft. :P
Can't; I'll be doing Linux.
As I said, you can overclock the 750 Ti FTW to where it beats (or matches) the R7 265 and gets close to the R9 270 (though still a bit behind), and you get the bonus of cheaping out on the OS by picking Linux; with AMD you can't do that due to poor Linux support, not that I'm saying OP is going to run it. But if you can afford more for Windows, or you can afford the GTX 960, then go for the other options.
It's OK, as long as you hunt for a good deal on the GPU. If you find one for $125-$135, that's a pretty good deal compared to buying in the usual $150-$160 price range. If you can only get it in that terrible range, just drop the 750 Ti and find something better. Also try to find one like the FTW, so you can overclock it to death.
Also, the Pentium runs fewer games; don't forget that. Try playing some recent AAAs on a Pentium. I don't get everyone's love for that chip; even Athlons do better, at least in compatibility, if not FPS.
I've learned it the hard way too; I had Ubuntu on an E1-1200 AMD APU and it was terrible. TF2 was barely playable with either the proprietary or OSS drivers, which made me plan my new build with a 750 Ti instead. BTW, I recommend the 750 Ti for that build; it's cheap and should beat the next-gen consoles. The GT 730 is too weak, but if you can't afford the 750 Ti, it's still an OK GPU.
Wish AMD would get their act together, BTW.
Besides the integrated Radeon being terrible, Linux has bad AMD GPU support because AMD can't make a proper driver for Linux, while the bundled OSS AMD driver is pretty badly optimized. Nvidia's proprietary drivers are way better. You should try hunting for an Nvidia card; the GTX 750 and 750 Ti are great budget Nvidia GPUs. If you can't afford them, their GT cards (below GTX) are OK too, though don't expect much out of the GT series.
It said X4, but there was also a "II" at the end of "Athlon" for some reason; that was a typo.
Ubuntu can now use Netflix through Chrome.
I know this is very late to be telling you this, though....
I've been hearing that's only related to integrated graphics, not GPUs like the 980.