Hey guys!
So I have a bit of a problem that I can't get my head around. The problem is as follows: when the next-gen consoles come out, the launch games won't look as good as they could, because the game makers haven't had the devkits long enough to design the games around them. But give it a year and the consoles will be well away with game quality, thanks to improved optimisation and, of course, the new architecture.
As I understand it, the new consoles will have 8-core AMD CPUs in them. Now, this is interesting to me because, on paper, that is nothing to shout about. The 8-core CPUs we have in PCs at the time of writing are either not as good as they are hyped up to be and/or are not true physical 8-core designs. But if my prediction is right, should I get an AMD 8-core anyway? My thinking is that, with the new, relatively weak 8-core chips in the consoles, game devs will start to design games that rely on multiple cores (so that the load is spread out) rather than on strong quad cores, which are best at single-core performance. Intel basically owns the market at the moment where performance is concerned, and that is down to the sheer single-core performance the i5s and i7s can bring to the table. The only competitor from the red team is the FX-8350, and maybe the new, stupidly priced FX 9xxx CPUs. So, my first question is: what is a safe processor to go for?
Now, I know that the question I just asked would seem pretty obvious to most of you, but we don't know what the new consoles and their new architecture have to offer. So, for that reason, and because games won't be optimised for them straight away, would Intel be a good path too?
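Just to make the "spread the load" idea concrete, here is a toy Python sketch of what I mean (entirely my own illustration, nothing to do with any real engine): instead of hammering one fast core, you split a frame's workload into chunks that several weaker cores can chew through in parallel.

```python
# Toy example: split per-entity game work across multiple cores
# rather than relying on one core's single-threaded speed.
from multiprocessing import Pool

def simulate_chunk(entities):
    # Stand-in for per-entity game logic (physics, AI, etc.)
    return sum(e * e for e in entities)

def update_world(entities, workers=8):
    # Carve the entity list into one slice per core and process in parallel.
    size = max(1, len(entities) // workers)
    chunks = [entities[i:i + size] for i in range(0, len(entities), size)]
    with Pool(workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))

if __name__ == "__main__":
    world = list(range(10000))
    # Same result as doing it all on one core, just spread over eight.
    print(update_world(world) == sum(e * e for e in world))
```

Obviously real engines are far more complicated than this, but that is the gist of why weak-but-many cores might push devs towards heavily threaded designs.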
Now, onto the GPU.
One thing that stands out to me is the fact that the GPU in these systems shares memory with the rest of the system. Is this really necessary? Let's say, for example, that the operating system takes up 1GB of that as standard, and that a further 1GB is reserved in case any background apps need memory to grab. Now, unless I am going crazy, that leaves 6GB of memory at the games' disposal. I have a 2GB card in my system and I barely use 1.5GB of that, and that is with most of the settings turned all the way up.
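Just to show my working on the memory maths (the 8GB total is my guess from the articles, and the OS/background reservations are assumptions too, not confirmed figures):

```python
# Back-of-the-envelope budget for a shared (unified) memory pool.
# All three figures below are assumptions, not official specs.
TOTAL_GB = 8        # rumoured unified memory pool
OS_GB = 1           # assumed OS footprint
BACKGROUND_GB = 1   # assumed reservation for background apps

game_budget = TOTAL_GB - OS_GB - BACKGROUND_GB
print(game_budget)  # prints 6 -> left for the game, shared between CPU data and VRAM-style use
```

The point being: even on those pessimistic assumptions, the game still has far more to play with than my 2GB card.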
Does this mean that the consoles will be able to run at higher resolutions and have insanely high-res textures? I am worried about this, as I have just splashed out on a new GPU and I don't want it to be rendered useless before its life cycle is even up. I know that sounds silly, but what is the point of settling for low settings on a GPU that should be able to run High-Ultra on most games? For reference, my GPU is an ATI 7870.
So, my last question is this (and I am not expecting you guys to know for certain; I would just like educated guesses based on both what I have told you and the article I have linked at the bottom of this post): do you think we might see some GPUs with huge amounts of VRAM on them in the near future? I know there is such a thing as the Titan, but at £700+? Jog on.
And another thing: most of what I have mentioned in this post about system specs is based on things I have read in other articles. It is not certain that these are the actual specs of the systems, but if both companies have released these figures, then that is something solid to go by.
Thanks for your input. If you have made it this far, you deserve a good pat on the back.
Article: http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen