Nvidia GTX Titan Lite coming?

Update: looks like I'll probably be getting a "780" (which is what the rumored "Titan Lite" is going to be).

Looks to be about halfway between the Titan and the 680, for $500 less than the Titan's price. Perhaps some good overclocking headroom thanks to the fewer shaders, too. Sign me up.

NVIDIA GeForce GTX 780, GTX 770 and GTX 760 Ti
 
Yeah, I saw that Nvidia 700 series post the other day; looks like I can finally snatch up an upgrade from this 560 Ti.
 
Well, I think I may upgrade from my current SLI 670s to SLI'd 780s. I'm very tempted, though, to pick up a Titan.
 
The full-blown Titan is just stupid to buy; might as well buy a 690.
 
Note the date on the post too; that's from the first week it was released :-D ...
 
DXwhocares.

All that matters is what DX the xshit and piss4 are going to support, because you know that'll be the lowest common denominator for the next 5-7 years, amirite?

Of course, there will be the occasional atrocity where some game is artificially recompiled for some higher DX version with more expensive shaders that don't contribute any real complexity or profundity to the game design underneath (e.g. Crysis, Metro, farcry3lul), or the occasional short-and-sweet gem from an indie/small studio that doesn't suffer from consolshitis (a la Hard Reset).

P.S. Oh yeah, let's not forget tessellation, because meshes aren't really detailed enough these days already... Talk about diminishing returns.

It might FINALLY run Crysis though. Maybe.

If there's a real "revolution" around the corner, it's probably integrating the CPU into the GPU and then running Windows 9 on your video card, because dedicated CPUs and system memory are so 2013. And being able to write more parallelized software...
 
Of course, there will be the occasional atrocity where some game is artificially recompiled for some higher DX version

Eh, you have to take into account the point where it takes too much effort to support newer DirectX versions from a given starting version. I honestly lost interest in DX development when they made it so difficult to import the libraries in non-Microsoft compilers. It used to be a simple header conversion. Apparently that has gotten better.

But there come points where Microsoft screws things up so badly that there's no point in writing code that works across two APIs, because it's the equivalent of writing the engine twice. It's not like the jump from OpenGL 2.1 to 3.x wasn't harsh; that was damn near two different APIs to me (see the sketch below). If a company decides to forgo all effort to support a previous DirectX version to save the time and money, weighed against the customers they'll lose... that's their problem. I can't name a title that has ever forced me to upgrade early because I wanted it THAT badly.
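For anyone who didn't live through it, here's a minimal hypothetical sketch of what I mean (assumes GLFW and an OpenGL driver are installed; window handling trimmed to the essentials). The whole immediate-mode path that 2.1-era code was built on simply doesn't exist in a 3.x core context:

```cpp
// Minimal sketch (assumes GLFW is installed): GL 2.1 style vs. a 3.x core
// profile. The immediate-mode calls that 2.1-era code is built on are
// flat-out removed from a core context.
#include <GLFW/glfw3.h> // pulls in the system OpenGL 1.x header by default

int main()
{
    if (!glfwInit()) return 1;

    // Default (compatibility-style) context: glBegin/glEnd still works.
    GLFWwindow* legacy = glfwCreateWindow(320, 240, "GL 2.1 style", nullptr, nullptr);
    if (!legacy) return 1;
    glfwMakeContextCurrent(legacy);
    glBegin(GL_TRIANGLES);            // legal here...
    glVertex2f(-0.5f, -0.5f);
    glVertex2f( 0.5f, -0.5f);
    glVertex2f( 0.0f,  0.5f);
    glEnd();
    glfwSwapBuffers(legacy);
    glfwDestroyWindow(legacy);

    // 3.2 core context: the calls above are gone; you have to rebuild the
    // same triangle with VBOs + GLSL shaders -- effectively a second API.
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE); // needed on OS X
    GLFWwindow* core = glfwCreateWindow(320, 240, "GL 3.2 core", nullptr, nullptr);
    if (core) {
        // From here on it's glGenBuffers / glCreateShader / glDrawArrays only.
        glfwDestroyWindow(core);
    }
    glfwTerminate();
    return 0;
}
```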

Even when the differences are minimal, you still have to write code supporting each version. You can't take DX10 code and compile it to require DX11. You could (and this would be a stretch) stop the application from running if that version of the library is not installed, something like the sketch below, but the calls will still all be to the existing DX10 APIs and have to be supported by whatever library is in play.
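To make that concrete, here's a hypothetical sketch (Windows/MSVC assumed) of that "stretch": gate startup on the DX11 runtime being installed, even though every call that follows is still plain DX10:

```cpp
// Hypothetical sketch (Windows/MSVC): refuse to start unless d3d11.dll is
// installed, even though the code only ever calls the DX10 API.
#include <windows.h>
#include <d3d10.h>
#include <cstdio>

#pragma comment(lib, "d3d10.lib")

int main()
{
    // The artificial gate: require the newer runtime to be present...
    HMODULE d3d11 = LoadLibraryW(L"d3d11.dll");
    if (!d3d11) {
        std::fprintf(stderr, "DirectX 11 runtime required.\n");
        return 1;
    }
    FreeLibrary(d3d11); // ...and then never actually call into it.

    // Everything that actually runs is still the existing DX10 API, so it
    // still has to be supported by whatever library is in play.
    ID3D10Device* device = nullptr;
    HRESULT hr = D3D10CreateDevice(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                   nullptr, 0, D3D10_SDK_VERSION, &device);
    if (FAILED(hr)) return 1;
    std::puts("D3D10 device created; the DX11 check was pure gatekeeping.");
    device->Release();
    return 0;
}
```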
 
IDK, they seem to do it somehow. I doubt Far Cry 3, for example, was written with DX11 in mind. I'm guessing it was developed with console compatibility in mind, and then at some point in the process (or maybe a "framework" was planned for this from the beginning; whatever, same difference) they were like:

"- Oh, btw how can we like make the PC people feel better about the size of their GPUdick?
- I got it - let's take the console version, slap a shitload of AA and ambient occlusion and expensive shaderZ on top of it because shaderZ br?h, make it work on DX11 and tesselate the swimming fucks of all the console-compatible/friendly mesh detail, make a slider or remove all the LOD we spent time doing because of consoles, and then market how much better it looks on PC"

For version 2 (the BL2 version), add "proprietary particle effects on Nvidia libraries because lulz" to the list above.

Sure, it looks "better" (because shaderZ). But it's the same game underneath as it is on consoles, consoles which are now 6 years old and which were already considerably outpaced by high-end PC hardware on the day they were released.

Well, I'm kinda getting tired of needlessly expensive shaderZ "for the PC version ONLY yo." There are only so many years I can take of the same shit over and over. It's been going on for what, like 7-8 years now, since DX9.0c came out?

I want something that requires a minimum of a 660 Ti or higher (even if it's because of shaderZ; I haven't actually seen shaderZ used creatively in a while, and maybe the most interesting recent thing I can think of is the alien vision in NS2), and that stresses all six CPU cores with features integrated into the CORE gameplay. There used to be games in the 90s that you simply couldn't play without x, y, or enough z hardware. They didn't run. At all. Of course, that wasn't good for sales :-) .

The sad thing is, it's always about the lowest common denominator; most studios only care about money, so the game must run on the PC I bought in 2003 for $500, or "capture X % of the market," and so on and so forth.
 
I ended up buying a GTX Titan for my new build. Just waiting for it to get here so I can install it and fire up the new system.
 
I ended up buying a GTX Titan for my new build. Just waiting for it to get here so I can install it and fire up the new system.

Why? I'm not asking to be a jerk, but in games it's not much faster than a GTX 680, so what's the point? It would be different if there were noticeable differences in performance, but from the reviews I've seen that's not the case, so why spend all that cash on something that doesn't produce? It's like paying a whore to say hello to you, what's the point?
 
I figure that if I can afford it, then why not? I guess I'll find out what the actual differences are in games when I get it, instead of relying on reviews from various places. I know people who already have Titans and they're quite happy with them. I'm sure I'll see a difference considering I'm switching to a 3-monitor setup at 2560x1440 per panel. I'm considering SLIing them as well.
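For what it's worth, the pixel math backs that up: 3 x 2560 x 1440 = 11,059,200 pixels per frame, versus 2,073,600 for a single 1920x1080 screen, so roughly 5.3x the fill load of plain 1080p. That's the kind of setup where a Titan (or two) actually gets used.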
 