

Summary

Rainbow Twist

Offline
Age:
N/A
Date Registered:
2014 Jan 23, 23:08:04
Local Time:
2017 Feb 26, 19:22:17
Last Active:
2017 Feb 24, 12:39:29

Profile Comments

Heya!

I am adding you to my buddy list. Hope we can interact more around here.

Hi there!

I just wanted to let you know I added you to my buddy list. :D Hope that was okay.

I'd Like to Say

Your post in reply to LostSanity was beautiful.
Slow clap, sir, you deserve it.

Hi

Hello could I please be your friend?!

..

Got home and used hibernate yesterday, so I guess no more sleeping.

.

I actually haven't tried it with AMD, only the 780s.
I don't play games much anyway.

never had a problem with old games...

even much, much older games than that .... they always start up in their default native modes centered on the primary monitor, for the simple sake of preventing something like that from happening...

...

On the positive side, it worked when I had one 660 Ti powering 3 monitors, no SLI, so it could be done.
And I know about hibernate.

Oh my, actually this isn't the biggest problem.
Some games, when I start them, are old or just don't want to work and turn the screens black, and I am forced to manually restart, like Halo and StarCraft.

Sleeping a computer has never been a flawless venture..

and was implemented due to the excessive time it typically took to start a machine vs. resuming from a sleep state... It was mostly implemented and developed for laptops and portable devices. Desktops weren't really 100% intended to use it.

Though a lot of people are going "green", the fact is that things have gotten considerably more efficient; an always-on computer actually doesn't use much power.

It takes me a matter of seconds to get into a fully running Windows environment.... it actually takes longer for a resume from sleep to complete than it does to just shut down and start up when required.

I know Nvidia and AMD are equally affected by this.. not to mention Intel, with sleep states and resuming on a more complex desktop. Again, another reason I never sleep... higher risk of things going "wrong" when using the function.... hibernation is another thing that should actually be disabled on desktops.....
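
If anyone wants to do the same, here's a minimal sketch of turning hibernation off on a Windows desktop (it just calls the standard powercfg switch; the Python wrapper is only for illustration, and it needs an elevated/admin prompt):

import subprocess

# Disable hibernation on a Windows desktop; equivalent to running
# "powercfg /hibernate off" from an elevated command prompt.
subprocess.run(["powercfg", "/hibernate", "off"], check=True)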

Also

^^^It is a pain since I have to lean over to see small things and messages.

Like if someone starts a surrender vote in LoL I won't see it, and the same for messages in other games.
So there are disadvantages.
But the biggest is by far that if I want to come back from "sleep" I need to turn on my 4th monitor (which is supposed to be disabled, btw), the main screens still aren't on, and I usually have to press "detect" on the 4th screen so it detects the monitors; they all turn black, then I need to unplug one monitor, then the displays turn back on and I need to rearrange all the windows.

I am not joking either, and it isn't "garbage AMD"; it did it with the 780s as well.

And the price of course, plus needing a far better computer, etc.

I make fun of my friend all the time with his 630/1440x900 and he calls me an e-peen

5

Blizzard has disabled that for SC2 for that reason, unless you use Flawless Widescreen to fix it.

I have played a little MMORPG and haven't found it advantageous, besides being able to have plenty of windows open at once, but I didn't PvP since I didn't like it.

Having run 5x2560x1440...

in an Eyefinity array, using a single 7970 6GB Eyefinity 6 card paired with a 7970 for CrossFire and operating at a 12800x1440 ultra-widescreen resolution, it's true there are very few games that allow such a super-wide array to be used; however, a few that do provide proper FOV options.... that lets you extend the FOV to 150-180 degrees, filling your natural field of vision (Borderlands modded to extend its FOV options, for example), so beyond the bragging rights they do give a clear advantage.
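
To put a rough number on that FOV claim, here's a back-of-the-envelope sketch using standard Hor+ scaling; the 90-degree base FOV at 16:9 is just an assumed typical value, not from any particular game:

import math

# Hor+ scaling: hold the vertical FOV fixed and widen the horizontal FOV
# with the aspect ratio. Assumes a 90-degree horizontal FOV at 16:9.
def horizontal_fov(width, height, base_hfov_deg=90.0, base_aspect=16/9):
    vfov = 2 * math.atan(math.tan(math.radians(base_hfov_deg) / 2) / base_aspect)
    return math.degrees(2 * math.atan(math.tan(vfov / 2) * width / height))

print(round(horizontal_fov(12800, 1440)))  # ~157 degrees across the 5x2560x1440 array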

Strategy games with a massive field of vision grant a pile of advantages too. In fact, there were plenty of complaints back in the early 2000s, and only recently has widescreen been fully supported.

One can easily argue that in the screenshot you provided of LOE... in MMORPGs, especially those heavily PvP based, with such a VAST viewspace you could see enemies, be they NPCs or PCs, coming your direction from a range of different angles WELL before anyone with a single screen or someone with less of a viewspace.

Nvidia's 8xx series, which as far as I've seen is based on the Maxwell architecture, and the still-unknown naming convention for the new AMD Radeon series (for clarification's sake we'll just say R9 3xx), which is based on the last of the Volcanic Islands chips and has a remote possibility of showing off an introductory Pirate Islands chip, as I believe I already stated before, are most likely to launch before the winter holidays (hopefully)....

rapidly

my goal is to see as much as possible.
and frankly it is quite obvious that surround/eyefinity setups are just for bragging rights and to show off
I have yet to find one time where it gets me a great "advantage" like it is made out to.
Maybe if I played SC2 a lot, but besides that it is kind of hard with only 2 eyeballs.

Small dollar? A $900 drop?

Next gen? What is that supposed to mean? When Nvidia releases their 880 soon? That would have to mean one 880 = better than a 295.

Prices will drop rapidly

my goal is higher DPI.. anything in the 125-150 DPI range is suitable for me; after seeing so many 1080p displays for so many years with no improvements being made, it's drab.
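
For anyone wondering how those DPI numbers work out, pixel density is just the diagonal pixel count over the diagonal size; a quick sketch with assumed example panel sizes, not anyone's actual monitors:

import math

# PPI = diagonal resolution in pixels / diagonal size in inches
def ppi(width, height, diagonal_inches):
    return math.hypot(width, height) / diagonal_inches

print(round(ppi(1920, 1080, 24)))  # ~92  -- a typical 24" 1080p panel
print(round(ppi(3840, 2160, 32)))  # ~138 -- a 32" 4K panel lands in the 125-150 range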

21:9 displays are becoming more of the norm.. and when those 11-megapixel beasts finally arrive in any number they will drop.. plus Samsung is readily developing a range of 4K displays and those 21:9 displays too. I would say within another year we'll see 4K drop to small dollar, with the 21:9s in almost the same ballpark; it'll happen suddenly.

The next generation of video adapters will be able to stick a single GPU on a 4K display and hit 60fps without much of any difficulty. Considering the age of my machine, which is nearing 3 years, it still has the power (aside from the lack of a modern graphics card) to outperform the fastest socket 1150 Intel i7s by a full 60%.

More demanding games besides Crap Dogs

Used to? Why the hell would you ever get rid of that?!?
What kind of job do you have to afford this?

Some Chinese dude did surround on Crysis 3 with 3 Titans, which is perfect since you obviously need shit tons of VRAM, and he couldn't even max it out and still couldn't get 60 or so.
You can look up many more examples, excluding Crysis and BF4, by just googling "4k surround".

I will bet anything you will not get a flawless 60fps on every single game out there excluding the 2 above.

It will take years to come up with a card insanely powerful enough to power 3 of them. Of course, unless you turn down AA and other settings like that.

Actually, there was even an article on Steam, something like "sacrifice 60fps to play 4k", etc.

Also I know it won't be as much, but "10320" is all that matters to me, and once I went widescreen, 4K didn't look attractive to me until my friend showed me they just released that.
$1200 is a joke, so I guess I'll have to wait 5 years until they cost as much as my monitors or close (around $300).
<3 21:9
And don't forget, those are far bigger at 32" than my 25", but I haven't found any stands to fit such gigantic screens.
It will suck balls to have to use generic stands.


And on a side note, my computer can max everything perfectly except Watch Dogs, but that game is worthless garbage anyway, so no tears will be shed, and I beat it already.

Crysis 3 isn't an ideal engine either

The Unreal Elemental Demo (using Unreal Engine 4) was able to provide sufficiently excellent quality and a high frame rate.

3x 4K is equivalent to 24 million pixels vs 3x 3440x1440's 15 million.

However, as I don't have the 2 other 4K displays at this time anymore (shifted off, waiting)... I do plan to try and get my hands on the 5120x2160 4K+ displays, which are basically the same style of aspect ratio as yours, just considerably higher resolution, but this requires DisplayPort 2.0 technology in order to hit 60Hz (it's restricted to 30Hz atm, just like 4K displays running on HDMI 1.4a currently).

11 million pixels for 5120x2160 vs 8 million on 4K.... which means that 3 of them together will basically give you exactly the same pixel count as an 8K 16:9 display (7680x4320).
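
Just to show the arithmetic behind those figures (a quick sketch; the numbers above are rounded):

# Pixel-count arithmetic for the comparison above.
uhd_4k  = 3840 * 2160    # 8,294,400  (~8 million)
uw_1440 = 3440 * 1440    # 4,953,600  (~5 million)
uw_4kp  = 5120 * 2160    # 11,059,200 (~11 million)
print(3 * uhd_4k)        # 24,883,200 -- 3x 4K, ~24 million
print(3 * uw_1440)       # 14,860,800 -- 3x 3440x1440, ~15 million
print(3 * uw_4kp)        # 33,177,600
print(7680 * 4320)       # 33,177,600 -- exactly the same as an 8K 16:9 display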

Yes

Yes widescreens are awesome.

And that is the answer to that as well.
The screenshots look thinner and much better than regular.
But 1440 is fine; all that matters is vertical anyway.


There was some 295 benchmark hitting 50fps in one game on 4K, I forget which.
Consolized? I would say it is pretty optimized for me to get 44fps maxed or 70fps w/o AA.
Crysis 3 I can get about 40 maxed, but feel free to do a benchmark on 3 4K monitors 100% maxed, including AA.

2560x1080... 21:9, I see, that explains it.

ah I see.. yes...

skipping 2160p .... erm.... the screen you linked is a lower .. much lower resolution than 4K..

with 4K being 3840x2160... the one you linked is roughly 60% of the definition of a 4K display (8 million vs 5 million pixels).
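
(Quick check of that 60% figure, assuming the screen in question is the 3440x1440 ultrawide:)

print(3440 * 1440)                            # 4,953,600  (~5 million)
print(3840 * 2160)                            # 8,294,400  (~8 million)
print(round(3440 * 1440 / (3840 * 2160), 2))  # 0.6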

BF isn't exactly a great measurement of overall performance; it's heavily consolized. However, games properly developed for the PC can sustain 4K quite easily; for example, a Radeon 7750 1GB can sustain 4K @ 60fps with max graphics settings and 0x FSAA, or about a 45fps average with 2x FSAA.

...

4? Ewww gross 1920
http://i.imgur.com/pa3Q7AF.jpg

If you play enough you get used to the bezels and can never go back to 1.
3 makes life 100x easier

And you should have seen the horrible cheap monitors with gigantic bezels I used to have.
5760 was too small, so I had to upgrade.

You're not gonna get 60fps maxed out any time soon on 4K surround.
My PC gets an average of 44fps in BF at 100% max just on 7680x1080, and that is with 2 290Xs.
2 295Xs are not gonna get you a perfect 60, nor is any card anytime soon.
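
A crude sanity check of why (this assumes frame rate scales roughly inversely with pixel count, which ignores CPU limits and VRAM, but it gives the order of magnitude):

# 44fps at 7680x1080 scaled down to a 3x 3840x2160 surround array.
current_pixels = 7680 * 1080      # 8,294,400 -- same pixel count as a single 4K panel
target_pixels  = 3 * 3840 * 2160  # 24,883,200
print(44 * current_pixels / target_pixels)  # ~14.7fps -- nowhere near 60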

But I'm skipping 2160
http://www.amazon.com/gp/aw/d/B00HG7EB64/ref=redir_mdp_mobile?camp=1789&creative=9325&creativeASIN=B00IJUFGDE&linkCode=as2&redirect=true&ref_=as_li_qf_sp_asin_il_tl&tag=linustechtips-20 far better