Notice anything different?

Firmware 1004.1 has been posted to the Korean site. I installed it over 1003 without any issues.
Let's say that Samsung releases a 55" 8K ARK model with a 120 Hz panel, or a dual-mode "8K @ 60 Hz + 4K @ 120 Hz" panel, since that's a thing now. To run 8K @ 120 Hz even on the desktop, our current port options would be:
Port                    | 8K 120 Hz 8-bit | 8K 120 Hz 10-bit
HDMI 2.1 (48 Gbps)      | DSC 3.0x        | DSC 3.75x (3.0x is just shy of 120 Hz, but chroma subsampling would do)
DP 2.1 (UHBR 13.5)      | DSC 2.0x        | DSC 2.5x or 3.0x
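If anyone wants to sanity check those ratios, here's a rough back-of-the-envelope sketch. The ~7% blanking allowance and the link-coding overheads are my own assumptions, so the minimum ratios it prints land in the same ballpark as the table rather than exactly on it:

```python
# Rough sanity check of the table above: raw data rate for 7680x4320 and the
# minimum DSC ratio each link would need. The ~7% blanking allowance and the
# link-coding overheads (HDMI FRL 16b/18b, DP 128b/132b) are assumptions.

PIXELS = 7680 * 4320

LINKS = {
    "HDMI 2.1 (48 Gbps)":        48 * 16 / 18,    # ~42.7 Gbps effective
    "DP 2.1 UHBR13.5 (54 Gbps)": 54 * 128 / 132,  # ~52.4 Gbps effective
}

for bpc in (8, 10):
    raw_gbps = PIXELS * 120 * (bpc * 3) * 1.07 / 1e9   # 120 Hz, RGB, +7% blanking
    print(f"8K 120 Hz {bpc}-bit: ~{raw_gbps:.0f} Gbps uncompressed")
    for name, eff in LINKS.items():
        print(f"  {name}: needs at least {raw_gbps / eff:.1f}:1 DSC")
```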
So, to be realistic, since these ratios are really pushing it, it would be 8K @ 60 Hz for desktop use, but a 120 Hz panel could still be used for 4K @ 120 Hz gaming. Then you have options:
I don't think anyone is expecting to game at 8K. If anything I would expect many games to really struggle even launching at 8K, as their memory management etc. might not be up to it. Cyberpunk is a good example: without DLSS at 8K×2K it runs at sub-1 fps and seems to really get messed up, to the point where even getting back to the menu is a challenge.
- Running games at 4K -> integer scaled to 8K. But this would probably limit you to 60 Hz since the res remains at 8K.
- No scaling, running 4K 1:1 at a smaller size on the screen. You might have to be able to move the ARK closer to you, which might not be feasible.
- 8K @ 60 Hz + DLSS Performance for 4K render res. Too demanding for AAA RT/PT games.
- 8K @ 60 Hz + DLSS Ultra Performance for 1440p render res. Within reason for AAA RT/PT games, as long as you are ok with 30-60 fps.
- 4K @ 120 Hz + DLSS. Should work for same performance as current 4K displays. Hopefully the display can do the integer scaling tho!
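For reference on those DLSS options, here's a quick sketch of what the render resolutions work out to at 8K and 4K output. The per-axis scale factors are the commonly cited ones for each mode, so treat the exact numbers as approximate:

```python
# Commonly cited per-axis DLSS scale factors (approximate). This just prints
# what the GPU would actually render before upscaling to the output res.
DLSS_MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

for out_w, out_h in [(7680, 4320), (3840, 2160)]:
    print(f"Output {out_w}x{out_h}:")
    for mode, s in DLSS_MODES.items():
        print(f"  {mode}: renders ~{round(out_w * s)}x{round(out_h * s)}")
```

Performance at 8K output renders at 4K, and Ultra Performance at roughly 2560x1440, which is where the 30-60 fps estimate above comes from.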
I do agree it's a challenging scenario, but I'd still love to see it become a reality. To me these 77"+ 8K TVs and 32" 8K monitors are the stupidest formats for the resolution. 50-60" would be a great size but it's also large enough that realistically it's your only display and it better be curved too.
Dual/triple 4K screens is still far more cost effective and avoids a lot of issues. I think my next setup might be a 40" 5120x2160 + 27-32" 4K which should avoid most of the problems of the Neo G9 57" while retaining most of its perks.
- ARK
- super-ultrawide + 16:9
- 8K
- XR/MR (marketing pics... they need higher-res glasses in the future for this to be a real desktop replacement imo)
The Acer is priced the same as the Samsung, so totally stupid when the Samsung goes on sale regularly. It's already down to 2099 € here in Finland and that's likely to be the lowest it will go this year.

Just a heads up that the Acer version of this screen that was announced is apparently limited to 120 Hz max. When I was originally considering one of these I figured they'd be pretty similar, other than that the Samsung ditches Dolby Vision, but 120 Hz vs 240 Hz is a big deal. I haven't heard anything on the Acer 57's pricing, but I don't know how they could ask any kind of similar price range to the G95NC considering that.
https://www.acer.com/us-en/predator/monitors/z57-miniled
The software Samsung uses on the ARK and OLED G9 somehow seems to be a big step back from what they can do with the "older" software used on the G95NC. This smart TV bullshit will apparently not do, for example, 21:9 + 11:9 on the OLED G9, whereas on all other G9 models this has been possible and frankly a great way to use them for work.

Just watched a YouTube vid about an owner who has the ARK gen 2. Watching him mess with dual view just confirmed to me how low-resolution 1080p tiles really are on that thing. The ARK format will never be a multi-monitor replacement or multi-monitor addition until it goes 8K. The guy in the vid even compares it to the G95NC a bunch of times, saying that the G95NC would be better for productivity and multi-monitor style usage due to the pixel density and desktop real estate, since side-by-side windows on the ARK are only 1080p. I knew as much, but seeing that video showed the cons of that screen very plainly. It can't do 120 Hz on more than two inputs/monitor windows either, and when you are using two they are stuck at 1080p each with a ton of wasted black bar/letterbox space. So using 1x3 tiles or 4x4 will probably be 60 Hz.
Apologies. Hardforum didn't take them for some reason when I added them directly from hdd. It even seems to balk on some imgur links in the last several months too for some reason. I'll make a gallery or something for the monitor array types when I get time.

BTW your desktop permutation pics don't work.
6K obviously. 6144x2160 for example.

So I got 5120x2160 working on the monitor. Works well, but I feel like it's too much space sacrificed on the sides.
Is there a middle ground between 5120 and the native 7680 that I can try?
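A few in-between widths you could try as custom resolutions at the native 2160-pixel height; 6144 and 5040 are the ones mentioned in this thread, the others are just other round-ish aspect ratios. Rounding to a multiple of 16 is only a habit for custom-resolution tools, not something the monitor requires:

```python
# Enumerate some "in-between" widths at the native 2160-pixel height.
HEIGHT, MAX_W = 2160, 7680

candidates = {
    "21:9": 21 / 9,                    # 5040 wide (suggested later in the thread)
    "64:27 (typical 'ultrawide')": 64 / 27,   # 5120 wide
    "8:3": 8 / 3,                      # 5760 wide
    "6K-ish option mentioned above": 6144 / HEIGHT,  # 6144 wide
    "3:1": 3.0,                        # 6480 wide
}

for name, ratio in candidates.items():
    w = min(round(HEIGHT * ratio / 16) * 16, MAX_W)  # round width to a multiple of 16
    print(f"{name}: {w}x{HEIGHT}")
```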
That's not true, I'm sure BattleBit runs good too!

Good luck running 2x 4K screens for anything other than Minecraft...
Got a question!
Should I use HDR Tone Mapping active or Static when playing HDR games?
I can't tell if I should enable it or not.
Sure, it makes the whole image brighter and somewhat more vibrant, but I am unsure whether it washes out the image or not.
The curves are arbitrarily decided by LG and don't follow a standard roll-off. There does exist the BT.2390 standard for tone mapping which madVR supports (when using HGIG mode).
LG's 1000-nit curve is accurate up to 560 nits, and squeezes 560 - 1000 nits into 560 - 800 nits.
LG's 4000-nit curve is accurate up to 480 nits, and squeezes 480 - 4000 nits into 480 - 800 nits.
LG's 10000-nit curve is accurate up to 400 nits, and squeezes 400 - 10000 nits into 400 - 800 nits.
I would use only the 1000-nit curve or HGIG for all movies, even if the content is mastered at 4000+ nits. All movies are previewed on a reference monitor and are designed to look good at even 1000 nits. It's fine to clip 1000+ nit highlights. HGIG would clip 800+ nit highlights.
LG defaults to the 4000-nit curve when there is no HDR metadata, which a PC never sends.
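If it helps to picture what those three curves are doing, here's a minimal sketch of the behaviour described above: pass-through up to the knee, then squeeze the rest of the source range into the headroom below the ~800-nit panel ceiling. The straight-line squeeze is a simplification on my part; the real LG/BT.2390 roll-offs are smooth curves in PQ space.

```python
# Simplified model of the behaviour described above: pass input luminance
# through unchanged up to the curve's knee, then squeeze the rest of the
# source range into what's left below the ~800-nit panel ceiling.

def tone_map(nits, knee, source_peak, display_peak=800.0):
    if nits <= knee:
        return nits                               # "accurate" region, passed 1:1
    t = (nits - knee) / (source_peak - knee)      # position within the squeezed range
    return min(knee + t * (display_peak - knee), display_peak)

# (knee, assumed source peak) for the three curves described above
curves = {"1000-nit": (560, 1000), "4000-nit": (480, 4000), "10000-nit": (400, 10000)}

for name, (knee, peak) in curves.items():
    print(name, [round(tone_map(x, knee, peak)) for x in (300, 600, 1000, 4000)])
```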
Holy moly, this is far beyond my intelligence to understand lol
So is it better to use tone mapping active or static?
Depends on the game. There is no right or wrong answer, just what looks best to you. Some might say active tone mapping/dynamic tone mapping is the "wrong" choice, but when a game has terrible HDR controls the picture is already wrong no matter what, so it really doesn't matter which option you choose. Here's an example:
View: https://www.youtube.com/watch?v=muvhPqgr48c
IF the game has proper HDR controls, then usually the rule of thumb is to use static tone mapping/HGiG, but again, active tone mapping/dynamic tone mapping ON isn't by any means the "wrong" choice. Here's an example of that:
View: https://www.youtube.com/watch?v=zZSlhTzvuTs
TLDR: Use whatever looks best to you and don't listen to any BS about one choice being right or wrong. There's a reason why both options exist
Ok.
So been testing back and forth with active tone mapping vs static in cyberpunk.
Still find it difficult to determine a winner for me.
But I think I will have to go with active tone mapping, because it seems to bring out brighter light sources yet keep good contrast.
But I have to use these settings:
Monitor settings:
Tone Mapping active
Gamma -2
Shadow detail -1
In-game settings:
HDR10+ off
Tone map 1.0
This makes all light sources really glow out of the screen, I like the look of it because I haven't seen anything like it on another display.
But then on another discussion page, I was told I should use Static tone mapping in the monitor setting.
@moonlawn (1 month ago):
Is it the same ReShade preset after patch 2.1? Cos they added in an HDR10 PQ setting for colour effectiveness, so maybe they fixed the broken HDR settings as well?

@plasmatvforgaming9648 (1 month ago):
Same black level raised because of the same green color filter.
I don't set it to greyscale because it would look like shit? If DTM looks like shit then obviously I wouldn't use it but that isn't the case. There are simply times when it looks BETTER than HGiG, "accuracy" can kiss my ass. You can write a 20 page essay on why dynamic tone mapping is bad if you want, it's still going to exist and people will enjoy using it.
Most of the games you mentioned have broken HDR to start with, so options are limited. I wouldn't call the default static tone mapped HDR on those more accurate, or say they don't need some kind of adjusted mapping (thus the info on the Reshade HDR filter to change the Max CLL for static tone mapping and target peak luminance for broken HDR games). I don't blame you at all for trying anything on those broken HDR games, including DTM. And you can set your TV OSD and graphics settings however you like best.
So is that video of adjusting Max CLL static tone mapping and target peak luminance only for Cyberpunk 2077, or does that apply to other games as well? What about Alan Wake 2 HDR?
I really like the light source glowing effect in Cyberpunk from DTM active, it really pops!
EDIT:
Ahh man, I think I made my mind up. I am definitely not changing from active tone mapping in Cyberpunk. Just been walking around Night City, max RT, PT, RR etc. etc. and holy moly, the visuals, the lighting, is mind blowing on this monitor with these settings! I have never seen anything like it! I just cannot ever go back to OLED or 16:9 ever again! Well, at least not anytime soon.
Now I can't wait for the RTX 5090 to come out, and fingers crossed it's at least a 20% increase in fps @ 8K!
You can reach this monitor's max refresh rate over DisplayPort only if you have a DisplayPort 2.1 graphics card (we used an AMD Radeon RX 7800 XT), and you need to either use the included DisplayPort cable or any DP 2.1-certified cable shorter than 1.5 m (5 ft).
However, connecting over HDMI isn't so straightforward. You need an HDMI 2.1 graphics card and must connect to HDMI 2 or 3, as they support the full 48 Gbps bandwidth of HDMI 2.1; HDMI 1 is limited to a max refresh rate of 120Hz at the native resolution. Although there aren't issues reaching the max refresh rate and resolution with 8-bit signals, not all sources support the 240Hz refresh rate with 10-bit signals. The max with an NVIDIA RTX 4080 graphics card is 120Hz, and that's only when setting the Refresh Rate in the monitor's OSD to '120Hz'. Setting it to '240Hz' strangely limits the 10-bit refresh rate to 60Hz, as you can see here. While using the RX 7800 XT graphics card, the max refresh rate with 10-bit was 240Hz, as it uses Display Stream Compression.
"The VRR support on Samsung G95NC works best with DisplayPort 2.1 or HDMI 2.1-compatible graphics cards, like the AMD RADEON RX 7800 XT. You get the full refresh rate range, and it supports Low Framerate Compensation for the VRR to continue working at low frame rates. The refresh rate range is limited on NVIDIA graphics cards that don't support DisplayPort 2.1, though, as the max refresh rate with an NVIDIA RTX 4080 is 60Hz with the native resolution. You need to change the resolution to 1440p or lower to get the max refresh rate of 240Hz. Additionally, G-SYNC doesn't work with an NVIDIA RTX 3060 graphics card, as there isn't even an option to turn it on."
And with that, you'd assume that an incredible over-the-top beast of a gaming monitor would demand to be paired with the most powerful GPU currently available - the NVIDIA GeForce RTX 4090. Well, it turns out there is one major problem. Connecting NVIDIA's flagship gaming GPU with Samsung's Odyssey Neo G9 limits the 8K (or dual 4K) output to 120 Hz.
Even though the monitor can take an 8K ultrawide image (7,680 x 2,160) at 240 Hz, it seems like the GeForce RTX 4090 can't keep up. What makes this a little strange is that even though the GeForce RTX 4090 doesn't support the new DisplayPort 2.1, the full HDMI 2.1 spec of the card should theoretically have enough bandwidth to support 240 Hz - as is possible with the Radeon RX 7900 XTX over HDMI 2.1.
Samsung Odyssey Neo G9 GPU support
One reason the card might not support the full resolution of the Samsung Odyssey Neo G9 could come down to the "Display Stream Compression" (DSC) technology used for high-resolution and multi-display output - with Reddit user Ratemytinder22 noting that it's hitting a bottleneck when pushing the output through a single port.
NVIDIA's specs for the GeForce RTX 4090 list the maximum capabilities as "4 independent displays at 4K 120Hz using DP or HDMI, 2 independent displays at 4K 240Hz or 8K 60Hz with DSC using DP or HDMI." Could support be added as part of a driver update? That remains to be seen.
The only GPUs capable of supporting the Samsung Odyssey Neo G9 at the full dual 4K 240Hz are AMD's Radeon RX 7000 Series - and the flagship Radeon RX 7900 XTX. AMD's new RDNA 3 generation supports the new DisplayPort 2.1 spec, which offers enough bandwidth, while NVIDIA's Ada generation does not.
Read more: https://www.tweaktown.com/news/9342...240-samsung-odyssey-g9-neo-monitor/index.html
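One way to read the spec sheet quoted in that article is by pixel rate rather than link bandwidth. This is pure arithmetic, nothing vendor-specific assumed:

```python
# Pixel-rate arithmetic against the per-display maximums NVIDIA quotes above
# ("2 independent displays at 4K 240Hz or 8K 60Hz with DSC").
def pixel_rate(w, h, hz):
    return w * h * hz / 1e9  # gigapixels per second

print(f"Neo G9 7680x2160 @ 240 Hz: {pixel_rate(7680, 2160, 240):.2f} Gpix/s")
print(f"Neo G9 7680x2160 @ 120 Hz: {pixel_rate(7680, 2160, 120):.2f} Gpix/s")
print(f"NVIDIA spec, 4K @ 240 Hz:  {pixel_rate(3840, 2160, 240):.2f} Gpix/s")
print(f"NVIDIA spec, 8K @ 60 Hz:   {pixel_rate(7680, 4320, 60):.2f} Gpix/s")
```

The monitor at 240 Hz is exactly double the per-display pixel rate NVIDIA lists, and at 120 Hz it matches it, which fits the 4090 topping out at 120 Hz on a single connection even though raw HDMI 2.1 bandwidth with DSC should theoretically be enough.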
All I know is that it switches the EDID to a different one. Usually on Samsungs the highest-refresh-rate EDID is for some reason much more barebones than the "regular" one, with fewer refresh rate options defined at least.

Curious, do we know yet what the 120 Hz vs 240 Hz setting on this (and other Neo G9s) actually does, and why there is even a need to have the switch? The fact that the setting exists and is default set to 120 Hz must mean there are some downsides to enabling the 240 Hz mode. I remember that once the 240 Hz mode was enabled, it was no longer possible to use 120 Hz at native resolution on my Nvidia GPU, even though it was possible with the setting set to 120 Hz.
21:9 = 5040 x 2160

Is there a middle ground between 5120 and the native 7680 that I can try?