Looks like 2019 Will Be The Year to Buy OLED


AbsolutSurgen


15 minutes ago, Spork3245 said:

 

I can’t imagine that the port is the same and the difference is only the cable; to my knowledge, 2.0 ports are not capable of the higher bandwidth output of 2.1. It would be odd if 2.0 ports were capable of the 48 Gbps of 2.1 but the cables were limited for no reason...?

 

HDR is a factor because why in god’s name wouldn’t you want HDR? :p As mentioned, the chart is confusing, but may suggest that “GSync compatible displays” (Freesync) are not capable of using HDR with VRR enabled and that it requires the actual GSync module (maybe).

 

 

I did a little more reading. The port *is* the same. Higher bandwidth support is achieved by two means: increasing the clock rate on each channel (the most straightforward thing you can do) *and* adding a new data format that embeds the timing information in each channel. By changing to that data format, they're able to use a channel that was previously dedicated to timing information for data.

 

The clock rate isn't an issue (in that we don't need to transmit on a higher clock rate since we've resigned ourselves to low bandwidth). The data format might be, depending on how flexible Nvidia made their GPUs. However, even then that's only an issue *if* using VRR requires transmitting data in the new format. It need not, but I haven't yet found an answer to that.
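
(For a rough sense of the numbers behind that, here's a back-of-envelope sketch; the lane counts and per-lane rates are the widely published HDMI 2.0/2.1 figures, and the snippet ignores encoding overhead and everything else the spec actually does:)

```python
# Back-of-envelope link-rate comparison (raw lane rates only; ignores
# encoding overhead and other protocol details).
def link_rate(lanes: int, gbps_per_lane: float) -> float:
    """Total raw bandwidth in Gbps across all data lanes."""
    return lanes * gbps_per_lane

# HDMI 2.0 (TMDS): 3 data channels at up to 6 Gbps each, with the 4th
# channel carrying the clock/timing information.
hdmi_2_0 = link_rate(lanes=3, gbps_per_lane=6.0)   # 18.0 Gbps

# HDMI 2.1 (FRL): timing is embedded in the data stream, so the old clock
# channel becomes a 4th data lane, and each lane runs at up to 12 Gbps.
hdmi_2_1 = link_rate(lanes=4, gbps_per_lane=12.0)  # 48.0 Gbps

print(hdmi_2_0, hdmi_2_1)  # 18.0 48.0
```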


1 minute ago, legend said:

 

 

I did a little more reading. The port *is* the same. Higher bandwidth support is achieved by two means: increasing the clock rate on each channel (the most straightforward thing you can do) *and* adding a new data format that embeds the timing information in each channel. By changing to that data format, they're able to use a channel that was previously dedicated to timing information for data.

 

The clock rate isn't an issue. The data format might be, depending on how flexible Nvidia made their GPUs. However, even then that's only an issue *if* using VRR requires transmitting data in the new format. It need not, but I haven't yet found an answer to that.

 

Data needs to be transmitted and decoded with Freesync and GSync; it’s why GSync used a hardware solution for faster decoding, no?


4 minutes ago, Spork3245 said:

 

Data needs to be transmitted and decoded with Freesync and GSync; it’s why GSync used a hardware solution for faster decoding, no?

 

Sure. But you can transmit data on the existing HDMI interface. The trick for TVs/monitors is indeed the decoding, but that will happen *after* the information is received over HDMI. It's likely done in hardware, though, and requires good enough hardware, which is why you can't just firmware-update arbitrary TVs to support Freesync or otherwise (that, and you also need to make the rest of the TV pipeline work with variable refresh rates).


39 minutes ago, legend said:

 

Sure. But you can transmit data on the existing HDMI interface. The trick for TVs/monitors is indeed the decoding, but that will happen *after* the information is received over HDMI. It's likely done in hardware, though, and requires good enough hardware, which is why you can't just firmware-update arbitrary TVs to support Freesync or otherwise (that, and you also need to make the rest of the TV pipeline work with variable refresh rates).

 

I know that, this is what I was responding to:

46 minutes ago, legend said:

 However, even then that's only an issue *if* using VRR requires transmitting data in the new format. It need not, but I haven't yet found an answer to that.

 

:p 

If your question was in regard to the higher bandwidth of 2.1 being needed, then it may be needed, since from the little info known at this time the 2.1 VRR standard seems to be based on 4K 120 Hz capability.


32 minutes ago, Spork3245 said:

 

I know that, this is what I was responding to:

 

:p 

If your question was in regard to the higher bandwidth of 2.1 being needed, then it may be needed, since from the little info known at this time the 2.1 VRR standard seems to be based on 4K 120 Hz capability.

 

That's not necessary. Were I the one developing HDMI, the content would be separate from the encoding. That means you can transmit VRR with the "old" data format for 60 Hz or less, and the machinery that knows how to process the information is ignorant of how it came in. It would only be necessary then if you were trying to do VRR at a bandwidth that requires the new format.

 

Interface abstraction of this form is an extremely common design pattern in all of CS and networking. Maybe there is a weird niche reason why HDMI VRR couldn't be done that way, but it's not a given just because VRR can go up to 4K 120 Hz.
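
(A minimal sketch of that kind of separation, purely illustrative; none of these names or structures come from the HDMI spec, the point is only that the VRR payload is the same object regardless of which link format carries it:)

```python
# Toy model of "content separate from encoding". Everything here is
# invented for illustration; it is not HDMI's actual signaling.
from dataclasses import dataclass

@dataclass
class VrrInfo:
    """The content: e.g. 'the next frame arrives after this many microseconds'."""
    frame_interval_us: int

def send_old_format(payload: VrrInfo) -> bytes:
    """Legacy-style link: lower bandwidth, old encoding."""
    return b"OLD " + payload.frame_interval_us.to_bytes(4, "big")

def send_new_format(payload: VrrInfo) -> bytes:
    """New-style link: higher bandwidth, new encoding."""
    return b"NEW " + payload.frame_interval_us.to_bytes(4, "big")

def sink_decode(frame: bytes) -> VrrInfo:
    """The display's processing only cares about the payload, not the transport."""
    return VrrInfo(frame_interval_us=int.from_bytes(frame[4:], "big"))

info = VrrInfo(frame_interval_us=16_667)  # roughly a 60 Hz frame interval
assert sink_decode(send_old_format(info)) == sink_decode(send_new_format(info))
```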


2 hours ago, legend said:

 

That's not necessary. Were I the one developing HDMI, the content would be separate from the encoding. That means you can transmit VRR with the "old" data format for 60 Hz or less, and the machinery that knows how to process the information is ignorant of how it came in. It would only be necessary then if you were trying to do VRR at a bandwidth that requires the new format.

 

Interface abstraction of this form is an extremely common design pattern in all of CS and networking. Maybe there is a weird niche reason why HDMI VRR couldn't be done that way, but it's not a given just because VRR can go up to 4K 120 Hz.

 

I just think it may be a standard with more of a "yes" or "no" as opposed to having a wide variance. We'll find out in a few months. :p 


18 minutes ago, Spork3245 said:

 

I just think it may be a standard with more of a "yes" or "no" as opposed to having a wide variance. We'll find out in a few months. :p 

 

What's more of a standard: VRR up to 4K 120 Hz? There's no reason it has to support only that. In fact, it would be actively bad design if it required the source to always be able to go up to that, given that HDMI serves a very large range of devices which could still benefit from VRR at lower refresh rates.
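
(Rough arithmetic on why the lower rates shouldn't need the new link; this counts active pixels only and ignores blanking intervals and encoding overhead, so the real requirements are somewhat higher:)

```python
# Rough uncompressed video bandwidth in Gbps, active pixels only
# (blanking and encoding overhead ignored, so real figures run higher).
def video_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    return width * height * hz * bits_per_pixel / 1e9

uhd_60  = video_gbps(3840, 2160, 60, 24)    # ~11.9 Gbps -> within HDMI 2.0's 18 Gbps
uhd_120 = video_gbps(3840, 2160, 120, 24)   # ~23.9 Gbps -> needs the higher-bandwidth link
print(round(uhd_60, 1), round(uhd_120, 1))  # 11.9 23.9
```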


6 minutes ago, legend said:

 

What's more of a standard: VRR up to 4K 120 Hz? There's no reason it has to support only that. In fact, it would be actively bad design if it required the source to always be able to go up to that, given that HDMI serves a very large range of devices which could still benefit from VRR at lower refresh rates.

 

As previously mentioned, like previous HDMI revisions, it may be a “yes” or “no” based on the connection. I’m fully aware that the lower the fps, the more important VRR becomes. I highly doubt it will be available on/from non-2.1-compliant devices.

Lastly, if nVidia is going to support it on their current cards, why haven’t they put Freesync Samsung TVs on their GSync compatible list? It seems odd to wait for 2.1 to release.

:shrug: 


1 hour ago, Spork3245 said:

 

As previously mentioned, like previous HDMI revisions, it may be a “yes” or “no” based on the connection. I’m fully aware that the lower the fps, the more important VRR becomes. I highly doubt it will be available on/from non-2.1-compliant devices.

 

That's already not true since we have TVs that cherry-pick features from 2.1 :p Point is, I haven't seen anything that indicates it's not technically possible. There may be a reason it isn't possible (e.g., it must go through the new data encoding), but thus far, none of us seem to have found anything that indicates that is the case.

 

 

Quote

Lastly, if nVidia is going to support it on their current cards, why haven’t they put Freesync Samsung TVs on their GSync compatible list? It seems odd to wait for 2.1 to release.

:shrug: 

 

Because as of now they're only talking about DP Freesync support, and Nvidia is fucking Nvidia :p They won't do any number of things that are otherwise technically feasible. But sometimes they do. And unless we have it confirmed that there is a reason it can't be done, we shouldn't write it off yet. If TV support this year brings enough incentive and it's technically feasible, they might just come around. Right now, it's pretty barren for TVs that require HDMI support and have VRR.


2 hours ago, legend said:

 

That's already not true since we have TVs that cherry-pick features from 2.1 :p

 

They’re using modified ports that will likely identify as 2.1 ports to the device they are connected to :p 

 

Quote

 

Point is, I haven't seen anything that indicates it's not technically possible. There may be a reason it isn't possible (e.g., it must go through the new data encoding), but thus far, none of us seem to have found anything that indicates that is the case.

 

There’s nothing that indicates that it is possible currently, either, which is why I’m hesitating and saying “let’s not get our hopes up”. I don’t believe I ever said it wasn’t possible, just not to assume or be overly optimistic about it :p

Especially since nVidia would rather you need to buy a new card to use 2.1 VRR than put out an update for their current crop, even if it is possible :sadsun: 


9 minutes ago, Spork3245 said:

 

They’re using modified ports that will likely identify as 2.1 ports to the device they are connected to :p 

 

They won't, though, because it doesn't support all 2.1 features.

 

Quote

 

There’s nothing that indicates that it is possible currently, either, which is why I’m hesitating and saying “let’s not get our hopes up”. I don’t believe I ever said it wasn’t possible, just not to assume or be overly optimistic about it :p

Especially since nVidia would rather you need to buy a new card to use 2.1 VRR than put out an update for their current crop, even if it is possible :sadsun: 

 

And I never said "assume it's happening", nor to be optimistic. I too have remarked on Nvidia being Nvidia. So what are we talking about then? :p 


6 minutes ago, legend said:

 

They won't, though, because it doesn't support all 2.1 features.

 

 

I mean to say that it’ll likely appear as such from the standpoint of bandwidth and recognition, missing features or not: it’ll just be a fake one. :p 

 

Quote

And I never said "assume it's happening", nor to be optimistic. I too have remarked on Nvidia being Nvidia. So what are we talking about then? :p 

 

Not really sure, I said this like a dozen posts ago:

 

22 hours ago, Spork3245 said:

To my knowledge, the GTX 1xxx series are straight-up unmodified HDMI 2.0 ports. Though I’m not certain what’s required for the VRR to be supported, I do have doubts it’s purely software dependent.

 

12 hours ago, Spork3245 said:

I just wouldn’t get overly hopeful that current HDMI 2.0 video cards will retroactively support 2.1’s VRR or any 2.1 feature until something official is stated.

 

:p 


9 minutes ago, Spork3245 said:

 

I mean to say that it’ll likely appear as such from the standpoint of bandwidth and recognition, missing features or not: it’ll just be a fake one. :p 

 

Identifying that you support a feature is not identifying as 2.1. And that you can support a feature without identifying as a "2.1 port" is exactly what I'm describing.

 

Quote

 

Not really sure, I said this like a dozen posts ago:

 

 

 

:p 

 

To which I responded a dozen posts ago

 

12 hours ago, legend said:

I wouldn't say I'm getting overly hopeful, because technically feasible doesn't mean Nvidia will do it. I'm surprised they went this far as it is. But I don't think we should write off the possibility either.

 

:p 


1 minute ago, legend said:

 

Identifying that you support a feature is not identifying as 2.1. And that you can support a feature without identifying as a "2.1 port" is exactly what I'm describing.

 

Ehhh, I think we’re having a miscommunication. I’m saying the firmware and the way the port is set up may cause it to show as 2.1 if checked, much like those early crappy HDMI 2.0 cables that showed up as 2.0 but couldn’t carry HDR.

 

1 minute ago, legend said:

 

To which I responded a dozen posts ago

 

 

:p 

 

Yes and you kept replying with questions and info so I kept replying :p 


15 minutes ago, Spork3245 said:

 

Ehhh, I think we’re having a miscommunication. I’m saying the firmware and the way the port is set up may cause it to show as 2.1 if checked, much like those early crappy HDMI 2.0 cables that showed up as 2.0 but couldn’t carry HDR.

 

The ports are the same. The only thing that's different is what information is sent on them and, in some cases, their encoding. HDMI allows devices to cherry-pick features without being fully 2.1 compliant precisely because it's just a matter of being able to interpret and then process that information from the spec, and not all features require the higher bandwidth, which is what requires different cables and more hardware support. You can't tell anything from the port alone.
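
(A toy sketch of the "check the advertised feature, not the version" idea; the field names below are invented for illustration and are not the actual EDID/VSDB fields:)

```python
# Toy capability check: a display advertises individual features, and the
# source asks about the specific feature it needs rather than looking for
# a "2.0 vs 2.1" version label. Field names are invented, not real EDID fields.
sink_capabilities = {
    "vrr": True,              # supports variable refresh rate
    "full_bandwidth": False,  # does NOT support the full 48 Gbps link mode
    "allm": True,             # auto low-latency mode
}

def source_can_use(feature: str, caps: dict) -> bool:
    return caps.get(feature, False)

print(source_can_use("vrr", sink_capabilities))             # True
print(source_can_use("full_bandwidth", sink_capabilities))  # False
```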

 

 

Quote

 

Yes and you kept replying with questions and info so I kept replying :p 

 

I'm not sure characterizing my subsequent responses as replying with questions is especially reflective. :p 


On 1/3/2019 at 9:56 PM, skillzdadirecta said:

Glad I waited... might be time to retire the Panny Plasma

 

On 1/4/2019 at 5:49 AM, legend said:

 

This is exactly the TV I would be retiring. Thing had some serious fucking legs on it.

 

On 1/4/2019 at 8:10 AM, skillzdadirecta said:

I have a 4K set in my living room and a Panny Plasma in the bedroom. If I get an OLED, my current 4K would go in the bedroom replacing the Panny Plasma. Those Pannys are damn near indestructible... it's outlasted TVs I've bought afterwards.

 

I finally wound up giving away the Panny plasma I bought in 2010, and the only reason I stopped using it is that Panasonic announcing they weren't making any more after 2013 induced me to buy a new one while I still could, because I was pretty sure I'd want a bigger TV before OLED was affordable. If not for that, I had absolutely no reason to get rid of the 2010 unit. Went 46"->60", and I only regret that I didn't go all the way to 65", but 60" was already pushing it for the apartment I was in at the time.


1 minute ago, legend said:

 

The ports are the same. The only thing that's different is what information is sent on them and, in some cases, their encoding. HDMI allows devices to cherry-pick features precisely because it's just a matter of being able to interpret and then process that information from the spec, and not all features require the higher bandwidth, which is what requires different cables and more hardware support. You can't tell anything from the port alone.

 

 

Sorry, I’m fighting a sinus infection so I’m seriously mis-wording what I mean: overall I’m talking about the available bandwidth. That is to say, 2.1 VRR may operate purely based on having a 120 Hz signal available to degrade from - or maybe it doesn’t, I don’t know. There’s little info on how 2.1 VRR works. I’m merely listing potential reasons it may not be possible for it to retroactively work at a lower refresh rate from a 2.0 port.

 

1 minute ago, legend said:

 

I'm not sure characterizing my subsequent responses as replying with questions is especially reflective. :p 

 

Eh?

 

16 minutes ago, legend said:

At any rate, I feel like now we're arguing for the sake of arguing and otherwise largely agree. So I'll leave it here :p 

 

I haven’t been arguing at all :p 


This is really easy guys. Freesync does a doink blink raggem method as opposed to the vrr standard of pi forgo lethongy, what hdmi 2.1 does is a herphompy diggly localop. The cable do hibbity hoppity at the rate of malion pop. 

 

The idea is to get doink blink raggem, pi forgo lethongy, herphompy diggly localop, and hibbity hoppity, all running at malion pop. Then there isn’t any reason to worry about any of this. 


1 hour ago, stepee said:

This is really easy guys. Freesync does a doink blink raggem method as opposed to the vrr standard of pi forgo lethongy, what hdmi 2.1 does is a herphompy diggly localop. The cable do hibbity hoppity at the rate of malion pop. 

 

The idea is to get doink blink raggem, pi forgo lethongy, herphompy diggly localop, and hibbity hoppity, all running at malion pop. Then there isn’t any reason to worry about any of this. 

 

Sir your pudding pop is most DEFINITELY where it don't belong right now. 


2 hours ago, stepee said:

This is really easy guys. Freesync does a doink blink raggem method as opposed to the vrr standard of pi forgo lethongy, what hdmi 2.1 does is a herphompy diggly localop. The cable do hibbity hoppity at the rate of malion pop. 

 

The idea is to get doink blink raggem, pi forgo lethongy, herphompy diggly localop, and hibbity hoppity, all running at malion pop. Then there isn’t any reason to worry about any of this. 

 

This was super understandable. Thanks! 👍🏿


On 1/8/2019 at 2:12 PM, Mr.Vic20 said:

Reading through this thread, I marvel at what a middle-American Jabroni would make of all this?! :lol: Good luck, marketers! 

 

They are just going to buy a $600 LED from Walmart or Target, which will honestly still provide a solid picture nowadays.


5 minutes ago, elbobo said:

 

They are just going to buy a $600 LED from Walmart or Target, which will honestly still provide a solid picture nowadays.

True, very true! In my mind, all of this is stuff that's hard to sell. It lights our collective fires, but is nothing to the general public. On some level, the next generation of consoles will have this same problem. I'm curious to see what the defining reason for the next generation of consoles will be.


7 hours ago, stepee said:

This is really easy guys. Freesync does a doink blink raggem method as opposed to the vrr standard of pi forgo lethongy, what hdmi 2.1 does is a herphompy diggly localop. The cable do hibbity hoppity at the rate of malion pop. 

 

The idea is to get doink blink raggem, pi forgo lethongy, herphompy diggly localop, and hibbity hoppity, all running at malion pop. Then there isn’t any reason to worry about any of this. 


 
