cusideabelincoln

Members
Content Count: 7,453
Community Reputation: 619 (Excellent)


  1. I'm leaning towards JJ being hamstrung by Disney, forced to include a scene for every "the fans didn't like THIS in TLJ" complaint, and I wouldn't be surprised if the studio had the final say in the editing process. The movie really does look like they test-screened several scenes independently of one another and then just cut the favorites together.
  2. This is useful for competitive players, pros, or people wanting to take full advantage of a 240Hz monitor. Otherwise it's overkill, and it will probably stay overkill for a long time as devs start hammering our GPUs even harder with ray tracing. Even at 1440p with LOW settings, I am GPU limited with a 1080 Ti in games like COD Warzone and Apex Legends.
  3. Right now you'll be looking at 3 options:
- ~$400-450: Ryzen 9 3900X, 12 cores
- >$500: Core i9-10900K, 10 cores
- ~$700: Ryzen 9 3950X, 16 cores
The i9-10900K will be faster in Photoshop and games, and close to but typically slower than the 3900X in video encoding. The Ryzen processors should be faster in most video editing situations (although slower in some), definitely faster in any kind of 3D rendering, and mostly faster in compiling. Of course, if your budget is even higher, you can get 18, 24, 28, 32, and 64 core processors.
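A quick back-of-the-envelope way to weigh those three options is price per core. A minimal Python sketch, using the rough street prices quoted above (not official MSRPs, and prices move around):

```python
# Price-per-core comparison of the three options above.
# Prices are the approximate street prices from the post, in USD.
options = {
    "Ryzen 9 3900X": (450, 12),
    "Core i9-10900K": (500, 10),
    "Ryzen 9 3950X": (700, 16),
}

for name, (price, cores) in options.items():
    print(f"{name}: ${price / cores:.2f} per core")
```

By this metric the 3900X is the cheapest per core and the 10900K the most expensive, which is why the Intel chip only makes sense if its gaming/Photoshop lead matters to you.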
  4. Yes, AMD is very competitive. They now match Intel in performance per clock, they offer more cores/threads at the same or lower price, and they have lower power consumption. Intel's advantage is only in gaming, and that comes down to two things: their CPUs reach higher clock speeds and they have better memory latency. But considering gaming performance at 4K or 1440p is mainly dictated by how powerful your GPU is, there is parity between any $200+ CPUs.
  5. Thanks for the replies, they really do help. I'm leaning toward HDR, with HDR10 in particular, just being a spotty standard, as from my limited research Dolby Vision and HDR10+ are superior anyway. The only problem is I never expected HDR to look, subjectively, worse than non-HDR. I have tried adjusting every video setting, but for instance in Thor no amount of tweaking the HDR mode makes Surtur look red like I think he is supposed to be. He's always more on the orange side, and not as bright. And if I try to make the image brighter in HDR10, I can't get the color saturation to change much. I have tested the HDR10 versions of these movies on different apps and different HDMI inputs, and the results are fairly consistent across the board: Dolby Vision looks brighter and more colorful than HDR10, with only minor differences between the devices I'm using. I've tried:
- Vudu on the built-in Roku and Nvidia Shield
- Movies Anywhere on Roku, Firestick, and Shield
- Prime Video on Roku, Firestick, and Shield
- Plex on literally everything: Windows 10 PC, Roku, Firestick, Shield
- Kodi on Windows 10
I don't yet own a 4K Blu-ray player. I was going to wait it out for the PS5 and use that, but if the PS5 doesn't support Dolby Vision, then it might be pointless and I might need a device that can play back both HDR10 and Dolby Vision. Thanks! I'm glad there is some consistency between different panels, in that you guys are seeing the same thing I am in Thor: Ragnarok. Also, I think I mislabeled my Star Wars comparison: the darker one is actually Dolby Vision and the brighter one is non-HDR, so it seems like we are getting the same result for that movie too. This makes me feel less worried. I really don't want to box up this TV and send it back. Not only is it a hassle, but like I said, when it displays normally the picture looks amazing, and I'd rather not get another TV that could show the same HDR behavior but have worse output due to the panel lottery.
I did make a post in the thread about the exact model of TV I have, but no one has replied (to me). I am definitely going to search out a general HDR topic, though, to get some opinions. But I'd like to get better-quality pictures or video before I do that, to better showcase my example. And nothing was going through a receiver; all my comparisons were done with the built-in Roku and devices connected directly to the HDMI ports. Cool, but I'm not seeing where I can get that app on Roku or Android. Do you have to own an Apple device? Also, I think I found a bug: some of my movies are offered as Dolby Vision in Movies Anywhere, but if I play them my TV goes into HDR mode. If I play back the same movie in Vudu, my TV goes into Dolby Vision mode as it should.
  6. So I bought a new TV recently and I am in the process of diagnosing what I think is a problem with how it displays HDR content. I'd rather not hassle with returning it for a different unit, because SDR and Dolby Vision content look phenomenal on this set. It's a TCL, and I've heard they can have quality control issues. The problem I'm seeing is that HDR10 videos have a dark tint compared to standard non-HDR versions of the same movie. On the flip side, if a movie is played in Dolby Vision, it looks great and obviously better than its non-HDR counterpart. But the only source of Dolby Vision content I can get right now is Vudu. So, if at all possible: if anyone has some Dolby Vision or HDR movies on Vudu, and also has those same movies on other apps (Movies Anywhere), could you tell me if there's a noticeable difference in color vibrancy between the video on Movies Anywhere, which AFAIK only supports HDR10, and on Vudu, if that movie happens to be in Dolby Vision?
And to make this even more comparable, here are some of the movies I've tested:
Avengers: Infinity War (big differences)
- On Vudu with Dolby Vision (DV) it looks great
- On Movies Anywhere with HDR it looks dull
- Ripped to my local hard drive and played through Plex or Kodi it looks dull
- The non-HDR 1080p version looks fine, but not quite as good as DV
Thor: Ragnarok (big differences)
- On Vudu with DV it looks great
- On Movies Anywhere with HDR it looks dull
- On Plex/Kodi it looks dull
- The non-HDR 1080p version looks fine, but not quite as good as DV
Aquaman (smaller differences)
- On Vudu with DV it looks great
- On Movies Anywhere with HDR it looks closer to non-HDR
- On Plex/Kodi it looks closer to non-HDR
- The non-HDR 1080p version looks fine, but not quite as good as DV
Star Wars: A New Hope (recent 4K release, smaller differences)
- On Vudu with DV it looks great
- On Movies Anywhere with HDR it looks closer to non-HDR
- On Plex/Kodi it seems to have a greyish tint
- The non-HDR 1080p version looks fine, but not quite as good as DV
Mad Max: Fury Road (big difference)
- I don't have it on Vudu, so I can't compare it to DV
- On Plex/Kodi in 4K UHD HDR mode it looks like I'm watching with lightly tinted sunglasses on
- On Plex/Kodi in 1080p non-HDR it looks a lot more vibrant, colorful, and saturated
- Playing the Blu-ray on my PS4, it looked like the Plex/Kodi non-HDR version, although a bit over-saturated because I haven't tuned the PS4 for video playback
And speaking of the PS4, I popped in Spider-Man and compared it with and without HDR mode. The HDR mode is definitely darker overall and the colors are less vibrant. Does this happen with anyone else's PS4? I have the base slim model, not the Pro. Hopefully later today I can take some decent-quality videos to showcase the difference, but here are a couple of pictures. The Thor: Ragnarok comparison is indeed as drastic as the picture makes it out to be.
Like Hulk would say, in Dolby Vision Surtur looks like real raging fire while in HDR he looks like a smoldering fire.
  7. After playing Blacklist and then trying to go back to Chaos Theory, I was quite surprised at the difference. The OG games, while revolutionary, are simply not as good as Blacklist. They feel clunky and limiting, and the levels are barren in comparison. If you're not hung up on nostalgia, Blacklist is far and away the best Splinter Cell game.
  8. Buying right now, you have to get a B550 or X570 motherboard. AMD is currently stating they will NOT support their next CPUs on X470, B450, and older motherboards.
  9. Consumer-grade 32GB sticks of DDR4 are available at reasonable prices. We could throw 128GB of system RAM at these games and turn 100GB of it into a RAM drive that fits one game at a time. Except COD MW. Fuck that 200GB of nonsense.
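On Linux, a sketch of what that RAM drive could look like with tmpfs (the 100GB size and the /mnt/ramdisk mount point are just illustrations of the numbers above; on Windows you'd reach for a third-party tool such as ImDisk instead):

```shell
# Create a 100GB RAM-backed drive at /mnt/ramdisk (requires root).
# tmpfs allocates memory lazily, so it only consumes RAM as files are written.
sudo mkdir -p /mnt/ramdisk
sudo mount -t tmpfs -o size=100G tmpfs /mnt/ramdisk

# Copy a game's install directory in, then point the launcher at it.
# Contents vanish on unmount/reboot, so treat it as a cache, not storage.
```

Since tmpfs contents are volatile, you'd re-copy the game after every reboot, which is the practical catch with this approach.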
  10. I have zero experience as well, but it looks like an Amazon service will fit the bill, particularly one of the Compute (C5) instances, which offer up to 96 cores. I'd say your other alternative is Microsoft, which can offer 120 AMD Epyc cores: https://azure.microsoft.com/en-us/pricing/details/virtual-machines/windows/ These services are popular, so I imagine both Microsoft and Amazon have made the entire process user-friendly and simple, with plenty of support should you need it.
  11. They've been the same price for years, for the same tier of performance that you want. There's just no way to make them cheaper, as there is no silicon chip that can take advantage of a die shrink and cost less to make. It's pretty much just pure metal with some capacitors, and the cost of those raw materials hasn't moved in a long time.
  12. This is a bit more believable, though, because of the nature of the sarlacc pit. As a grown-up, it's ridiculous to think you would slowly die over 1,000 years, because you'd die from lack of water in less than a week anyway. So if anyone is going to escape the pit, it's going to be someone with a bunch of gadgets and a jet pack, like Boba.
  13. I thought they could swing, but not this hard. They are already forcing Intel to drop prices on their next CPU releases, as Intel is going to release 4C/8T CPUs and call them i3s. Oh, remember how the lowly i3 was 2C/4T, locked down, no turbo boost, garbage?