Thanks to Ridge for sponsoring today's video! Save up to 40% and get Free Worldwide Shipping until Dec. 22nd at www.ridge.com/LINUS
@theboogerbomb
Жыл бұрын
thank you ridge
@nukeshine
Жыл бұрын
ridge sucks!!!!!!!!
@gibbyhale4217
Жыл бұрын
Why does it look like the inside of a vagina
@YTxGalaxy
Жыл бұрын
6:43 "Top 5 - 10%", me who's top 0.1% and still hasn't become a pro player...
@0opsyt
Жыл бұрын
if you render a little bit off screen you might not need stretching
@tubehellcat
Жыл бұрын
"He owns a display" - that's gotta haunt him forever, like the "you're fired" for Colton 😂 Love it 😁
@fsendventd
Жыл бұрын
which video is "he owns a display" from again? having a hard time finding it
@Jeffrey_Wong
Жыл бұрын
@@fsendventd He's been doing a lot of monitor unboxing videos on ShortCircuit, I think it's from one of the Alienware monitor videos
@GigaSeadramon
Жыл бұрын
@@fsendventd it's from the 8k gaming video
@AnuraagDaniel
Жыл бұрын
@@Jeffrey_Wong yeah, it's also a direct quote, he says "I own a display" in the dlss 3.0 video
@TwinShards
Жыл бұрын
Yeah, I got a really good laugh out of those 4 words under his name.
@QuaziInc
Жыл бұрын
I feel like I just had my mind blown wide open at the possibilities. This is one of my favorite LTT videos. It's hard for me to find such a technical concept so well explained. Well done.
@matiitam111
Жыл бұрын
On the other hand: this is a static scene, with no animated textures, no characters moving around, no post-process effects, no particles, etc. Porting this to a modern game would be similar to what Assassin's Creed Syndicate (or a later one, I don't remember) did with clothing physics: capped it at 30fps while the game ran at 60. The effect would look similar to what modern games do to animations when characters are too far away for the game engine to update them as frequently as the game's current fps. So I'm skeptical. Also, nice GPU you got there, can't wait for the review ;)
@jacobc5747
Жыл бұрын
it is effective in VR games designed with it, so it will probably be effective on normal PC games if it's kept in mind.
@kazahesto
Жыл бұрын
Yeah, this isn't new tech. Pretty much every web browser does something similar when scrolling or zooming, where most content is static, and it looks terrible when a heavy webpage tries to do parallax scrolling on an underpowered system. The whole "nobody thought about it" angle in this video is strange and patronising.
@methejuggler
Жыл бұрын
I'd imagine that a lot of the edge stretching could be mitigated by rendering slightly more than is displayed on the screen, so there's a bit extra to use when turning before having to guess
@Blancdaddy
Жыл бұрын
that's an interesting thought. this would bring us back to the age of overscan i feel lol
@RyoLeo
Жыл бұрын
@@Blancdaddy reject DLSS, return to overscan
@LasticDJ
Жыл бұрын
I think I remember 2kliksphilip talking about / showing this in his video: just have the part just outside of your FOV rendered at a lower resolution and use that instead of most of the stretching, because you can't see the detail there anyway.
@odinsplaygrounds
Жыл бұрын
I just commented something similar: just have it render 10% extra, which is cropped off by your display anyway, so whatever "stitching" it's doing is outside your view. Would love to see this. Combine that with foveated rendering, so the additional area rendered outside the view is lower resolution.
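A back-of-the-envelope sketch of how much extra image that buffer would need (`overscan_fraction` is an invented helper, assuming a simple pinhole camera and a worst-case turn speed):

```python
import math

def overscan_fraction(fov_deg, yaw_deg_per_s, render_fps):
    """Extra image width needed on each side (as a fraction of the
    base width) so a camera turning at yaw_deg_per_s never runs out
    of real pixels between two full renders at render_fps."""
    yaw_per_frame = yaw_deg_per_s / render_fps      # degrees turned per real frame
    half_fov = math.radians(fov_deg / 2)
    base = math.tan(half_fov)                       # image-plane half-width, focal length 1
    widened = math.tan(half_fov + math.radians(yaw_per_frame))
    return (widened - base) / base

# 90° FOV, a 90°/s turn, 30 real fps: ~11% extra per side
print(f"{overscan_fraction(90, 90, 30):.3f}")       # 0.111
```

At 90° FOV, a 90°/s turn at 30 real fps eats about 3° of view per real frame, which works out to roughly 11% extra width per side, in the same ballpark as the 10% figure suggested above.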
@Blancdaddy
Жыл бұрын
imagine the longevity of GPUs if this technology ever becomes standardized. man.
@splatlingsquid4595
Жыл бұрын
I absolutely love the recognition that kliksphilip and his brothers have been getting. It really is an amazing idea and would make everything so much better!
@KingMuttley
Жыл бұрын
This can make a huge difference to desktop gaming for sure, mostly in the budget and aging mid-tier builds, but can you imagine what this could do for the Steam Deck and other handhelds? Not just make games smoother, but the amount of battery life this could save would be a huge advantage. Would love to see how far this tech can be pushed and one day even become as widely adopted as DLSS and FSR
@arconreef
Жыл бұрын
I think the reason this feature hasn't caught on is because most gamers are satisfied with 60 fps in single player titles, and inputs being decoupled from enemy position is not something developers of competitive multiplayer games want to deal with. Even the best designed multiplayer shooters have hitreg issues, and asynchronous reprojection would make that problem even harder to solve.
@Vanayr
Жыл бұрын
I swear I know the “like our sponsor!” line is coming but I laugh every time.
@millermason2068
Жыл бұрын
I'm convinced that this channel has a team of people dedicated to making segue sponsor transitions
@MrBeetsGaming
Жыл бұрын
I've been wondering why this wasn't a thing for a long time but I never actually knew how it worked so I just assumed it wasn't a possibility.
@SparkRattle
Жыл бұрын
Now more than ever in my life, I want a yorkshire terrier named Linus.
@timquestionmark
Жыл бұрын
To avoid confusion, we should all agree to call the actual framerate the framerate and the rate of the async reprojection something like "projection rate"
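The distinction suggested here can be made concrete with a toy timeline (`simulate` is an invented name; everything is illustrative): the reprojector presents something every display refresh, while real frames arrive at the slower render rate.

```python
def simulate(display_hz, render_fps, seconds=1):
    """Toy timeline: the reprojector presents every display refresh,
    but a *new* rendered frame only arrives at render_fps. Returns
    (fresh_frames, reprojected_frames) shown in `seconds`."""
    fresh = reproj = 0
    next_render = 0.0
    for tick in range(display_hz * seconds):
        t = tick / display_hz
        if t >= next_render:            # a real frame finished rendering
            fresh += 1
            next_render += 1 / render_fps
        else:                           # re-present the last frame, warped to the new camera
            reproj += 1
    return fresh, reproj

print(simulate(144, 30))  # (30, 114)
```

So at 144 Hz with a 30 fps render rate, the "projection rate" is 144 but the framerate is still 30: 114 of the presented images are warps of an old frame.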
@DeathlyShadowXD
Жыл бұрын
This would be great for singleplayer rpgs as I usually play for the visuals
@ImHeadshotSniper
Жыл бұрын
I thought it was genius when 2kliksphilip described separating the camera fps from the game fps in order to reduce input lag.
@bloodraven2537
Жыл бұрын
Hmm, couldn't we use AI that can predict camera movement and tell the GPU to render a bit more in the direction the camera is moving for the next frame, to reduce that stretchiness at the edges? Or just have an option to render a bigger area than our screen - it might come with some minor performance hit, but it would make the technique even more seamless.
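A crude sketch of that prediction idea, no AI needed: keep the total overscan budget fixed but shift it toward the current turn direction (`biased_overscan` and its saturation constant are invented for illustration):

```python
def biased_overscan(yaw_velocity, total_extra):
    """Split a fixed overscan budget (fraction of frame width) between
    the left and right edges, biased toward where the camera is heading.
    yaw_velocity > 0 means turning right; 90 deg/s is an arbitrary
    soft-saturation constant."""
    bias = 0.5 + 0.5 * (yaw_velocity / (abs(yaw_velocity) + 90.0))
    right = total_extra * bias
    left = total_extra - right
    return left, right

# Turning right at 90°/s: most of a 10% budget goes to the right edge
print(tuple(round(x, 3) for x in biased_overscan(90.0, 0.10)))  # (0.025, 0.075)
```

A real implementation would predict from recent mouse velocity rather than the instantaneous value, but the principle is the same.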
@coocooman
Жыл бұрын
Imagine Async combined with dlss machine learning frames.
@Ph42oN
Жыл бұрын
If this kind of stuff were available in most games, maybe it would actually make sense to buy a 240Hz or higher monitor.
@zzzjz8437
Жыл бұрын
Ok, hear me out: what if we combined checkerboard rendering and interlaced rendering at half the res, and then upscaled it with FSR 2.1? So basically render a whole frame with 12.5% or even fewer of the pixels!
@rodrigoteles1409
Жыл бұрын
Plouffe's "He owns a display" gag is always going to crack me up.
@Lu-db1uf
Жыл бұрын
Don't all of them own displays? It's a tech media company, I'd hope they do.
@Fay7666
Жыл бұрын
There was a video a couple of weeks ago about his _display._
@bigdoggo5827
Жыл бұрын
@@Lu-db1uf You know.. that's the joke
@reeepingk
Жыл бұрын
@@Lu-db1uf But his display is.... *special*
@thebyzocker
Жыл бұрын
@@Lu-db1uf he bought the Alienware mini-LED one, and he's proud that he was one of the first to get it, and now it's a meme
@Blap7
Жыл бұрын
2kliksphilip and LTT is a crossover I never knew I needed. Make it happen.
@stupot46
Жыл бұрын
Bump lol
@Mraz565
Жыл бұрын
Wonder if it can be used with CS:GO, whether Valve allows it or you brute-force it.
@morfgo
Жыл бұрын
They won't. They just use him and his ideas without even 1 full second of credit
@mipacem
Жыл бұрын
@@morfgo its not malicious
@isaacolukanni
Жыл бұрын
@@morfgo Dude, they literally credited him and his video in the description!
@mauromerconchini
Жыл бұрын
I'm so happy Phil put a spotlight on this concept, and I'm even happier that a channel like LTT is carrying that torch forwards.
@SirDragonClaw
Жыл бұрын
I tried to build something like that demo a few years ago, but I was trying to use motion vectors + depth to reproject my rendered frame, which I never got to work correctly. In my engine I rendered a viewport larger than the screen to handle the issue with the blackness at the edges, and then was going to use tier 2 variable rate shading to lower the render cost of the parts beyond the screen bounds. But VRS was not supported in any way in my build of MonoGame, which is what my engine was built upon, so that was another killer for the project. I am so glad that Phil popularised the idea, and it's awesome that someone else managed to get something like this working. How he did it in one day I will never know; I spent like 3 weeks on it and still failed to get it working correctly. I should find my old demo and see if I can get it compiling again.
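The rotation-only part of that reprojection, the piece that needs no depth buffer or motion vectors, is comparatively simple; a pinhole-camera sketch (invented helper names, edge pixels repeated to mimic the stretching seen in the demo):

```python
import math

def yaw_to_pixel_shift(yaw_delta_deg, fov_deg, width_px):
    """Horizontal pixel shift that re-centres an already-rendered frame
    after the camera yaws by yaw_delta_deg (pinhole model, rotation only;
    pure rotation needs no depth, which is why it's the easy case)."""
    f = (width_px / 2) / math.tan(math.radians(fov_deg / 2))  # focal length in px
    return f * math.tan(math.radians(yaw_delta_deg))

def reproject_row(row, shift_px):
    """Shift one scanline; edges repeat the last pixel (the 'stretching')."""
    s = int(round(shift_px))
    if s > 0:
        return row[s:] + [row[-1]] * s      # camera turned right: pull image left
    if s < 0:
        return [row[0]] * -s + row[:s]
    return row

print(round(yaw_to_pixel_shift(1.0, 90.0, 1920), 1))  # 16.8 px per degree at 90° FOV
```

Translation (as opposed to rotation) is where depth comes in, since near and far pixels have to move by different amounts, which is the part that's genuinely hard to get right.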
@tillson8686
Жыл бұрын
I asked in a VR subreddit about a year ago why nobody was making async reprojection for computer games, and people gave me shit about it, like "wouldn't work that way, the idea is stupid, just not possible, etc.", so I gave up. Glad I asked the right people
@InfernosReaper
Жыл бұрын
There are a lot of people who like to make the seemingly safe bet of saying "it won't work" without actually knowing, because they aren't the experts they want to pretend they are. If a person speaks in absolutes without even trying to explain why, chances are they are not truly experts. They might know some things and even genuinely think themselves to be experts, but in reality they have much more to learn.
@kristmadsen
Жыл бұрын
People that reply to things on the internet tend to respond that way to new ideas.
@kazioo2
Жыл бұрын
Maybe you got that answer because it has been discussed and tested like a hundred times since John Carmack invented it in 2012. There are serious issues with this that are much less problematic in VR.
@ffwast
Жыл бұрын
The right idea is not asking redditors anything.
@DJayFreeDoo
Жыл бұрын
@@kristmadsen It was the same when the first computer mouse was invented. The higher-ups said it's useless, why would anyone need this? And bam! Everyone has a mouse or a trackpad.
@thepillowmancer
Жыл бұрын
Not mentioned in the video: you can render frames at a slightly higher FOV and resolution than the screen, so that there's some information "behind" the monitor's edge. It won't save you from turning 180 degrees, but it will fix most of the pop-in for a very slight hit to performance
@lyrilljackson
Жыл бұрын
This is not what PC gaming is supposed to be about: using VR hand-me-down tech. And the VR and PC scenes shouldn't be segregated, each minding its own scene, either, if y'all suddenly mindlessly hurrah at this crossover weirdo-fiesta
@martinkrauser4029
Жыл бұрын
@@lyrilljackson What do you mean?
@lyrilljackson
Жыл бұрын
@@martinkrauser4029 asmh
@JustSomeDinosaurPerson
Жыл бұрын
@@lyrilljackson Brainlet take
@latinodollar
Жыл бұрын
@@JustSomeDinosaurPerson Careful... thats a [Lvl. 163] PC Master-Supremacist, the bane of mobile, console, and vr gamers.....
@Nabalazs
Жыл бұрын
I am so happy that Philip managed to get the message THIS far out. I do fear that this tech might have issues with particles and moving objects and the like, but when you mentioned that we could use DLSS to ONLY FILL IN THE GAPS, my jaw dropped. That's so genius! I really hope this is one of those missed-opportunity oversights in gaming, and there isn't some major issue behind it not being adopted yet.
@antikz3731
Жыл бұрын
Exactly. On Linux this exact setup has been available for the last year. It makes a massive difference
@hubertnnn
Жыл бұрын
You don't need to worry about particles, just render them later. The whole idea behind this solution is to split rendering into two phases: 1. render the scene (the expensive 3D phase); 2. render the final frame from pictures of the scene (cheap 2D rendering). Just move all particle and HUD rendering to phase 2. To be honest, I would suggest going even further and adding a phase 1.1 where you use DLSS to draw the less important background stuff; this way you can render the important objects in 4K and background objects (buildings, grass, trees, etc.) in 720p or lower and just upscale with DLSS. Or go even further and render each layer at a different framerate: background at 30fps, objects at 60fps, and the final image at 120fps.
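That layered idea can be sketched as a toy compositor (invented names; each layer's rate is assumed to divide the display rate):

```python
def compose_frame(tick, layers, display_hz=120):
    """layers: (name, update_hz) pairs; update_hz must divide display_hz.
    Returns which layers render fresh on this display tick and which
    are reused (reprojected) from their last render."""
    fresh = [n for n, hz in layers if tick % (display_hz // hz) == 0]
    reused = [n for n, hz in layers if tick % (display_hz // hz) != 0]
    return fresh, reused

layers = [("background", 30), ("characters", 60), ("hud+particles", 120)]
print(compose_frame(1, layers))  # only the 120 Hz layer is fresh on tick 1
```

On tick 0 everything renders; on most other ticks only the cheap layers do, which is where the savings come from.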
@GeneFJacket
Жыл бұрын
This was my first thought, too. Combining Async with DLSS/FSR could potentially be the actual magic bullet we're looking for.
@MrDavibu
Жыл бұрын
@@hubertnnn I mean, "cheap" is relative: if all animated and moving objects and particles have to be rendered later on, it won't be that cheap. Especially if it means that transparent objects have to be rendered after that. Also, screen-space reflections of animated objects will disappear if they are not part of an animated object themselves. Not saying it's not interesting, but it's definitely not a solution without compromises.
@tralphstreet
Жыл бұрын
@@antikz3731 ??? Explain. I use Linux and there's no such thing.
@moonsicklegaming7990
Жыл бұрын
2kliksphilip is an unsung hero; his DLSS coverage is also some of his best content
@MrChanw11
Жыл бұрын
His upscaling content is the best ;)
@Aeroxima
Жыл бұрын
Never seen either of those but I agree
@Diie89
Жыл бұрын
Personally super excited to see 2kliksphilip's video referred to in an LTT video. A lot of Philip's content is really high quality, especially the videos where he covers DLSS and upscaling, as mentioned earlier. Can't recommend checking it out enough!
@elise3455
Жыл бұрын
2kliksphilip had a good idea, but 3kliksphilip is more advanced in every way!
@HonoredMule
Жыл бұрын
@@elise3455 3klicksphilip is just more work. Both will be _automatically_ obsolete when 0clicksphilip releases.
@heeerrresjonny
Жыл бұрын
You might be able to hide a lot of the edge warping by basically implementing overscan, where the game renders at a resolution that's 5-10% higher than the display resolution but crops the view to the display resolution. In theory it should be only a very minor framerate hit, since you're just adding a relatively thin border of extra pixels.
@carlo6953
Жыл бұрын
The size of border you would need to eliminate the edge warping would probably impact performance more than just using a higher refresh rate to lower the amount of warping in the first place.
@ABonYT
Жыл бұрын
The magic combo there would be foveated rendering alongside the async reprojection with overscan. Which games it would make sense for will inevitably be a case-by-case thing, but the performance gains would be massive.
@batuhancokmar7330
Жыл бұрын
@@carlo6953 That assumes you'd need the same resolution for the overscan. If the game is rendered at 45° FOV at 1440p, render an overscanned area between 45° and 90° FOV at 360p. You don't need a lot of detail, just something to make valid guesstimates within that motion blur until a proper frame fills up the screen.
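Rough pixel math for that tradeoff (`border_cost` is a hypothetical helper; real shading cost won't scale exactly with pixel count):

```python
def border_cost(width, height, frac, scale):
    """Extra shading cost, as a fraction of the base frame's pixels, of
    an overscan border `frac` wide on every edge, shaded at `scale`
    linear resolution (0.25 = quarter resolution per axis)."""
    base = width * height
    full = (width * (1 + 2 * frac)) * (height * (1 + 2 * frac))
    return (full - base) * scale * scale / base

# 10% border: ~44% more pixels at full res, only ~2.8% at quarter res
print(round(border_cost(2560, 1440, 0.10, 0.25), 3))
```

That's the core of the argument: a full-resolution border nearly halves your pixel budget again, while a heavily downscaled one is close to free.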
@WiggyWamWam
Жыл бұрын
Yes, definitely. Surprised they don’t do this.
@ccd03c
Жыл бұрын
I’m glad I wasn’t the only one thinking this
@cometor1
Жыл бұрын
Yay, 2kliksphilip and his brother 3kliksphilip finally get some well deserved attention!
@kazioo2
Жыл бұрын
The inventor already suggested using it for normal games in 2012. Then many people made experiments and demos over the last decade. This one finally got some traction, so kudos for that, but it's nothing new.
@cometor1
Жыл бұрын
@@kazioo2 Truth be told, I'm a long-time klik empire supporter, and I'm always happy if anything good happens to him, like getting mentioned by another creator I like. The technology is interesting and it needs traction to take off, but I actually care more about Philip than the tech.
@semick4729
Жыл бұрын
Brother?
@MrALjo0oker
Жыл бұрын
@@semick4729 yeah he has two brothers kliksphilip and 3kliksphilip
@saladgreens912
Жыл бұрын
@@MrALjo0oker Not sure if that is necessarily true, someone should get to the bottom of that. Valve, please fix.
Жыл бұрын
3kliks and LTT collabing is what the INTERNET NEEDS!!!!
@Qimchiy
Жыл бұрын
2kliks but either way Yes.
@RennyChuggs
Жыл бұрын
no it isnt
@cora2887
Жыл бұрын
@@Qimchiy its the same guy
@pair_of_fins
Жыл бұрын
@@cora2887 Erm, if you paid attention you would know they were brothers 🙄
Жыл бұрын
@@Qimchiy might as well get kliksphillip in here too ;)
@boysenbeary
Жыл бұрын
As someone who plays VR constantly, it’s nice to see this brought up for non-VR stuff
@KalkuehlGaming
Жыл бұрын
I know philip will see this and I know he will feel awesome. You have come a long way Philip. I am proud to be part of your community since your first tutorial videos.
@charliegroves
Жыл бұрын
Here's to Philip, love his videos on all 3 of his channels
@TuRmIx96
Жыл бұрын
Love him. His tutorials laid the base for my environment-artist gamedev job.
@fargoththemoonsugarmaniac
Жыл бұрын
kliki boy i love you
@jo_kil9753
Жыл бұрын
@@charliegroves more like 14 lol
@monkeywithocd
Жыл бұрын
This is great, it really explains some odd behavior I've noticed while playing VR games, and using it for flat games sounds like an awesome idea, especially for consoles.
@TechnologistAtWork
Жыл бұрын
Handhelds too. This would make any game on the Switch or Steam Deck run near-perfectly without having to tap into too much hardware power. Why are we not funding this? GPUs are the size of a gaming console nowadays, but they couldn't be bothered to solve these issues with much simpler and cheaper solutions?
@nktslp3650
Жыл бұрын
Yes ! Sometimes when the game is stuttery, you can still move freely but you can see the black screen ! Such a cool tech. It works really well, input latency is really important.
@nebelwaffel8174
Жыл бұрын
@@nktslp3650 yeah, when he showed the black borders I had a strong feeling of "i have seen this before", but i couldn't put a finger on it, until he mentioned VR.
@TheTrainMaster15
Жыл бұрын
Philip is revolutionising the way we think about gaming and game dev just with common sense
@Neurotik51
Жыл бұрын
what? nothing here is new
@TheTrainMaster15
Жыл бұрын
@@Neurotik51 using technology for VR with conventional monitors? I haven’t heard of that before
@Lunch-b0x
Жыл бұрын
been watching 3kliks for years. I'm glad he's getting some recognition.
@TehF0cus
Жыл бұрын
2kliks
@GreenzQe
Жыл бұрын
@@TehF0cus kliks
@veganssuck2155
Жыл бұрын
@@TehF0cus same person
@nbt1254
Жыл бұрын
@@TehF0cus but don't confuse him with his evil brother kliksphilip
@2opmataron991
Жыл бұрын
@@veganssuck2155 Its a joke....
@malgaras6204
Жыл бұрын
Seems like interesting tech. Two immediate thoughts: 1. What about moving objects? It seems like the illusion falls apart there, as this only really simulates fake frames of camera tilt, not any changes to things already in your FOV. 2. What if you just slightly over-rendered the FOV? Then you'd actually have some buffer when tilting the camera, where you have an actual rendered image to display before you need to start stretching things at the edges of the screen. Obviously, since you're rendering more geometry, you'll take a further FPS hit, but is there a point where the tradeoff is a net gain?
@user9267
Жыл бұрын
2. You can do that I think. 1. They will move at the actual framerate. All asynchronous reprojection does is make the game feel more responsive.
@Sunisway
Жыл бұрын
I mean vr uses it and it runs pretty well
@Pro720HyperMaster720
Жыл бұрын
In the 2kliksphilip video he mentioned how interesting it would be if, instead of stretching the borders or showing the void left by the not-yet-rendered area, we rendered a bit extra beyond the display area (quasi-overscan), but at low resolution, to impact performance as little as possible; since our peripheral vision is not great, we'd barely notice during fast movement that a small area at the edges is momentarily lower resolution. So yes, we have plenty of ways to improve the illusion. For example, you could boost the on-display area with DLSS or FSR, and maybe even the extra area (though not always a good idea: depending on your main resolution, having the extra area be 480p created from 240p by DLSS is not the same as having it be 240p created from 120p; probably a bad idea, at least for the latter). Or, if the resolution of the extra area is not suitable for DLSS or FSR, you could use upscaling for the on-display area but apply DLSS 3.0 frame generation (and the future FSR 3.0) only to the extra area, filling the gaps with mostly fake frames by deducing where your movement would lead and anticipating it.
@chanceslaughter3237
Жыл бұрын
Moving objects still have poor framerates; that's how it is in VR as well. Your hands feel much more jittery than the rest of the game when your fps drops... in my experience, anyway.
@reptarien
Жыл бұрын
1. Yes, moving objects are still noticeably 30fps, but speaking as someone who has spent time with VR reprojection in a game like Skyrim, with lots of moving actors, you don't notice that nearly as much when your own actions are still instant, as shown in the demo. It's crazy how much you can find yourself forgiving if your head and hand movement is still smooth as butter. 2. That is another technique that VR absolutely uses, and it works very well to solve that issue. Easily implementable and workable.
@palaashatri
Жыл бұрын
Take this as a compliment: I love how LTT has now transformed more into a Computer Science/Electronics for Beginners channel than just another "Hey, we got a NEW GPU [REVIEW]" channel.
@PrograError
Жыл бұрын
Well... They had covered everything on that aisle...
@sirspamalot4014
Жыл бұрын
It's why I keep watching them, I got tired of watching reviews of hardware I can't afford/don't really need yet. Though my VR rig is getting very tired.
@randxalthor
Жыл бұрын
This is probably my favorite type of video from LTT. Highlighting and explaining interesting technology is fascinating.
@deolamitico
Жыл бұрын
oh wait, time for another balls to the wall computer build! only the third this week. /s But for real, they've been doing a great job with not doing what I just said
@thebaum64
Жыл бұрын
it's up there for sure
@lucasthompson6405
Жыл бұрын
This feels like a slightly hacky optimisation you'd see in older games, and I personally find that really cool. I always admired hearing about the clever ways game devs overcame the limitations of hardware, whereas these days it feels like we rely on an abundance of processing power. That abundance of processing power is generally a good thing, but it feels like these sorts of optimisations are becoming a lost art.
@jovieasyrof2017
Жыл бұрын
cough cough Gotham Knight
@hkoizumi3134
Жыл бұрын
This explains the weirdness I've felt in VR. The game itself lagged for some random reason, but my head tracking and the responsiveness of the controls weren't affected. I remember thinking that if the head tracking had lagged along with everything else, I would have had severe motion sickness.
@HappySlappyFace
Жыл бұрын
Yeah, it was honestly amazing when my Quest 2 froze. I was like "oh no, please no motion sickness" for the first time, but it was so normal.
@rpavlik1
Жыл бұрын
Yep, it's pretty standard and required for HMD-based VR (at least some sort of reprojection or timewarp). There are a lot of different variations.
@Bonez32186
Жыл бұрын
Realistically, if you render outside the FOV by some percentage, you'd have enough scene overshoot for it not to really be a problem unless you have extremely low framerates and incredibly fast movements.
@KingBowserLP
Жыл бұрын
One thing that Philip's video covers that this one doesn't, and which I'm personally really excited about: combining this with a low-shading-rate border around the viewport (the fully rendered frame). Since peripheral vision is tuned more for movement than detail, this is fine quality-wise, and it means there's no need to guess what's at the edges: the information is already there, just in lower quality than the main viewport. That would, if not eliminate, at least significantly reduce the stretching artifacts.
@666Tomato666
Жыл бұрын
like doing actual FOVeated rendering, where the "sharp part" is the whole normal viewport, while the low resolution is just around it, like extra 5-10% or so
@ffsireallydontcare
Жыл бұрын
I haven't seen Philip's video but I'm guessing you'd need eye tracking as well. It'd be pointless to render the fringes of the "screen" at a lower quality if you can point your eyeball directly at it...
@gabrielenitti3243
Жыл бұрын
@@ffsireallydontcare what if the lower quality rendered parts are actually outside your screen? You would trade a bit of framerate for more accurate projection predictions which would recoup the lost performance and give you a better experience
@rileyn2983
Жыл бұрын
@justathought No because it will be fixed in 1/30th of a second. It's obviously not perfect, but that's what this technique is about, compromises. Lower resolution fringes would be way better than stretching.
@ffsireallydontcare
Жыл бұрын
@@gabrielenitti3243 Ahh ok, yeh that makes more sense.
@mrogalski
Жыл бұрын
Just think about that: you can see the difference in a YouTube video! Granted it's 60FPS, but it's still compressed video streamed from YouTube. I can only imagine how much of a difference you'd see running it live yourself. This makes it even more amazing!
@twinklesprinkle1318
Жыл бұрын
I was amazed watching Philip's video when it came out. I'm happy that it has reached you now! Hopefully game developers will get the message; I'd be really happy to see this implemented in actual games, because at the moment, unless you have the most recent hardware, you have to choose between high resolution and very high framerate...
@SendFoodz
Жыл бұрын
I wanna see the video in question. 3kliksphilip is the channel, right? What's the video? I'm guessing around trying to find this guy/video; what's the title so I can show him some love?
@Wanklacus
Жыл бұрын
@@SendFoodz it's linked in this video's description my man (if you haven't found it yet)
@odytrice
Жыл бұрын
I literally did a spit take at 6:01 Now I have coffee all over my keyboard 😂😂
@stephenmurray5276
Жыл бұрын
Y'all have done a stupidly good job recently researching and explaining difficult concepts. Between this video and the recent Windows sleep/battery video, my (already high) respect for LMG's tech knowledge has gone through the roof! And y'all didn't even discover this hack! Thanks for sharing (and explaining)
@MrPaxio
Жыл бұрын
older videos were more technical now they suck up to the chump who doesnt know how to navigate a settings menu
@mirage809
Жыл бұрын
As per usual, John Carmack is the king of optimizing rendering in games. He first implemented this tech for the Oculus Rift and has a long history of coming up with awesome solutions to problems like this. This is the man who made Doom; he knows his stuff. He's probably laughing right now, having a big "I told you so" moment.
@Felipemelazzi
Жыл бұрын
John Carmack is responsible for asynchronous reprojection!?! This living god never stops amazing the world of technology!
@imdurc
Жыл бұрын
@@Felipemelazzi I thought JC was the one who had seen it somewhere and wanted to bring it to Oculus, but, I don't think he was responsible for its actual creation. Anyone know?
@HamguyBacon
Жыл бұрын
He basically mimicked how your eye works in real life. I thought of this too, but I assumed it was already implemented.
@CJMAXiK
Жыл бұрын
Meta improved on this tech; now it's called Asynchronous Spacewarp and is bundled with the Oculus Quest 2. And let me tell you, it is really cool.
@BlameDavid
Жыл бұрын
I'm so happy to see Philip reach this far outside of the CS:GO bubble with this
@OriginalityDaniel
Жыл бұрын
'valve please fix'
@afroninjaen
Жыл бұрын
I always had a feeling that tech like this, and not just raw RTX 4090 performance, is the real future of gaming/VR performance.
@Thezuule1
Жыл бұрын
This tech has been a part of VR for years, and it's awful. They need to take a new approach and have developers actually implement it at the game level, rather than it being an after-effect, because as it stands now it doesn't work worth a shit. Awful.
@-NoodleBoy
Жыл бұрын
@@Thezuule1 I use it in RE8 VR so I can run RTX while in VR, and while it doesn't feel great, it feels better than native.
@possamei
Жыл бұрын
@@Thezuule1 On quest, they've built support for it in-engine, it's called SSW. It's actually better than ASW on PC because it has motion data for the image, so the interpolation is quite good. Sure, real frames are still better, but the tech is getting better
@Thezuule1
Жыл бұрын
@@possamei You've got that a little twisted up, but yeah. SSW is the Virtual Desktop version; AppSW is the native Quest version. It works better, but still not well enough to have picked up support from any real number of devs. A step in the right direction, though.
@DJayFreeDoo
Жыл бұрын
@@Thezuule1 But what if DLSS and FSR only had to correct the flaws of this instead of making whole frames. DLSS and FSR might get you even more performance.
@alexmathewelt7923
Жыл бұрын
As a hobby game engine/GFX developer, I implemented this technique with some tweaks: static geometry is only rendered every few frames, but characters, grass, and particles get rendered every frame. With the depth sort and extended viewport, it feels like native rendering, and you can still aim precisely at a target, since that is always up to date. As mentioned in the video, DLSS uses motion vectors but has to guess the motion and the static geometry. With a proper implementation, this guess is not required; it can be calculated by the same hardware as the AI.
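A skeleton of that split cadence (all names invented; the actual engine work is stubbed out as comments):

```python
class HybridRenderer:
    """Sketch of the scheme above: the static scene is re-rendered only
    every `static_every` frames and reprojected in between, while
    dynamic objects (characters, grass, particles) render every frame."""

    def __init__(self, static_every=4):
        self.static_every = static_every
        self.frame = 0
        self.static_renders = 0
        self.dynamic_renders = 0

    def tick(self):
        if self.frame % self.static_every == 0:
            self.static_renders += 1   # expensive 3D pass over static geometry
        # else: reproject the cached static image to the new camera pose
        self.dynamic_renders += 1      # dynamic set, every frame, depth-sorted in
        self.frame += 1

r = HybridRenderer(4)
for _ in range(120):
    r.tick()
print(r.static_renders, r.dynamic_renders)  # 30 120
```

Over 120 frames the expensive static pass runs only 30 times, which is where the headroom for high framerates comes from.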
@j1000a
Жыл бұрын
Does this end up looking like motion blur?
@Rhedox1
Жыл бұрын
What happens if rendering your static geometry takes 20ms on the GPU? How do you schedule the reprojection to ensure it's executed in time? Also, which graphics API did you use to implement this?
@Winston-1984
Жыл бұрын
What I'm wondering is: does the GPU in any way know what it doesn't need to re-render, i.e. sections of the screen that can persist using this tech, while only rendering additional frames for the sections that need updates? Does this make sense? It's hard to put into words.
@LadyRiderc
Жыл бұрын
I want it, and i want it now
@alexmathewelt7923
Жыл бұрын
@@Winston-1984 That's what I'm currently working on, since this is now a common technique for ray tracers. Currently I'm trying to derive the formulas I need and prove them for small movements. But with this fixed splitting, it works for first-person shooters or similar games with a lot of static geometry. Static geometry is really fast to render nowadays.
@kingsizemedal
Жыл бұрын
2KP is such an amazing channel, he always has very interesting, out of the box ideas, and I love to see more of his wacky stuff being picked up!
@ruix
Жыл бұрын
Glad seeing Philip getting bigger every day. He's amazing
@sound123mine
Жыл бұрын
An LTT video at 60fps?! My god, the little animations they put in, like the outro card, look so good 👍
@MrPaxio
Жыл бұрын
yeah doesnt the dummy know 60fps is better than 8k uploads
@txsurvivalandcreations
Жыл бұрын
Really cool stuff. When I’m in VR and the frames drop during loading or something, it does exactly what you showed in the 10fps demo. You can see the abyss behind the projected image on the edges, with the location of the image updating to return right in front of you with each new frame. I had no idea that that’s what it was for.
@DiamondDepthYT
Жыл бұрын
I've been using VR for 3 years now, and I had no idea what it was until today either! Super cool to learn more about that stuff
@MrScorpianwarrior
Жыл бұрын
Oooohhh. You're absolutely right and not once did that occur to me! Imagine if that didn't happen and everywhere you looked was the same loading screen...
@merryjerry69
Жыл бұрын
@@MrScorpianwarrior how to get motion sickness lol.
@gamebuster800
Жыл бұрын
Cloud gaming would be great with this! You handle the reprojection locally and use the delayed frames as a source. It will basically eliminate the input lag.
@thewhywhywhy4302
Жыл бұрын
This is a sick idea
@MrMoon-hy6pn
Жыл бұрын
You would still need to send a depth buffer and probably other information to the computer playing the game. So that means more load on the internet connection, but it still sounds interesting.
@khhnator
Жыл бұрын
Not that it will do nothing, but it will do less than you think. Even if it works, and I'm not sure it does, at least not in this form: the time until you receive a frame that fills the stretched gaps you just created by moving the camera is much higher than on a local computer, which will fill that gap within the next ~33ms if you are running at 30fps. So you might have super smooth camera turning, but the time to shoot, jump, etc. will still be the same. Heck, because of the higher disparity between the camera and everything else, it might even worsen the experience instead of making it better.
@gamebuster800
A year ago
@@khhnator You're right, but the latency for cloud gaming is already not that high. The most noticeable effect of latency is moving the mouse to look around.
@fayenotfaye
A year ago
This is already done with VR cloud gaming services, when you use oculus air link, you’re basically doing the same thing but over LAN. If you drop a frame, you can still move your head around and it’s perfectly playable all the way down to 30 fps for most games.
@KeyT3ch
A year ago
This technology on handhelds will ABSOLUTELY be a game changer. Not only will it "look" better, it will also be even more difficult to spot the artifacts on a way smaller screen.
@aTron0018
A year ago
We need this integrated into Steam Deck OS!
@Neoxon619
A year ago
This actually reminds me of the input delay reduction setting that Capcom added for Street Fighter 6. The game itself still runs at 60fps, but the refresh rate is 120Hz for the sake of decreasing input latency.
@sm7085
A year ago
Good point. That's one of the added benefits of a high refresh rate monitor. Even if you can't reach a high fps, a high refresh rate monitor can still give you reduced input latency.
@LaughingMan44
A year ago
that's not how any of this works...
@MelissaMeantIt
A year ago
this tech won't really be useful for fighting games specifically, and I think it would be more counterproductive tbh.
@MrNicePotato
A year ago
What does that even mean... The async technique shown, as I understand it, essentially shifts your point of view before the GPU produces a new frame. But for a fighting game, it would still have to render the new frame no matter what to show your input turning into a move.
@j1000a
A year ago
The *effect* (not reality, which is a bit different) also reminds me a little bit of QuakeWorld (and to a lesser extent, Quake and Doom). Even when the framerate is high the models use low-FPS animations, and with QuakeWorld I seem to recall objects in motion skipping frames based on your network settings. Meanwhile the movement was still buttery.
@c6m
A year ago
Whoa 2kliksphilip getting a shoutout on LTT before any of his brothers, imagine.
@joshuacook2
A year ago
Your demo should really have included an animated object. That would have shown some serious limitations, and animated objects are present in nearly all games.
@sandmaster4444
A year ago
Seriously though.
@MrPhillian
A year ago
I keep seeing this concern, and while it might be true at low FPS, I don't think that's really where it would be aimed. I'd imagine most games would still aim for 60+ rendering. Keep in mind that VR headsets are already using this method; are they experiencing these problems? I legitimately don't know.
@matsv201
A year ago
It probably wouldn't have been as obvious as you might think, at least not at 30fps. The reason 60 or even 120fps is so obviously faster has to do with light retention in the eye when you move the mouse, but you can't see that with animation. Have you ever been to the cinema? 24fps... yes, 24... do you think cinema is choppy? IMAX can do 48fps
@amysteriousviewer3772
A year ago
@@matsv201 Movies don’t look or feel choppy because there is natural motion blur to everything and you also don’t control anything in them. It’s an apples to oranges comparison. Also IMAX doesn’t “have” 48 fps, IMAX is simply a format. A movie has to be shot at that framerate to display in that framerate. If it’s shot at 24 then it will be 24 in IMAX.
@TOGSolid
A year ago
@@MrPhillian Depends on the game and what sort of post processing is going on, in my experience. It can work well, but if there's a bunch of fancy effects going on, it can be very noticeable that the smoothness is being faked.
@RocketSlug
A year ago
Having just started getting into VR, I only recently learned what asynchronous reprojection is. Really cool to see it getting mentioned, because when I heard about it, it seemed like what DLSS 3 wanted to do, only it's been here already for quite some time. Your description of how it decouples player input from the rendering makes me think of rollback netcode for fighting games and how that also decouples player input and game logic, and I'm really excited for what that means for the player experience
@zach99999
A year ago
This is really cool! I play shooter games a lot and the most annoying thing about low fps in games is the input lag. Slow visual information is more of an annoyance as long as it's above 30, but the slow input response times at anything below 60 fps drives me insane.
@ValenteXD
A year ago
I remember watching 2klik's video last month and saying wow this is amazing and mind blowing, but thought I was just excited for it because I'm a programmer, guess not
@harrasika
A year ago
I was also excited for it but thought nothing would come of it since I've only ever seen him talk about it. Now perhaps there's a chance of this actually becoming popular and coming into games.
@Adam-em1mf
A year ago
I knew about this because of my Oculus Rift, and as you mentioned, in racing games asynchronous spacewarp (as Oculus calls it) is quite noticeable; moving your head around while driving at 100 mph can be quite jarring. But Oculus updated the feature and the visual bugs aren't as noticeable anymore. It's quite interesting to see how this works. Excellent video, guys
@grumbel45
A year ago
With the latest "Application Spacewarp" on Quest 2, games can now send motion vectors, so the extrapolated frames no longer have to rely on so much guesswork.
@Dimondminer11
A year ago
Yeah, the visual artifacts can actually cause MORE issues in VR than not, at least in some very specific games. It's not super noticeable in VRChat, but in the Vivecraft mod for Minecraft the screen turns into a wavy, smeary mess. I actually hated that WORSE than running at 40fps natively, which is what my system could do with my settings at the time
@215Days
A year ago
I can imagine this being used on old games with a forced framerate, be it 30, 60, or even a weird number like 25 (Nintendo 64 emulation could count). Just think about it: you could play Red Alert 2 at 30 FPS but have it feel like 60+ FPS instead. It would be amazing!
@Yalden_
A year ago
Holy shit, Philip MADE IT
@lightphobe
A year ago
I'm curious if you'd still get the same results if you add moving objects into the scene. Since the objects update their position at the true frame rate, I bet they would look super choppy.
@NichtDu
A year ago
Yeah, you're right. If you've played the Teardown lidar mod, this is kinda the same (at least the mod has the same downsides as this). Honestly though, it can't get any choppier, because the framerate stays the same. The examples were with really low framerates, but if you had your normal 120fps and a 360Hz monitor, this technology makes a big difference
@R3BootYourMind
A year ago
Some games already separate physics fps from rendering fps.
@DMitsukirules
A year ago
The thing is, you always want consistent input no matter what. The alternative in your scenario is that the objects are still choppy, but your mouse movement is also sluggish
@timmbruce99
A year ago
I'm actually surprised LTT got a demo version without moving objects (or at least didn't show them). Cus yes, animation still looks choppy according to the fps cap, but moving the camera feels smoother than butter 0-0
@comradestinger
A year ago
The latest version of the demo has moving objects, so you can see for yourself. (they look laggy, moving at the true framerate, as expected) x)
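The trade-off this thread describes (camera pose sampled every refresh, scene content only at the render rate) can be pictured with a tiny simulation. All numbers here are illustrative, not taken from the demo:

```python
# Simulate one second of async reprojection: the renderer finishes a frame
# every 100 ms (10fps) while the display refreshes at 60Hz. Each refresh
# warps the *latest finished* frame to the *current* camera angle, so the
# view tracks input at 60Hz even though scene content updates at 10fps.

REFRESH_HZ = 60
RENDER_FPS = 10

def simulate(duration_s=1.0):
    shown = []  # (content_frame_index, camera_angle_deg) per refresh
    for tick in range(int(duration_s * REFRESH_HZ)):
        t = tick / REFRESH_HZ
        camera_angle = 90.0 * t              # player turning at 90 deg/s
        latest_frame = int(t * RENDER_FPS)   # newest frame the GPU finished
        shown.append((latest_frame, camera_angle))
    return shown

frames = simulate()
content_updates = len({f for f, _ in frames})
camera_updates = len({a for _, a in frames})
print(content_updates, camera_updates)  # 10 distinct frames, 60 camera poses
```

This is exactly why moving objects look laggy while the camera stays smooth: the camera column updates every refresh, the content column only when the renderer catches up.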
@Tayfaan
A year ago
I'm aware there's only a small chance AAA game studios would implement this. I'm not a software wizard, but I was wondering if it's somehow possible to have the game run in some sort of emulator while the emulator handles async reprojection. From my understanding, the whole technology is based only on what gets output by the GPU. Would this be possible, so we don't have to rely on developers implementing it? Third-party open-source wizards would then be the first to implement it.
@elin4364
A year ago
Something worth noting is that comrade stinger's demo does not really do what they say it does (mostly due to how Unity is built being pretty incompatible with this sort of demo). The GPU draws the entire frame during the last frame, so work is NOT split up over several frames. Doing that would be a pretty complicated task in existing game engines like Unity.
@comradestinger
A year ago
This! The demo only distorts the *simulated* bad framerate from the slider. If you ran the demo with an actually bad framerate, it would just lag like normal. Implementing it properly is much harder than what I did; in Unity's case it might require some severe shenanigans, or straight-up engine modification.
@TrackmaniaKaiser
A year ago
@@comradestinger Do you think something like that could be a driver feature, like the DLSS 2 stuff, where the GPU gets some motion vectors and shifts existing objects around more or less like sprites until a new real frame gets created?
@comradestinger
A year ago
@@TrackmaniaKaiser I think both could work, though I lean towards it being done by the devs themselves rather than by the driver. Since games vary so much, different scenes and camera modes would benefit/suffer from the effect in different ways. To be honest, it's all very complicated.
@devzozo
A year ago
Wonder if using DOTS and a scriptable render pipeline would allow for it; can't imagine figuring all that out in an evening though. I wouldn't trust a solution that leverages Unity's undersupported APIs to be that stable anyway...
@leongao5120
A year ago
@@comradestinger Good work man
@waybove
A year ago
I think we just witnessed one of those rare moments when an elegant solution clicks and starts a revolution
@kazioo2
A year ago
This video is poorly researched. Timewarp was invented by John Carmack and described in his post "Latency Mitigation Strategies" in early 2012, more than 10 years ago. His original article already mentioned games other than VR. I remember seeing normal desktop demos many years ago, but it never gained traction despite that.
@SaucedTech
A year ago
I love how much Labs has instantly matured this channel. I have watched LTT for a long time, but recently it's really boosted its level.
@TikoyTV
A year ago
But… they are not even done?!
@HonoredMule
A year ago
It's been far from instantaneous, but we are starting to see the returns and it is definitely nice.
@50REN
A year ago
I have said for a very long time that when it comes to refresh rate, I don't mind lower frame rates from a visual standpoint; the low input delay is more what I love about high refresh rate gaming. I'm excited to see where this technology goes.
@deadtake2664
A year ago
I saw Philip's original video and was really hoping other youtubers would spread the word
@ToasterTom
A year ago
I stumbled upon 2kliksphilip’s channels when I was researching how to make maps in Hammer. So glad you guys have mentioned him in multiple videos now!
@davidbakersound
A year ago
I’ve never understood why this hasn’t been done before. I’ve thought it should be done since 2016 when I got my VR headset. Like you said, extremely obvious!
@user-hk3ej4hk7m
A year ago
The main issue with these workarounds is that they depend on the Z buffer. They break down pretty quickly whenever you have superimposed objects, like something behind glass, volumetric effects, or screen-space effects
@lucky-segfault
A year ago
Ya, that sounds like it could be a big issue...
@DavidGoodman
A year ago
You technically only need the depth buffer for positional reprojection (eg. stepping side-to-side). Rotational reprojection (eg. turning your head while standing still) can be done just fine without depth, and this is how most VR reprojection works already, as well as electronic image stabilization features in phone cameras (they reproject the image to render it from a more steady perspective). It might sound like a major compromise but try doing both motions, and you'll notice that your perspective changes a lot more from the rotational movement than the positional one, which is why rotational reprojection is much more important (although having both is ideal).
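For the rotation-only case described above, the warp can be expressed as a single 3x3 homography built from the camera intrinsics and the rotation since the last rendered frame; no depth buffer is involved. A minimal numpy sketch (the 1920x1080 pinhole intrinsics and sign conventions are illustrative assumptions, not anyone's actual implementation):

```python
import numpy as np

def rotation_homography(K, R):
    # Pure-rotation reprojection: a pixel p (homogeneous) from the old frame
    # lands at H @ p in the new view, with H = K @ R^T @ K^-1, where R is
    # the camera rotation since the old frame was rendered.
    return K @ R.T @ np.linalg.inv(K)

def reproject_point(H, x, y):
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical pinhole camera: 1920x1080 image, roughly 90-degree FOV.
f = 960.0  # focal length in pixels
K = np.array([[f,   0.0, 960.0],
              [0.0, f,   540.0],
              [0.0, 0.0, 1.0]])

# Small yaw between the last rendered frame and the current pose.
yaw = np.radians(2.0)
R = np.array([[ np.cos(yaw), 0.0, np.sin(yaw)],
              [ 0.0,         1.0, 0.0],
              [-np.sin(yaw), 0.0, np.cos(yaw)]])

H = rotation_homography(K, R)
x, y = reproject_point(H, 960.0, 540.0)
print(x, y)  # the image centre slides sideways; its height is unchanged
```

Applying this one matrix per displayed pixel is all a rotation-only timewarp has to do, which is why it is so cheap compared to rendering a real frame.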
@shawn2780
A year ago
The issue with asynchronous reprojection is that with complex scenes or fast action it creates visible artifacts and weirdness. This is where AI comes in, like DLSS 3 frame generation. By using deep learning, it can insert additional frames more accurately and realistically. That's really the way of the future, along with AI upscaling. It has to be; otherwise we're going to need a nuclear reactor to power the future RTX 7090 or whatever.
@miikahweb
A year ago
An even bigger issue is that in VR games (where the camera can just move through walls if you move your head in the wrong place), the only thing async timewarp has to do is take the latest headset position and reproject there. However, in a regular game you can't just take keyboard/controller input and reproject to a new position based on that, or your character would go through the floor, walls, or obstacles in the map. Instead you would have to run full collision detection and physics simulation to tell where the camera is supposed to be in the reprojected frame. This not only makes it massively harder to implement than in VR, where it can be automatic, but it also increases the chance of hitting a CPU bottleneck and not gaining that much performance anyway. Combine that with the visual artifacts and other issues, and you start to see why game developers haven't spent their development time implementing this before.
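A toy version of the collision point above: the reprojection path can't blindly trust the extrapolated camera pose, it has to run at least a cheap physics query first. All the names and numbers here are hypothetical:

```python
def reproject_camera(last_pos, velocity, dt, collides):
    # Predict where the camera would be at display time...
    candidate = tuple(p + v * dt for p, v in zip(last_pos, velocity))
    # ...but keep the last known-good pose if the prediction clips geometry.
    return last_pos if collides(candidate) else candidate

# Toy world: a wall at x = 5, player sliding along +x at 10 units/s.
collides = lambda pos: pos[0] >= 5.0

print(reproject_camera((0.0, 0.0), (10.0, 0.0), 0.016, collides))  # free move
print(reproject_camera((4.9, 0.0), (10.0, 0.0), 0.016, collides))  # blocked
```

Even this trivial check has to run once per displayed frame, which is the extra CPU cost the comment above is pointing at; a real game would need a full physics query here.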
@meowmix705
A year ago
Oculus/Meta is already doing that with their 3rd-generation reprojection tech, Application Space Warp (AppSW). It meshes reprojection with game-engine motion vector data and a splash of machine learning to generate the best-looking VR reprojection yet. The recently released Among Us VR on the Quest 2 is a good example of a game using the latest AppSW techniques. All thx to John Carmack, the granddaddy of Asynchronous Time Warp (ATW, the first mainstream asynchronous reprojection)
@shawn2780
A year ago
@@meowmix705 I don't play enough Quest 2 games to be able to speak on AppSW (I play PC through Link, and admittedly almost always disable regular ASW because of the inherent visual ghosting). Carmack may be the GOAT... I also get a kick out of the fact that he's probably the most reluctant Meta employee ever, but he sticks around because he just loves VR that much
@meowmix705
A year ago
@@shawn2780 Yup, Carmack is the Goat. As to PC-ASW, the Rift mostly uses 1.0 of ASW (has the typical ghosting visual artifacts). ASW 2.0 is only used for a handful of games (greatly diminished the ghosting, but required depth data from the game; many games did not support it). ASW 3.0 is unfortunately a Quest exclusive (for now?), they've rebranded it as AppSW to not confuse it with PC-ASW.
@hwstar9416
A year ago
Wait but then wouldn't it look horrible if something other than your character is moving on screen?
@Czllvm
A year ago
THIS IS INSANE. I already use this in Assetto Corsa in VR, so I play at 120Hz but it renders at 60fps. Such a light bulb moment at the start. Really wish this catches on, because I've already seen first hand how great it is
@klayplayz
A year ago
Linus: It's for free Also linus: Makes no difference
@SelecaoOfMidas
A year ago
Your point is?
@shib5267
A year ago
@@SelecaoOfMidas Makes no difference
@OrRaino
A year ago
It can make games smoother for low-end PC gamers. The PC market will discourage it, because then high-end GPUs would become irrelevant and mid-tier GPUs would be enough, if the tech is developed further for PC gaming.
@justinkruger
A year ago
If async timewarp becomes popular on desktop, I wonder if it makes sense to spend extra time rendering the z-frames behind some of the foreground layers, or behind special models that are 'tagged' or earmarked to have render-behind turned on. For example, you might want to render behind a fast-moving player model or a narrow light pole, but not behind a wall. Or maybe you only need to render behind up to a small pixel distance. I'm not sure how easy that would be to add to game engines?
@thenoddingturtle
A year ago
I probably wouldn't rely on it for a higher FPS, but it looks like it has great potential to smooth out FPS dips (say I'm playing at 60 and it sometimes drops to 50-ish when a lot is going on).
@SixtenAlin
A year ago
This would be perfect for high resolution gaming, because if you wanted you could play at the highest settings + 4K with a smoother feel.
@whitecanid8938
A year ago
It's time to have a native implementation of this on Steam Deck. Shout out to the Valve guys, please LTT make it happen!
@JoshsBookishVoyage
A year ago
This is interesting. I'd love to see this feature compared to an actual higher fps to see whether the perceived gain yields an actual competitive advantage.
@chrisc1140
A year ago
One thing I wondered about when I first saw that video is if the PERCEIVED improvement is good enough that you could lose a couple more frames in exchange for rendering a bit further outside the actual FOV, but at a really low resolution. Basically like a really wide foveated rendering. It would mean the warp would have a little more wiggle room before things started having to stretch.
@Juice8767
A year ago
I need this on my Steam Deck NOW! It would be amazing for hitting 60fps in everything while minimizing battery use
@ninjason57
A year ago
Yes!
@saart2212
A year ago
Now that's a public interest video! Raising awareness of this technique will certainly go a long way, especially in open source. I hope the manufacturers don't shy away from it for fear that it would diminish interest in their high-end GPUs.
@dnitz9608
A year ago
High-end? What r u talking abt, they can make AAA game 8k 240fps without 6slot GPU
@jonasmostert3294
A year ago
I think one caveat here, which has not been mentioned, is that dynamic objects in the focus/center of the screen will still only update at whatever frame rate your GPU allows. I wonder how to handle those scenarios. Still a very worthwhile improvement for a lot of games, for sure!
@99domini99
A year ago
From my experience with VR, while looking around is perfectly smooth, animated characters on the screen update slower. But that really isn't much of a problem. There is no input lag from your HMD; you can look around perfectly fine, and the 45fps the headset produces when reprojecting is still smooth enough to track targets. The only real caveat is the input lag from your controllers. Moving your hands feels less responsive when reprojecting than when running at native framerate. I wonder how this will carry over to desktop reprojection.
@pegaferno4429
A year ago
That 30 FPS looking 240 looks freeeeky man
@jrmaty
A year ago
Jumping in here - lead tech artist, currently working in VR. I want to clarify - your camera/view may feel smooth to your input, because it is reprojecting at the input rate of your input device for your camera. It will, however, _ONLY_ take the other inputs at the target framerate. For example, you will not get the same response time for shooting a gun in an FPS using this technique, at say 30 FPS, as you would with a native framerate at 60 FPS that wasn't using reprojection. In response-time-sensitive scenarios, you absolutely do not want this over a fast native framerate. It has uses where input delay isn't so important (i.e. this is good for Pokemon, bad for CS:GO) It also cannot work with transparent objects using a traditional render pipeline, as these do not write to motion vector frame buffers. This is a problem for subtitles, GUI, hair, water/glass, particle systems, etc.
@DMitsukirules
A year ago
The first is an architectural problem, not a fundamental one. The second, however, is fundamental given those rendering styles
@mycosys
A year ago
Thanks - this is exactly what i figured the issue would be. (mechatronic engineer)
@jrmaty
A year ago
@@DMitsukirules How do you mean "architectural problem"? To solve this issue you would need a completely novel and different approach. The 'first problem' is fundamental to how the technique works. It works by rendering a separate buffer that stores the motion of all the pixels. Combining this with the camera motion and depth buffer, in the next frame, we can reproject the pixels to where they "should" be in the next frame. It does this without doing any of the render pipeline work in that next frame. The CPU does no rendering work (i.e. occlusion culling, light culling & sorting, frustum culling). We can't, therefore, also add in the motion of an animation, the muzzle flash of a gun being fired, a light being enabled whilst it is currently reprojecting the previous frame - neither the CPU nor renderer has this information, and not having that information is the point - it is skipping updates in gameplay in order to smooth out the camera movement. Whilst it reprojects a frame, no CPU work is happening, it is _literally_ not receiving input.
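The motion-vector buffer described above can be pictured as a forward warp: each pixel of the last rendered frame is scattered to where its motion vector predicts it should be, and any pixel nothing lands on stays a hole, which is exactly where artifacts show up. A heavily simplified numpy sketch, not any engine's real pipeline:

```python
import numpy as np

def reproject_with_motion_vectors(frame, motion, fill):
    # frame:  (H, W) grayscale image from the last rendered frame
    # motion: (H, W, 2) per-pixel motion in pixels (dy, dx) since that frame
    # Scatter old pixels to their predicted positions; untouched pixels
    # keep `fill`, i.e. they are disocclusion holes.
    h, w = frame.shape
    out = np.full((h, w), fill, dtype=frame.dtype)
    ys, xs = np.mgrid[0:h, 0:w]
    ny = np.clip(ys + np.round(motion[..., 0]).astype(int), 0, h - 1)
    nx = np.clip(xs + np.round(motion[..., 1]).astype(int), 0, w - 1)
    out[ny, nx] = frame[ys, xs]
    return out

frame = np.arange(16.0).reshape(4, 4)
motion = np.zeros((4, 4, 2))
motion[..., 1] = 1.0  # the whole scene moved one pixel to the right
warped = reproject_with_motion_vectors(frame, motion, fill=-1.0)
print(warped[0])  # column 0 is now a hole (-1), the rest shifted right
```

Note what the sketch cannot represent: transparent surfaces never write into `motion`, so this scatter has nothing to move for them, matching the limitation described above.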
@Chrisspru
A year ago
The feeling alone reduces the brain's stress, meaning it can work more meaningfully with the actual frames. The player's chances of responding to a given frame increase.
@biolinkstudios
A year ago
but fixable if money were pumped into the tech
@Fearagen
A year ago
This is an actual game changer; forget about upgrading your GPU, just run this. It would be cool if all the AAA titles released in the last 10 years, for example, would adopt this. Something like Destiny 2, which I was not able to run properly until I upgraded my PC.
@antivanti
A year ago
I totally expected this when it was first implemented for the Oculus Rift... Really surprised that it has taken this long to even start being prototyped for flat games
@izzieb
A year ago
Yeah, it's brilliant for heavy games in VR (looking at you MSFS!).
@samudec5134
A year ago
It's huge for low/mid-range setups to make games more responsive, but it's also nice for high-end machines, because you'd completely negate the impact of 1% low frames and feel like you're always at your average
@backslash153
A year ago
Given that one of DLSS Frame Generation's biggest issues is responsiveness, this seems like a perfect aid in improving DLSS.
@jamiekerr5514
A year ago
Wow, not sure who wrote this one but such a good explanation. So clear and well presented, good job!
@F-aber
A year ago
I honestly think this might be really great for ultrawide and superwide gaming, as even more of the rendered frame is already in your peripheral vision, so the suboptimal edges will be even less noticeable
@Imevul
A year ago
As a dude with a 32:9 monitor, I'd settle for games actually supporting my monitor's resolution AND aspect ratio. Most games I have to play in windowed mode, because even if they let me go fullscreen (and don't add any black bars), usually the camera zoom is completely fucked and/or the UI elements are not properly positioned. Heck, even a newer game such as Elden Ring just gives me two 25% black bars on each side, yet with mods it actually supports 32:9. It wouldn't have been that much work to support it by default: a checkbox for disabling black bars and vignette, and an option to push the UI elements out to the sides, and all would be fine. But for some reason, most developers don't even care to support newer monitors with unusual resolutions.
@huttonberries768
A year ago
@@Imevul it did support it by default, but FromSoftware disabled it intentionally because of "competitive advantage" or some bullshit
@F-aber
A year ago
@@Imevul I also run a 32:9 display and feel you, but don't be surprised with Elden Ring; FromSoft does not care about proper PC support. About the UI thing, I actually can't think of a game off the top of my head that supports 32:9 without also having the option to adjust UI elements; in my experience it's very common and has been a thing even on consoles 10 years ago
@odinsplaygrounds
A year ago
Could this render at maybe 10% higher resolution than your display, and basically crop the edges off so you can't see those weird artifacts? It would basically be a more or less "flawless" experience then?
@happysmash27
A year ago
Asynchronous reprojection is great in VRChat (in PCVR on my relatively old PC), where I usually get 15fps and often far less than that! I don't really mind the black borders that much in that case, especially since the rendered view usually extends a bit beyond my FOV, so they usually only appear if things are going _extremely_ slow, like a stutter or any other time I'm getting over 0.25 seconds per frame. So perhaps another way to make the black bars less obvious would be to simply increase the FOV of the rendered frames a little so there is more margin. It would lower the frame rate, but it might be worth it in any case where the frame rate would be terrible anyway.
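The overscan idea in the comments above is easy to size for pure yaw: the worst-case turn between two real frames is the turn speed divided by the render fps, and rendering that many extra degrees of horizontal FOV on each side keeps the warped edges covered. A back-of-the-envelope sketch with made-up numbers:

```python
def overscan_fov(display_fov_deg, turn_speed_deg_s, render_fps):
    # Worst-case yaw between two real frames, covered on both sides.
    max_turn = turn_speed_deg_s / render_fps
    return display_fov_deg + 2.0 * max_turn

# A 90-degree display FOV, turning at 180 deg/s, real frames at 30fps:
print(overscan_fov(90.0, 180.0, 30.0))  # 102 degrees need rendering
```

Those extra degrees cost more pixels than they sound like, since perspective projection stretches toward the edges; that's the case for rendering the margin at reduced resolution, as one commenter suggested.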
@SirDragonClaw
A year ago
I tried to build something like that demo a few years ago, but I was trying to use motion vectors + depth to reproject my rendered frame, which I never got working correctly. In my engine I rendered a viewport larger than the screen to handle the issue with the blackness on the edges, and then was going to use tier 2 variable rate shading to lower the render cost of the parts beyond the screen bounds. But VRS was not supported in any way in my build of MonoGame, which is what my engine was built upon, so that was another killer for the project. I am so glad that Phil popularised the idea, and it's awesome that someone else managed to get something like this working. How he did it in one day I will never know; I spent like 3 weeks on it and still failed to get it working correctly.
@theonik2006
A year ago
Another part of the original solution's brilliance that you lose here, and which the VR solution exploits, is the fact that the edges of the frame in VR tend to suffer from some lens distortion to begin with, especially back when John Carmack came up with it at Oculus. It was perfect for that. It's still really interesting on a flat panel, but I think the fact that some people might notice it is part of why it never really got that far.
@axodox
A year ago
I actually was thinking about writing an injector to apply this to existing games a few years ago, when I saw the effect on the HoloLens. A few limitations though: camera movement with a static scene can look near perfect, but if an animated object moves, depth reprojection cannot fix it properly. You would need motion vectors to guess where objects will go, and that will cause artifacts near object edges.
@meneermankepoot
A year ago
6:59 man, that reminded me of the backwards-talking guy featured on Smarter Every Day
Comments: 3K