Old 06 January 2017, 01:42   #11
rsn8887
Originally Posted by thellier
Something like that should work (sorry, I don't know this shading language):

float3 screencorner=float3(1,-1,1);

screencorner = mul(screencorner,WorldViewProjection);

No, it doesn't work. I think all of those are normalized coordinates, scaled to the range 0 to 1 or -1 to 1.
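A minimal sketch of why the corner trick can't work (my own illustration, not WinUAE code): the projection maps pixel space into normalized device coordinates, so a screen corner always lands on +/-1 and the actual resolution cancels out.

```hlsl
// Illustration only: an orthographic-style mapping from pixel space
// [0,W]x[0,H] into normalized device coordinates [-1,1]^2.
float2 ToNDC(float2 pixel, float2 dims)
{
    // The corner (W,H) maps to (1,1) for ANY dims -- the resolution
    // divides out, so nothing resolution-dependent survives in the result.
    return 2.0 * pixel / dims - 1.0;
}
```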

After googling this for another hour, it seems settled: the ONLY way to get the screen size is to pass it into the shader from the host program (WinUAE). Some environments have preset variables for this: I read about gl_FragCoord and wpos, and in Unity it is called _ScreenParams, but here in WinUAE we don't have anything like that (yet). Direct3D does have VPOS, but that only gives the current pixel's coordinate in screen space, not the total width and height of the screen in pixels.
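For illustration, a rough shader model 3.0 sketch of what VPOS does and does not give you (PS_Main is a made-up entry point name):

```hlsl
// Sketch, shader model 3.0: VPOS yields the CURRENT pixel's position
// in screen space -- useful per-pixel, but there is no built-in
// register holding the total width/height to go with it.
float4 PS_Main(float2 vpos : VPOS) : COLOR
{
    // vpos.xy is e.g. in (0..799, 0..599) on an 800*600 target,
    // but the shader has no way to learn the 800 and 600 themselves.
    return float4(frac(vpos.xy / 256.0), 0, 1); // placeholder output
}
```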

AFAIK this MUST be set from the host application and is not available in the shader itself.

So in short, what we need is the host screen size. For example, the user might run WinUAE fullscreen at 800*600 (call it output_size), even though the emulated Amiga screen is only 640*200 (call that input_size). Right now we have access to input_size from within the shader: it is exposed as INPUTDIMS, SOURCEDIMS, and float2(1/TEXELSIZE.x, 1/TEXELSIZE.y). But there is no OUTPUTDIMS, SCREENDIMS, or anything like that.
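If WinUAE ever adds it, the shader side would only need one new constant that the host fills in. A hypothetical sketch; OUTPUTDIMS is an invented name mirroring the existing INPUTDIMS/SOURCEDIMS convention, nothing like it exists in WinUAE yet:

```hlsl
// Hypothetical constant -- OUTPUTDIMS is invented here; the host
// application would have to set it (e.g. to (800, 600) in fullscreen).
float2 OUTPUTDIMS;

float4 PS_Scale(float2 uv : TEXCOORD0) : COLOR
{
    // With the real output size, per-pixel math in screen units
    // becomes possible, e.g. the size of one output pixel:
    float2 onePixel = 1.0 / OUTPUTDIMS;
    return float4(onePixel, 0, 1); // placeholder output
}
```

On the host side, assuming WinUAE's .fx shaders still go through the D3DX effect framework, a single call like effect->SetFloatArray("OUTPUTDIMS", dims, 2) on each mode change should be enough to fill it in.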

BTW: Shader languages and models must be the WORST documented things ever.
Best I found so far.
EDIT: Also found this MSDN site now:
Spotty, but more useful. So yes, all those constructs are scaled coordinates. The shader does not know the screen size.

Last edited by rsn8887; 06 January 2017 at 02:04.