ToddRivers

  • 20 June 2017
  • Joined 14 February 2016
  • Hey everyone!
    Been a while. First, cheers for all the thanks! I'm glad the shaders have been useful 🙂
    I learnt to code by poking around other people's open source code, so it feels good to give some back 😉

    @NeatWolf thanks so much! Your game looks lovely, I'll have to try it! Did you play/see the Triennale Game Collection? That was a pretty cool collection of indie games put together by the art and design museum in Milan.
    I've added your emission fix as well 🙂

    I stuck them on the asset store as well to try and help other peops find them.
    https://www.assetstore.unity3d.com/en/#!/content/88191

  • @decoamorim Hey, sorry man, the latest shaders only support Unity 5.6 unfortunately. I don't have time to support old versions of Unity - I guess if you can't upgrade to the latest version of Unity then you'll have to keep using an old version of the shaders.

  • Ah! Doh! I miscounted - my paths have four parent folders! Haha, typical 😛

    I can understand wanting to limit path length, but instead of having the three parent folder limit, could you check that the path is on the same drive? I've tried to restructure my project a bit, but more flexibility is always welcome 🙂
    (I realise I've probably just got an unusual project structure!)

  • Hey! Ah, nice I didn't realise that!

    I guess what would be really nice, however, is if when you click browse and find your image folder it then converts that path to a relative one (if possible), rather than you having to manually type it out - only a thought though!

    Yeah, sorry - although my projects are on different drives on my two computers, the images folder is always on the same drive as the spine animations (i.e. on PC everything is in [url=C://MyGameProject]C://MyGameProject[/url] and on the other it's in [url=D://MyGameProject]D://MyGameProject[/url]). Having relative paths referencing different drives does indeed sound crazy!

    In Shaders

    Yeah, see what he says - I've certainly had the shaders working fine on Android without Spine, just applied to a quad with no flickering or anything.
    Try making a brand new material with the latest shaders - sometimes materials get into a weird state with their settings etc.
    Also make sure the flickering isn't caused by Z fighting - in most 2D games you don't need to write to the depth buffer, and can instead use layers to sort sprite ordering.
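    To illustrate the layer-sorting approach, here's a minimal hypothetical sketch (the layer name and order values are just examples) using Unity's standard SpriteRenderer sorting properties, which avoids depth writes and therefore Z fighting entirely:

```csharp
using UnityEngine;

// Example only: sort sprites with sorting layers/orders instead of
// writing to the depth buffer.
public class SpriteSortingExample : MonoBehaviour
{
    void Start()
    {
        var spriteRenderer = GetComponent<SpriteRenderer>();
        spriteRenderer.sortingLayerName = "Characters"; // assumed layer name
        spriteRenderer.sortingOrder = 10;               // higher draws on top
    }
}
```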

    In Shaders

    Hey!
    What's your actual issue? If you're having trouble getting the shaders working on mobile I recommend pinging a message to NeatWolf - he's using them on Android and iPhone and they're working for him, so it might be something you're missing or a weird material problem?

    I've tried them on Android briefly and they work for me. I unfortunately don't have the resources to give full-time support for the shaders as I'm a poor indie dev haha. I do my best though!

    I recommend starting with a new, empty Unity project with the latest shaders applied to a quad, seeing if that works, and then slowly recreating what's in your game to work out what actually doesn't work - pinpointing the problem is the first step of solving it. But yeah, maybe NeatWolf can give some pointers? He has a lot more experience using these shaders on mobile than me 🙂

  • Hey!
    Is there any way of making .spine files store the path to their Images folder relative to the file itself instead of as an absolute path?

    I work half on my laptop and half on my PC at home, and my project is stored on a different drive on each.
    This means when I open a Spine file I'm constantly having to change the Images path from C:// to D:// and back again. If it was stored relatively then it would work fine 🙂

    If it's not currently possible, is this a feature that could get added? I think it's a pretty common problem and I can't think of a downside..

  • Hey!
    Heh, you can if you want - my real name is Tom Raggett. Todd Rivers is a character from one of my favourite TV shows. (Which I always recommend to people! - http://www.imdb.com/title/tt0397150/)

    Hopefully it does finally sort out those issues. I tested it on Android and PC and think the theory is sound, but yeah, other platforms are as yet untested 🙂

  • Hey y'all! Been a while.
    So I just realized I made a silly mistake in the shader's fixed normal code which meant lighting would be incorrect on certain platforms. I've checked in a fix, but it means that from now on the fixed view normal should be defined as (0,0,1).
    It should however now work consistently on all platforms.
    Let me know if you see any bugs with it!

    @AlaaNintendo I've had some Z fighting issues on 5.6, for some reason the depth buffer seems more compressed. Not entirely sure why yet but increasing the Z Spacing on your Skeleton Animation component should stop the Z fighting.

  • Ok good news! (I think)

    So having done a little bit of research I think Unity always uses a left handed coordinate system for its camera matrices EVEN when the Z buffer is reversed.
    This means the camera always looks down the negative z axis, even when UNITY_REVERSED_Z is defined.

    SO!! I am now pretty sure I can remove the code that flips the fixed normal when UNITY_REVERSED_Z is defined.
    Basically meaning you can define the fixed normal as (0,0,1) on all platforms. I'm going to sanity check it, but I'm pretty sure it all works.

  • Yeah, sorry, I was wrong about not needing to reverse the Z direction. At least I think so.
    To properly know what's going on inside Unity I would need access to the different platforms to see what Unity defines on each, and whether or not Z reversing is happening.
    That's a lot of work and I'm just a struggling indie who made these shaders on the side - I don't have the time or resources to do that unfortunately.

    If you're really struggling you could get one of your coders to set a shader global variable for the normal direction: set it in code to be (0,0,1) or (0,0,-1) based on platform (in C# you can get a lot more platform information than in shaders) and then modify the shaders to use the global instead of a per-material value like they do at the mo.
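    A minimal sketch of the C# side of that idea, assuming a hypothetical global name `_GlobalFixedNormal` and an example platform condition (the actual platforms needing the flip would have to be determined by testing, as discussed above):

```csharp
using UnityEngine;

// Hypothetical example: pick the fixed normal direction in C#, where platform
// information is much richer than in shaders, and publish it as a shader global.
public class FixedNormalSetter : MonoBehaviour
{
    void Start()
    {
        // Example condition only - which platforms actually need (0,0,-1)
        // would need to be verified per device.
        bool flipZ = Application.platform == RuntimePlatform.Android ||
                     Application.platform == RuntimePlatform.IPhonePlayer;

        Vector3 fixedNormal = new Vector3(0f, 0f, flipZ ? -1f : 1f);

        // The shaders would then read this global instead of a per-material _FixedNormal.
        Shader.SetGlobalVector("_GlobalFixedNormal", fixedNormal);
    }
}
```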

  • Hmm the simplest hacky workaround would be to put platform defines around the code that flips the fixed normal, like this:

    inline float3 getFixedNormal()
    {
        float3 normal = _FixedNormal.xyz;
    #if UNITY_REVERSED_Z
        // Flip for platforms using a reversed depth buffer...
        normal.z = -normal.z;
    #endif
    #if defined(SHADER_API_MOBILE)
        // ...but flip back on mobile, where the define doesn't match the actual Z direction.
        normal.z = -normal.z;
    #endif
        return normal;
    }

    But!! If you're saying there's a difference between iPhone 4 and 5, that won't work, as there's no way for the shader to know what version of iPhone it's running on.

    I'm pretty sure this is Unity's fault, not mine or Esoteric's. To explain the situation: since 5.5, Unity reverses the Z direction on certain platforms as it gives more precise depth values - see 'The Depth (Z) direction in Shaders' on this page.
    https://docs.unity3d.com/Manual/SL-PlatformDifferences.html
    When they do that, UNITY_REVERSED_Z should be defined, and when it is I flip the fixed normal.

    If a platform isn't working, it's because it's reversing the Z direction without defining UNITY_REVERSED_Z, or it has defined it but isn't reversing Z.
    Either way it's out of our control as it's all internal to Unity.

  • Hey man

    So I haven't looked at the shaders for a while, but now that I look at them again with fresh eyes I think I was confused about the fixed normals stuff when I wrote it originally.

    I was using the UNITY_REVERSED_Z define to flip the normal, but that define is only used for reading from the Z buffer, which is back to front on certain platforms (so it flips on e.g. PC but won't on Android/iPhone).

    Instead I think the fixed normal should always be (0,0,1). The camera in Unity always points down the negative Z axis locally no matter what platform, so to face into the camera, normals need to be positive Z, or yeah, (0,0,1).

    I can throw over some shaders for you to test, which hopefully work with setting the fixed normal to (0,0,1) on all platforms.

  • Hey! Added a little update which I think was requested a while ago - scaling for normal maps (I noticed Unity added it into their standard shader so just copied that code) that means you can tweak how bumpy your bump map is in the shader 🙂

    As far as my normal map flow goes, I use SpriteIlluminator to create my normal maps. I'm still playing around with the best way of solving 'pillowing', where joints or limbs that should be one continuous shape look like separately beveled parts.

    One method is, instead of using SpriteIlluminator to bump each sprite image separately, to export your full character or full limb from Photoshop, bump that in SpriteIlluminator, then reimport it into Photoshop and cut it up and export it in the same way you cut up the character for animations.
    This means the parts that should be one piece, like a leg for example, get bumped as they should.
    I'm sure there are better methods though!

    In Skype

    Hey man!

    Yeah, not too bad. I'm not on Skype, sorry - I'm supposed to join Twitter soon as pretty much all the indie devs I know live on it! I'll give you a shout if I join that 🙂

    Cheers,
    Tom

  • Hey AlaNintendo!

    I have noticed spikes in the editor when you turn on/off features on a material for the first time. I haven't experienced them at run time though - looking at the profiler, it looks like it's the editor compiling a new version of the shader when defines get enabled/disabled. That's taking 1131ms for me, which is a bit scary!


    I suspect I've either done something stupid or gone over the shader keyword limit or something.
    Once the shader is compiled there should be no more spikes though (so it shouldn't affect performance apart from in the editor).
    Is your spike happening whilst playing? It's strange that it looks like it's calling Shader.EditorLoadVariant - the way I thought Unity worked was that it makes a compiled version of your shader beforehand... maybe I need to do something to make Unity spit out variants properly when it builds:
    https://blogs.unity3d.com/2014/05/06/shader-compilation-in-unity-4-5/

  • Cool, good luck! If you want to use it for a post effect that just uses the depth buffer (like I think Unity's built-in SSAO now does), then I've added a shader that renders a custom depth buffer for a camera with nice soft edges for sprites that don't normally write to depth:
    https://github.com/traggett/UnitySpriteShaders/blob/master/SpriteShaders/SpriteDepthTexture.shader

    However, as well as changing references to _CameraDepthTexture in the post effect shader to your own depth texture, you will also need to use DecodeFloatRGB() instead of SAMPLE_DEPTH_TEXTURE() to get a depth value from the texture (SAMPLE_DEPTH_TEXTURE uses 32-bit RGBA for depth; we have to use just 24-bit RGB because we're using the alpha channel to blend the soft edges).
    DecodeFloatRGB() is inside ShaderMaths.cginc.
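    As a rough sketch, the sampling change in the post effect shader might look something like this (the `_CustomDepthTexture` name is just an example; DecodeFloatRGB() comes from the repo's ShaderMaths.cginc as mentioned above):

```hlsl
// Hypothetical fragment helper: sample the custom sprite depth texture.
sampler2D _CustomDepthTexture;   // assumed name - assigned from script

float sampleCustomDepth(float2 uv)
{
    // Depth is packed into the 24-bit RGB channels; alpha carries the
    // soft-edge blend, so we can't use SAMPLE_DEPTH_TEXTURE here.
    fixed4 packedDepth = tex2D(_CustomDepthTexture, uv);
    return DecodeFloatRGB(packedDepth.rgb);
}
```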

  • Hey everyone, happy new year!

    I've just checked in some updates to how fixed normals work - you can now specify them either in camera space or model space, and adjusting for lighting back faces now works correctly (there were a couple of bugs with it).
    However, correct back face lighting now requires mesh normals (it's the only way the shader can know what direction a vertex is facing in).
    The shader does now remind you what it wants in terms of tangents or normals though 🙂
    If you grab latest you might need to re-tick 'use fixed normals' on your materials as the defines have changed, sorry!

    The majority of the time with Spine animations you don't need to adjust the tangents for correct back face lighting - it's needed only when you have rotated your object to face away from the camera and thus the tangents need to be flipped in the shader.
    I recommend, instead of rotating sprites, using the Spine skeleton.FlipX/Y. This means your Spine animation won't need normals, and things like Z offset for submeshes will stay correct.
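    The skeleton.FlipX approach might look something like this in spine-unity (a hypothetical component; assumes the SkeletonAnimation API of the spine-unity runtime of this era):

```csharp
using UnityEngine;
using Spine.Unity;

// Example only: flip a Spine skeleton in code rather than rotating the
// GameObject, so mesh normals/tangents aren't needed for correct lighting.
public class FlipExample : MonoBehaviour
{
    void Start()
    {
        var skeletonAnimation = GetComponent<SkeletonAnimation>();
        skeletonAnimation.Skeleton.FlipX = true; // face left without rotating the transform
    }
}
```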

    If you render Unity Sprites or Meshes then you might need to turn on the 'Adjust Backface Tangents' option if they face away from the camera.


    03 Jan 2017, 12:39


    @[deleted] yeah, using a custom Depth+Normals buffer is pretty advanced to be fair.
    What you're doing there is correct - that should render the depth+normals for the scene including soft alpha'd sprites.
    However, you want to render into a RenderTexture which you can then pass through to your post effects (i.e. give the camera a target texture).

    You will also need to edit the PostEffect shaders you're using to use this newly created RenderTexture instead of the default camera Depth+Normals texture.
    Admittedly this is all pretty advanced stuff and you'll need to be able to edit your post effect shaders to get it to work, but it's what I do for my game so it def works 🙂

    I recommend reading this (plus the first 2 parts) which explains the CameraDepthNormals texture and how it gets used with a simple example.
    http://willychyr.com/2013/11/unity-shaders-depth-and-normal-textures-part-3/
    In this example he's telling his camera to render a depth+normals texture for him with the following line of code:

    camera.depthTextureMode = DepthTextureMode.DepthNormals;

    Then his post effect shader is using that texture with:

    _CameraDepthNormalsTexture

    Which inside a Unity shader automatically grabs the camera's last generated DepthNormals texture.

    In your case you don't want to generate that texture or use _CameraDepthNormalsTexture, as you've rendered your own special one. So instead of using _CameraDepthNormalsTexture in the shader, pass it the texture you rendered into with
    camera.RenderWithShader(), and you should see the normals and depth for your sprites.
    This can then be adapted for things like Depth of Field or SSAO.
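    Putting the steps above together, a rough hypothetical sketch of the C# side might be (the texture property name `_CustomDepthNormalsTexture` and the "RenderType" replacement tag are example assumptions, not confirmed settings from the repo):

```csharp
using UnityEngine;

// Example only: render a custom depth+normals texture for soft-edged sprites
// and hand it to a post-effect material in place of _CameraDepthNormalsTexture.
public class SpriteDepthNormalsRenderer : MonoBehaviour
{
    public Camera sourceCamera;          // camera that sees the sprites
    public Shader depthNormalsShader;    // e.g. the SpriteDepthNormalsTexture shader
    public Material postEffectMaterial;  // your edited post-effect material

    RenderTexture depthNormals;

    void LateUpdate()
    {
        if (depthNormals == null)
            depthNormals = new RenderTexture(Screen.width, Screen.height, 24);

        // Render the scene with the replacement shader into our own texture...
        var previousTarget = sourceCamera.targetTexture;
        sourceCamera.targetTexture = depthNormals;
        sourceCamera.RenderWithShader(depthNormalsShader, "RenderType");
        sourceCamera.targetTexture = previousTarget;

        // ...then pass it to the post effect instead of the built-in texture.
        postEffectMaterial.SetTexture("_CustomDepthNormalsTexture", depthNormals);
    }
}
```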

  • @xcube That's because volumetric lighting depends on the depth buffer, and you can't write soft edges to the depth buffer - a pixel is either written at its depth or not written; fading doesn't really make sense physically.

    However!! I did manage to write a custom camera replacement shader that allows just that - writing soft edges to the depth buffer. It's actually on my github:
    https://github.com/traggett/UnitySpriteShaders/blob/master/SpriteShaders/SpriteDepthNormalsTexture.shader
    Using that you can generate a depth+normals texture for a camera that allows you to use effects like Depth of Field or Volumetric Lighting without having to write to depth when rendering sprites (the sprites get written to the depth texture afterwards, with soft edged alpha).

    Anyways! I'm away for a bit but thought I'd leave peops with a Christmas present - I've added specular maps to the shaders :sun:
    It was a hell of a lot more work than I'd imagined - I fell down the rabbit hole somewhat pulling apart how Unity's new shaders do their physically based specular effects, but I managed to get it working with the sprite shaders.
    Specular effects are optional (so they won't affect shader performance unless you enable them) and are based on the Metallic workflow for shiny materials, using a Metallic Gloss Map as explained by the first chart here:
    https://docs.unity3d.com/Manual/StandardShaderMaterialCharts.html