So, I've been writing my own fur shader (because I've only seen one half-decent one around, and I don't have the money to buy it), and it's been going pretty well so far. Lighting is a bit dodgy, but only because I don't have Unity Pro (I think I could fix it with that). I'm happy to use it as it is at the moment, except for one thing.

The shader works by rendering a solid base layer, then gradually stepping the vertices outward and rendering 10 fur layers that stack on top of each other to produce the hairs. The first layer is rendered with ZWrite On and tagged Opaque. The fur layers, however, are rendered with ZWrite Off and tagged Transparent (with the appropriate render queue).

This is producing problems: for some reason the fur layers always render behind my trees' leaves. No matter what I do to the render queue, I can't get this to work. The only thing that changes anything is turning ZWrite on, and that just culls out the leaves, which is almost as bad.

I'm fairly new to fragment shaders (I've mostly worked with surface shaders before now), so if anyone could help me out here, I'd really appreciate it. Here's a screenshot of the effect:
![Fur displays correctly against opaque objects, but appears behind billboarded objects.][1]
I hope that someone can help me out!
[1]: /storage/temp/15904-fur+issue.png
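For reference, the layering described above is usually tagged roughly as in the sketch below. This is a minimal sketch, not the shader in question, and note that "Queue" is a SubShader-level tag, so one shader can only occupy a single queue; the opaque base and the transparent shells generally have to live in separate materials.

    // Shell material sketch (hypothetical; the base would use an ordinary
    // opaque shader with ZWrite On in the Geometry queue).
    Shader "Custom/FurShellsSketch" {
        SubShader {
            Tags { "Queue"="Transparent" "RenderType"="Transparent" }
            Pass {
                ZWrite Off                       // shells leave depth alone
                Blend SrcAlpha OneMinusSrcAlpha  // standard alpha blending
                // ...vertex shader pushes vertices out along the normals,
                //    fragment shader samples the fur alpha texture...
            }
        }
    }

Since billboarded leaves are typically alpha-tested objects in an earlier queue than Transparent, comparing the two queues explicitly (e.g. "Queue"="Transparent+10" on the shells) is one thing worth checking.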
↧
Vertex/Fragment shaders - transparency ignoring Render Queue
↧
Transparency render issue
I understand that this has been asked before, but I've tried *almost* everything to get it to work. This might be beyond the scope of my expertise, because I've barely messed with shaders, which I'm saving for last in my Unity learning.
Here's the problem as I'm sure you've seen before:
![alt text][1]
So there are a lot of solutions out there but none that I understand.
I tried the custom shader solution, but I have a feeling that I'm adding the shader code incorrectly. I create a new shader, open it, and replace the existing code with the solution code, correct? Or is there a specific place within the shader code that Unity provides where I should insert it?
I also tried the 'creating the face at a distance from the main object as part of the object to force render priority' trick but that didn't work.
Then there's renderQueue. I really have no idea how to use that or if it solves this problem.
If any one of these answers is the correct solution, can you explain step by step how to fix this seemingly minor problem, rather than responding with vague external resources?
I know that sounds rude, but I see this problem as an annoying obstacle that's taking time away from what I should be doing. Sorry, I have really bad OCD, and I'd rather not get into writing my own shaders just yet. :)
EDIT: The key is not rendering correctly: it is spinning, and some of its far faces are showing in front of the closer faces. I will need to fade this key out, which is why I'm using the Transparent/Specular shader. Here are some images... Unity's being kind of a pain right now, so I'll need a minute.
![alt text][2]
[1]: /storage/temp/16973-untitled.png
[2]: /storage/temp/16975-untitled.png
Edit:
I did find something on this thread concerning ZWrite, because the poster has the same problem, but ZWrite doesn't seem to be in the shader.
http://forum.unity3d.com/threads/41007-Transparent-Diffusr-shader-problem-Done
I commented out ZWrite in the Transparent/Diffuse shader. It didn't work.
Then I tried this shader:
    Shader "Transparent/VertexLit with Z" {
        Properties {
            _Color ("Main Color", Color) = (1,1,1,1)
            _SpecColor ("Spec Color", Color) = (1,1,1,0)
            _Emission ("Emissive Color", Color) = (0,0,0,0)
            _Shininess ("Shininess", Range (0.1, 1)) = 0.7
            _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
        }
        SubShader {
            Tags {"RenderType"="Transparent" "Queue"="Transparent"}
            // Render into depth buffer only
            Pass {
                ColorMask 0
            }
            // Render normally
            Pass {
                ZWrite Off
                Blend SrcAlpha OneMinusSrcAlpha
                ColorMask RGB
                Material {
                    Diffuse [_Color]
                    Ambient [_Color]
                    Shininess [_Shininess]
                    Specular [_SpecColor]
                    Emission [_Emission]
                }
                Lighting On
                SetTexture [_MainTex] {
                    Combine texture * primary DOUBLE, texture * primary
                }
            }
        }
    }
Strangely, this made the object disappear altogether. Sigh.
↧
Water rendered over other game object
Does anyone know how to fix this?
![alt text][1]
The camera's settings are:
![alt text][2]
The water mesh uses the "FX-Water" shader, where at line 23 I put

    SubShader {
        //Tags { "WaterMode"="Refractive" "RenderType"="Transparent" }
        Tags { "Queue" = "Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        ColorMask RGB

to make it transparent.
[1]: /storage/temp/17028-cattura.png
[2]: /storage/temp/17029-camera.png
↧
Two transparent objects passing through each other problem
What I want to do is this:
![alt text][1]
Two transparent objects, one passing through the other. I have problems with object two: I've tried a few shaders, but the object behaves strangely at particular angles. Setting obj2's queue to Transparent-1 is not really what I want to achieve.
Is this even possible to achieve in some easy way?
PS: If obj2 is not transparent (Diffuse, for instance), everything works fine.
[1]: http://i.imgur.com/4PlLkP4.png
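For what it's worth, one commonly suggested workaround for a transparent object that sorts badly against itself and other transparent geometry is a depth-only prepass. The sketch below is an assumption about what might help here, not a guaranteed fix for every angle:

    Shader "Custom/TransparentWithDepthPrepass" {
        Properties {
            _Color ("Main Color", Color) = (1,1,1,0.5)
        }
        SubShader {
            Tags { "Queue"="Transparent" "RenderType"="Transparent" }
            // Pass 1: lay down depth only, draw no color
            Pass {
                ZWrite On
                ColorMask 0
            }
            // Pass 2: blend normally, testing against the depth from pass 1
            Pass {
                ZWrite Off
                Blend SrcAlpha OneMinusSrcAlpha
                Color [_Color]
            }
        }
    }

The prepass makes each object's nearest surface win its own depth test, at the cost of hiding the object's back faces.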
↧
NGUI Behind GameObject
I would like to render some NGUI sprites behind certain game objects. Changing the render queue on my Atlas's shader does not seem to have any effect. Is there somewhere else I should be looking? Any suggestions?
Thank you very much for your time and knowledge!
Kevin
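In case it helps, the usual way to push a single renderer into a different queue from script looks like the sketch below. Note that this goes through renderer.material (a per-renderer instance), and that NGUI rebuilds its draw-call materials, which may overwrite the value — an assumption worth checking against your NGUI version:

    // Hypothetical helper; attach to the object whose queue should change.
    using UnityEngine;

    public class QueueOverride : MonoBehaviour {
        public int queue = 2500; // before Transparent (3000), i.e. behind most transparent objects

        void Start () {
            // renderer.material clones the material for this renderer only,
            // leaving the shared atlas material untouched.
            renderer.material.renderQueue = queue;
        }
    }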
↧
Need help rendering beer glass in unity...
So I am making a simple game in which you control a beer glass sliding along a bar avoiding obstacles, and I am trying to come up with a way of rendering a realistic pint, preferably with refraction, bubbles, etc. For now, though, I am just trying to use two transparent objects (the beer will eventually get a diffuse map applied too).
The problem at the moment is that I can't control the render order of the objects so that the beer always appears inside the glass: it only shows through the glass from a low angle, so it appears in the reflection, but not from the angle the game will actually be played from. For the glass I am using one of the hard surface shaders from here: https://www.assetstore.unity3d.com/#/content/729
and for the beer I am using a default Unity transparent material, although the issue also happens with any transparent material.
Firstly, if someone can help me ensure this renders correctly from all angles, that would be great.
Second, if someone can come up with a better way to make a glass of beer in unity, that looks really, REALLY good, that would be so awesome you have no idea! ;P
Thanks for any and all help in advance!
-WBC
beer glass from top
![beer glass from top][1]
from top without wood
![from top without wood][2]
beer glass from below
![beer glass from below][3]
[1]: http://i.imgur.com/AmXynM7.png
[2]: http://i.imgur.com/NJ7Kise.png
[3]: http://i.imgur.com/8hxu0A7.png
↧
How to change terrain shader
Hi, I want to use a depth mask on my terrain, and this tutorial:
[http://www.blog.radiator.debacle.us/2012/08/how-to-dig-holes-in-unity3d-terrains.html][1]
says I have to change the render queue of my terrain by changing its shader. I saw this thread: [http://forum.unity3d.com/threads/54742-How-to-override-the-terrain-shader][2]
which says that to override the terrain's shader I have to write a shader with the same identification. But what identification should I use to replace the shader?
I'd appreciate it if someone can help me with this.
Thanks
[1]: http://www.blog.radiator.debacle.us/2012/08/how-to-dig-holes-in-unity3d-terrains.html
[2]: http://forum.unity3d.com/threads/54742-How-to-override-the-terrain-shader
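For reference, the way the override usually works (an assumption based on the linked thread, since the exact name varies by Unity version): Unity picks the terrain shader by name, so a shader asset anywhere in your project whose name matches the hidden built-in one replaces it. A sketch:

    // The name below is one used by older Unity versions; check the
    // built-in shader source download for your version to find the
    // exact "identification" to copy.
    Shader "Hidden/TerrainEngine/Splatmap/Lightmap-FirstPass" {
        SubShader {
            // Changed line: move the terrain earlier/later in the queue.
            Tags { "Queue"="Geometry-100" "RenderType"="Opaque" }
            // ...rest copied verbatim from the built-in terrain shader...
        }
        Fallback "Diffuse"
    }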
↧
Transparency under depth mask
Hi everyone. I have been using a depth mask shader (like [this one][1]) for a while now, but I am wondering whether it is possible to render a transparent object under it, since the depth mask uses the render queue to determine what is masked and what isn't. Because the Transparent queue comes after the mask, transparent objects get masked out and never render. Is it possible to render a transparent object at a different point in the render queue? Or is there another solution to this problem? Thanks for your help!!
(For example, I have been using the depth mask to render randomly-placed openings in an otherwise solid wall mesh. Now I want to render something transparent outside that opening.)
[1]: http://wiki.unity3d.com/index.php?title=DepthMask
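One direction to experiment with (a sketch under the assumption that the wiki DepthMask sits in a queue like "Geometry+10"): anything drawn *before* the mask cannot be masked out, so a transparent object can escape the mask by taking a queue just below the mask's, keeping normal blending:

    Shader "Custom/TransparentBeforeMask" {
        SubShader {
            // One queue step before the (assumed) mask queue of Geometry+10.
            Tags { "Queue"="Geometry+9" "RenderType"="Transparent" }
            Pass {
                ZWrite Off
                Blend SrcAlpha OneMinusSrcAlpha
                // ...texturing as usual...
            }
        }
    }

The trade-off is that such an object is drawn before the rest of the transparent queue, so it will not sort correctly against other transparent geometry.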
↧
Billboard trees render queue
Hello,
I have a problem with billboards. Yes, I know they have render queue Transparent-100, but they always sort incorrectly. If the render queue is lower (Transparent-110, for example) they seem OK from one side, but when looking through a transparent object the billboard trees appear over it; when I set it higher, the object appears over the trees. I suspect something is wrong with Unity's rendering, because this should be handled by default, without any need to change anything.
Any ideas?
**LEFT:** object renders over billboard trees **RIGHT:** trees render over object
![alt text][1]
[1]: /storage/temp/24874-image.jpg
↧
Shadows have a mind of their own
I'm trying to make a shader that receives shadows and selectively outputs transparency based on the texture.
Even though my shader works, when I output transparent fragments Unity still applies a translucent shadow in 3D space, and uses the geometry to occlude any background shadows based on the current viewing angle. This can be seen in the attached screenshot. Note the following:
- Even though the female figure is transparent (the floor and cube can be seen behind it), there is a shadow being applied to the model.
- The male character's shadow and the cube's shadow are occluded by her hip and hand, respectively.
![alt text][1]
Here's the SubShader header. The contents of the shader don't matter, because even if I change the shader to simply output fixed4(0,0,0,0) I still get this behavior. If I change the queue to Transparent the behavior stops, but then SHADOW_ATTENUATION always returns 1, making my shader useless.
    SubShader {
        Tags {"Queue" = "Geometry" "RenderType" = "Opaque"}
        Pass {
            ZWrite Off
            Blend SrcAlpha OneMinusSrcAlpha
Now, I can live with the shadow occlusion (even though I would like to understand why it happens), but the extra automatic pass that applies shadows to my model is something I would like to remove, as it feels utterly pointless. Can I somehow indicate to Unity that I am handling shadows myself, and that it should please leave my poor mesh alone?
Thanks in advance for any advice.
[1]: /storage/temp/26962-screen+shot+2014-05-27+at+11.53.58+am.png
↧
Strange Render Order Issue
I have two vertex/fragment shaders, both unlit, optimized for mobile. One is a transparent additive shader for particles (We'll call him "Particles"), and the other is a simple, bare texture mapping (We'll call him "Background"). "Particles" is set to a higher Queue (I've taken it as high as "Overlay+20000"), and "Background" is set to the Background queue.
The strangeness that I am seeing is that sometimes (not always), "Background" obscures "Particles", even though the geometry drawn by "Particles" is closer to the camera. I have a barebones scene that shows this perfectly: a background quad shaded by "Background", a sphere in front that is also shaded by "Background" using a different material, and another quad in front of it all shaded by "Particles".
The sphere incorrectly obscures the foremost quad shaded by "Particles", but the background quad does not obscure the foremost quad.
![alt text][1]
Here is the code for the "Particles" shader:
    Shader "Mobile/Particles/Additive (CG)" {
        Properties {
            _MainTex ("Particle Texture", 2D) = "white" {}
        }
        Category {
            SubShader {
                Tags { "Queue"="Overlay+20000" "IgnoreProjector"="True" "RenderType"="Transparent" }
                Blend SrcAlpha One
                AlphaTest Greater .01
                ColorMask RGB
                Cull Off
                Lighting Off
                ZWrite Off
                Fog { Color (0,0,0,0) }
                LOD 250
                Pass {
                    CGPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                    #include "UnityCG.cginc"

                    sampler2D _MainTex;

                    struct appdata_t {
                        float4 vertex : POSITION;
                        fixed4 color : COLOR;
                        float2 texcoord : TEXCOORD0;
                    };

                    struct v2f {
                        float4 vertex : POSITION;
                        fixed4 color : COLOR;
                        float2 texcoord : TEXCOORD0;
                    };

                    float4 _MainTex_ST;

                    v2f vert (appdata_t v)
                    {
                        v2f o;
                        o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                        o.color = v.color;
                        o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
                        return o;
                    }

                    fixed4 frag (v2f i) : COLOR
                    {
                        return i.color * tex2D(_MainTex, i.texcoord);
                    }
                    ENDCG
                }
            }
        }
        Fallback "Mobile/Particles/Additive"
    }
----------
And here is the "Background" shader:
    Shader "Mobile/Unlit/Background (CG)" {
        Properties {
            _MainTex ("Base (RGB)", 2D) = "white" {}
        }
        SubShader {
            Tags { "Queue"="Background" "RenderType"="Opaque" }
            Pass {
                Tags { "Queue"="Background" "RenderType"="Opaque" }
                Lighting Off
                Fog { Color (0,0,0,0) }
                ColorMask RGB
                LOD 100
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct appdata_t {
                    float4 vertex : POSITION;
                    float2 texcoord : TEXCOORD0;
                };

                struct v2f {
                    float4 vertex : SV_POSITION;
                    half2 texcoord : TEXCOORD0;
                };

                sampler2D _MainTex;
                float4 _MainTex_ST;

                v2f vert (appdata_t v)
                {
                    v2f o;
                    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : COLOR
                {
                    fixed4 col = tex2D(_MainTex, i.texcoord);
                    return col;
                }
                ENDCG
            }
        }
        Fallback "Mobile/Background"
    }
I feel like I'm either missing something really simple, or I've stumbled across some strange bug. Can anyone lend some guidance here?
[I've also attached a package containing a sample scene that demonstrates this anomaly.][2]
[1]: /storage/temp/27288-bad+draw+order.png
[2]: /storage/temp/27289-draworder.unitypackage.zip
↧
Can Unity Compile The Shader At Runtime?
I want to change the render queue at runtime. I'm using a built-in Unity shader (Sprites/Default), but changing the shader's render queue at runtime has no effect on the GameObject whose material uses that shader. Does Unity not compile the shader at runtime? Also, why can particle shaders such as Alpha Blended change their render queue at runtime?
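For context, a sketch of where the queue actually lives: renderQueue is a property of the Material, not of the compiled shader, so no runtime compilation is involved. The usual catch is that built-in materials are shared; reading renderer.material (rather than sharedMaterial) clones the material so the change sticks to one object:

    // Minimal sketch (assumes a SpriteRenderer using Sprites/Default).
    using UnityEngine;

    public class SpriteQueue : MonoBehaviour {
        void Start () {
            // .material returns a per-renderer copy; the built-in shared
            // material keeps its original queue.
            Material mat = GetComponent<SpriteRenderer>().material;
            mat.renderQueue = 3100; // just after the Transparent queue (3000)
        }
    }

Particle materials may appear to "just work" because each particle system typically already uses its own material instance.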
↧
material.renderQueue depends on ZWrite?
I would like to render certain objects in my world so that some are always on top of others. This is so that I can solve some z-fighting issues (e.g. grass will always render on top of concrete, both of which are polygons at y=0) and so that some of my HUD-like icons that are rendered in 3D will always be on top of the game world.
I'm aware of GameObject.renderer.material.renderQueue, that I can set it in the shader with something like "Queue"="Geometry+1" and that it can't be changed at runtime.
What I don't understand is its relationship with the shader's ZWrite property.
From experimenting it seems that the use of ZWrite is all-or-nothing, meaning I have to set ZWrite Off for every single shader even if I only want to force one GameObject on top of the rest.
For example, let's say I have a world where every single object uses the default render queue value of 2000 (the "Geometry" value), but I have some HUD-like icons that I want to always be on top. So I set the renderQueue of the HUD-like icons to 2001 (Geometry+1) and turn ZWrite off (or on; it doesn't matter), but it won't work at all until I go into the shaders used by the world objects and set ZWrite off there as well.
Is this correct? It just seems a bit weird to me that I have to edit every shader to ZWrite Off just to ensure that one particular GameObject is on top of all the others.
Thanks.
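One alternative that avoids editing every world shader (a sketch, not tested against your scene): make only the HUD icon's shader ignore the depth buffer with ZTest Always, combined with a late queue. Depth writes elsewhere then stop mattering, because the icon never tests against them:

    Shader "Custom/AlwaysOnTopIcon" {
        Properties {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader {
            Tags { "Queue"="Overlay" "RenderType"="Transparent" }
            Pass {
                ZTest Always   // ignore scene depth entirely
                ZWrite Off
                Blend SrcAlpha OneMinusSrcAlpha
                SetTexture [_MainTex] { combine texture }
            }
        }
    }

The grass-over-concrete case is different: both surfaces still need depth testing against the rest of the world, so there a small "Queue"="Geometry+1" plus Offset -1, -1 (a depth bias) may be the better tool.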
↧
Render sprite only on top of a specific sorting layer
We're trying to implement a simple blob shadow in our 2D game, but the way the scene is laid out, we have the blob shadow going over certain objects in the "background" layer.
We were wondering whether it would be possible to have a ground layer and have the blob shadow sprite render only on objects in that ground layer, and not on any other layer.
Is this something that could be done by writing a shader for the blob shadow and using sorting layers instead of render queue? Thanks!
↧
Control render order of geometry besides queues
I'm writing a voxel-renderer for particles in Unity, and the rendering flow is as follows:
i) Draw opaque scene
ii) Fill voxels (3D textures) by using PS or CS
iii) Render voxels as cubes, and ray march pixels that pass the depth test with voxel info from (ii)
I'm stuck in step (iii).
I need to do *both* front-to-back blending and back-to-front blending for the voxel cubes for the algorithm to work. If I create the cubes as game objects and tie them to a material with a shader that uses, say, `Queue = Transparent`, Unity is going to sort them back-to-front and render them in that order, after the queues that precede it.
I'd like to control the exact order in which they are drawn, and I thought `Graphics.DrawMeshNow` would do that. I use it in `Camera.OnRenderObject` and from a frame capture, I see it being the first thing drawn (even before the opaque objects in step i).
I thought `OnRenderObject` would be called *only* after rendering the scene as seen by the camera, but that's not what I observe.
Any help would be greatly appreciated. Thanks for reading!
↧
How do I create an invisible mask that hides the 2nd closest object behind it?
Let's say we're in 2D view and there are three objects with different z positions that partially overlap each other: InvisibleMask, ObjectA, and ObjectB. InvisibleMask is closest to the camera, then ObjectA, then ObjectB. If I want InvisibleMask to hide ObjectB only where they overlap, without hiding ObjectA at all, I can set ObjectB's renderQueue higher than InvisibleMask's while keeping ObjectA's renderQueue equal to or less than InvisibleMask's. However, by doing so, ObjectB appears in front of ObjectA wherever InvisibleMask does not overlap ObjectB, even though ObjectA is really in front of ObjectB. ObjectB now has a higher renderQueue than ObjectA, but I don't understand why this makes ObjectB appear in front of ObjectA when ObjectB, which also has a higher renderQueue than InvisibleMask, does not appear in front of InvisibleMask.
InvisibleMask has to be closest to the camera for reasons, so given this constraint, is what I'm trying to do even possible? It seems that I have to somehow give the same renderqueue to both ObjectA and ObjectB except for where InvisibleMask overlaps ObjectB.
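If your Unity version supports the stencil buffer (4.2 or newer; an assumption about your setup), there is an approach that leaves the depth buffer, and therefore the ObjectA/ObjectB ordering, untouched: the mask writes a stencil bit instead of relying on queue order, and only ObjectB's shader tests it.

    // Invisible mask: draws no color, writes stencil ref 1 wherever it
    // covers the screen. Rendered early so ObjectB's test can see it.
    Shader "Custom/InvisibleStencilMask" {
        SubShader {
            Tags { "Queue"="Geometry-10" }
            Pass {
                ColorMask 0
                ZWrite Off
                Stencil { Ref 1 Pass Replace }
            }
        }
    }

    // ObjectB's shader then adds, inside its Pass:
    //     Stencil { Ref 1 Comp NotEqual }
    // so its pixels are discarded wherever the mask wrote a 1. ObjectA uses
    // an ordinary shader and keeps normal depth sorting against ObjectB.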
↧
Unity iPhone crashed in ShaderLabs
Unity iPhone crashed in ShaderLab. I suspect it may be caused by changing the renderQueue of my GameObject, since it crashed after that part of the script was modified.
Here is the call stack:
    0 0x0111047c in ShaderLab::Pass::ApplyPass(ShaderLab::PropertySheet const*) ()
    1 0x0114b950 in Shader::SetPass(int, int, ShaderLab::PropertySheet const*) ()
    2 0x010e2d30 in Unity::Material::SetPassWithShader(int, Shader*, int) ()
    3 0x011bfccc in ForwardShaderRenderLoop::PerformRendering(Light*, RenderTexture*, bool) ()
    4 0x011b3dbc in DoRenderLoop(Camera&, RenderLoop&, RenderingPath, std::vector>&, bool) ()
    5 0x0105db5c in Camera::DoRender(void (*)(Camera&, RenderLoop&, std::vector>&, Shader*, std::string const&), bool, Shader*, std::string const&) ()
    6 0x0105e540 in Camera::Render(int, Shader*, std::string const&) ()
    7 0x0113bfec in RenderManager::RenderCameras() ()
    8 0x0111ec18 in PlayerLoop(bool, bool) ()
    9 0x01179a90 in UnityPlayerLoop() ()
And here are my code changes:
    t_Prefab2 = msupportclass.GameObject_ADD_Prefab_localScale(
        t1,
        "score_number_" + a1.ToString(),
        new Vector3( i_x, i_y, 0),
        new Vector3( 90, 180, 0),
        new Vector3( 0.15f, 0.15f, 0.15f));

    // The following 2 lines were newly added and the program starts crashing
    if (t_Prefab2 != null)
        t_Prefab2.renderer.material.renderQueue = RenderQueue.Overlay;
Any ideas please?
P.S. It only crashes on iPhone/iPad, never in the Editor.
↧
Change render order from editor
I created a simple script that changes the renderQueue of an object in Awake().
Now I want to see the renderQueue changes immediately in the editor.
I tried writing an Editor script for it, but Unity creates a temp material and shows an error message saying that I must use sharedMaterial (not renderer.material) inside the editor, and that won't work for me.
Can you advise me on a workaround?
↧
Setting renderQueue doesn't appear to change draw order
This seems like such a simple thing, judging by the Unity script reference and examples here, but I can't get render queues (set in the material) to work at all. I've set up a very simple scene with two boxes and a camera, one box behind the other (see picture).
![Render test screenshot][1]
Then, I added a script that changes the render queue of the two materials, flipping the queue order. I do a debug afterwards which verifies that the renderQueue value has been changed. The only problem is that the draw order doesn't change - the nearer cube is always rendered over the far cube.
I've included my code below. As you can see from the comments, I tried setting the renderQueue on renderer.material, on sharedMaterial, and on the material asset directly; in all cases the renderQueue value changes but the render order does not. What am I missing here?
    using UnityEngine;
    using System.Collections;

    public class ChangeMaterialQueueOrder : MonoBehaviour {

        public GameObject go1;
        public GameObject go2;
        public Material m1;
        public Material m2;

        // Update is called once per frame
        void Update () {
            if (Input.GetKeyDown(KeyCode.F))
            {
                Debug.Log("Changing 1 to back");
                //go1.renderer.sharedMaterial.renderQueue = 1001;
                //go2.renderer.sharedMaterial.renderQueue = 1002;
                //go1.renderer.material.renderQueue = 1001;
                //go2.renderer.material.renderQueue = 1002;
                m1.renderQueue = 1001;
                m2.renderQueue = 1002;
                //Debug.Log("Render queue changed; go1 renderqueue now " + go1.renderer.material.renderQueue);
                Debug.Log("Render queue changed; m1 renderqueue now " + m1.renderQueue);
            }
            if (Input.GetKeyDown(KeyCode.G))
            {
                Debug.Log("Changing 1 to front");
                //go1.renderer.sharedMaterial.renderQueue = 1002;
                //go2.renderer.sharedMaterial.renderQueue = 1001;
                //go1.renderer.material.renderQueue = 1002;
                //go2.renderer.material.renderQueue = 1001;
                m1.renderQueue = 1002;
                m2.renderQueue = 1001;
                //Debug.Log("Render queue changed; go1 renderqueue now " + go1.renderer.material.renderQueue);
                Debug.Log("Render queue changed; m1 renderqueue now " + m1.renderQueue);
            }
        }
    }
P.S.: I'm using Unity Pro.
[1]: http://fragileearthstudios.com/wp-content/uploads/2011/09/render_queue_screenshot.png
↧
Rendering Order Revisited
I know that there are plenty of questions out there related to rendering order:
* http://answers.unity3d.com/questions/14752/forcing-a-gameobject-to-the-highest-depth-closest.html
* http://answers.unity3d.com/questions/8220/rendering-order.html
and more closely related to my issue
* http://answers.unity3d.com/questions/11594/how-do-i-specify-the-render-order-of-gameobjects.html
However, I'm either misunderstanding this whole concept, or I'm doing something wrong, because I just can't get it to work at all.
What I want to do is: make sure that a certain object / material / shader is rendered on top of everything else in the scene. That is to say: set the shader to a queue sometime after all the other geometry is rendered, clear the depth buffer, and then render the object so that it appears 'written over' the rest of the scene. That way, no matter what other objects get between it and the camera it will always be drawn on top.
I know that this can be done with multiple cameras, but we run into the same problems as the last example above: trying to manage a whole bunch of cameras, framerates dropping, etc. From other folks' statements (e.g. the top example above that talks about drawing a gizmo that is inside a mesh object) it sounds like what I'm trying to do should be possible, but I've tried changing the shader tags, the renderQueue of the material, and everything else I can think of, and nothing seems to have an effect on the depth buffer.
Does the queue actually do what I'm trying to do here, or if not is there another way to do it at any level (shader, object, camera) without resorting to one camera for each object?
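For completeness, here is the queue-only version of "always on top" as a sketch; it does not literally clear the depth buffer, but a late queue plus ZTest Always has the same visible effect for a single object, with no extra camera:

    Shader "Custom/DrawOnTop" {
        SubShader {
            // Overlay renders after the geometry and transparent queues.
            Tags { "Queue"="Overlay" }
            Pass {
                ZTest Always // never occluded by anything drawn earlier
                ZWrite On    // the object's own faces still sort correctly
                // ...lighting / texturing as needed...
            }
        }
    }

The queue alone cannot do it, because queue order changes *when* an object is drawn, not whether it passes the depth test; that is why changing renderQueue by itself appears to have no effect on the depth buffer.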
↧