Can you set GUI.depth on textures drawn using DrawTextureWithTexCoords?
I have a GUI.Box which I would like to render in front of the texture.
[QUOTE=war_man333;47092719]Can you set GUI.depth on textures drawn using DrawTextureWithTexCoords?
I have a GUI.Box which I would like to render in front of the texture.[/QUOTE]
Someone correct me if I'm wrong, but I'm pretty sure you can just call GUI.Box() after DrawTextureWithTexCoords. Stuff called first is drawn first, so whatever you draw last ends up on top. I exploit this to put borders on some of my scrollviews so the contents don't blend into the surrounding UI at the edges.
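For reference, a quick sketch of that ordering (the rect values and the texture variable here are made up for illustration):
[code]
void OnGUI()
{
    // Calls made later in OnGUI are drawn on top of earlier ones.
    GUI.DrawTextureWithTexCoords(new Rect(10, 10, 128, 128), inventoryTexture, new Rect(0, 0, 1, 1));

    // Drawn after the texture, so the box renders in front of it.
    GUI.Box(new Rect(26, 26, 96, 96), "In front");
}
[/code]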
Just updated the shield a bit; it's now invisible when there's nothing for it to do. It can also now break, and then regenerate itself after a short time.
I've also made it fairly configurable with several useful parameters in the inspector. Nothing to control those in the demo yet, so you'll just have to trust me that it works!
(remember to Ctrl-F5 if you've opened this before, to make sure you have the newest version!)
[url]http://backwardspy.github.io/flite.html[/url]
[QUOTE=Pelf;47092960]Someone correct me if I'm wrong, but I'm pretty sure you can just call GUI.Box() after DrawTextureWithTexCoords. Stuff called first is drawn first, so whatever you draw last ends up on top. I exploit this to put borders on some of my scrollviews so the contents don't blend into the surrounding UI at the edges.[/QUOTE]
Only I can't, the way my inventory works... so if I could just set the depth directly, that'd be cool.
How do you play a sound and delete the sound's GameObject at the same time?
[code]
deathSoundSource.Play();
Destroy(gameObject);
[/code]
This results in the sound not being played.
[code]
deathSoundSource.Play();
Destroy(gameObject, deathSoundSource.clip.length);
[/code]
This results in the object not being destroyed until the sound has stopped playing.
I have the same issue with particles.
You ~could~ spawn a new object at the location of the destroyed object and destroy that one after it's done, but of course that's not the optimal solution.
[QUOTE=Sithramir;47098936]You ~could~ spawn a new object at the location of the destroyed object and destroy that one after it's done, but of course that's not the optimal solution.[/QUOTE]
Isn't it? That's what I do
Doesn't Unity have an audio function that does just that? I think it was [I]audio.PlayOneShot()[/I] or something.
[B]Edit:[/B]
It's actually [url=http://docs.unity3d.com/ScriptReference/AudioSource.PlayClipAtPoint.html][i]AudioSource.PlayClipAtPoint[/i][/url] (it's a static method).
[QUOTE=LuckyLuke;47099085]Doesn't Unity have an audio function that does just that? I think it was [I]audio.PlayOneShot()[/I] or something.
[B]Edit:[/B]
It's actually [url=http://docs.unity3d.com/ScriptReference/AudioSource.PlayClipAtPoint.html][i]AudioSource.PlayClipAtPoint[/i][/url].[/QUOTE]
Can't use that if I want to change the pitch of the audiosource afaik.
[QUOTE=Sithramir;47098936]You ~could~ spawn a new object at the location of the destroyed object and destroy that one after it's done, but of course that's not the optimal solution.[/QUOTE]
How?
[code]
var audioObj = new GameObject("DeathSound");
audioObj.transform.position = transform.position;
var audioClone = audioObj.AddComponent<SingleSound>();
audioClone.audioSource = deathSoundSource;
audioClone.PlayAndDestroy();
[/code]
[code]
public class SingleSound : MonoBehaviour
{
    public AudioSource audioSource;

    public void PlayAndDestroy()
    {
        audioSource.Play();
        Destroy(gameObject, audioSource.clip.length);
    }
}
[/code]
[QUOTE=war_man333;47099117]Can't use that if I want to change the pitch of the audiosource afaik.
How?
[code]
var audioObj = new GameObject("DeathSound");
audioObj.transform.position = transform.position;
var audioClone = audioObj.AddComponent<SingleSound>();
audioClone.audioSource = deathSoundSource;
audioClone.PlayAndDestroy();
[/code]
[code]
public class SingleSound : MonoBehaviour
{
    public AudioSource audioSource;

    public void PlayAndDestroy()
    {
        audioSource.Play();
        Destroy(gameObject, audioSource.clip.length);
    }
}
[/code][/QUOTE]
I'd just create a prefab, give it an AudioSource component, and then do something like this:
[code]
// Main script - this has to run inside a coroutine (IEnumerator) because of the yield below
var soundObj = Instantiate(objPrefab, Vector3.zero, Quaternion.identity) as GameObject;
var audioComp = soundObj.GetComponent<AudioSource>();
audioComp.clip = myClip;
audioComp.pitch = 1.0f;
audioComp.Play();
yield return new WaitForSeconds(myClip.length);
Destroy(soundObj);
[/code]
There's probably a much better way to do this, but I'd just make a coroutine for it, assign the clip, adjust the pitch, play the clip, then wait for the length of the clip, and then destroy it.
This is what AudioSource.PlayClipAtPoint does
[csharp]public static void PlayClipAtPoint(AudioClip clip, Vector3 position, float volume)
{
    GameObject gameObject = new GameObject("One shot audio");
    gameObject.transform.position = position;
    AudioSource audioSource = (AudioSource)gameObject.AddComponent(typeof(AudioSource));
    audioSource.clip = clip;
    audioSource.volume = volume;
    audioSource.Play();
    Object.Destroy(gameObject, clip.length * Time.timeScale);
}[/csharp]
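So if you need pitch, you could roll your own variant along the same lines. This is just a sketch (the method name is made up), with the lifetime scaled because a higher pitch plays the clip faster:
[code]
public static void PlayClipAtPointPitched(AudioClip clip, Vector3 position, float volume, float pitch)
{
    GameObject go = new GameObject("One shot audio");
    go.transform.position = position;
    AudioSource source = go.AddComponent<AudioSource>();
    source.clip = clip;
    source.volume = volume;
    source.pitch = pitch;
    source.Play();
    // A higher pitch plays the clip faster, so shorten the object's lifetime to match.
    Object.Destroy(go, clip.length / Mathf.Max(pitch, 0.01f));
}
[/code]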
[IMG]https://40.media.tumblr.com/8510ff6c6f2bcd9fed11120287083d45/tumblr_njgvqknRJZ1tsne69o2_1280.png[/IMG]
[IMG]https://36.media.tumblr.com/5ab36a07a0d06bbb5da51b4376ebd229/tumblr_njgvqknRJZ1tsne69o3_1280.png[/IMG]
[IMG]https://40.media.tumblr.com/08703aa3bea45c22a0a3b5bd40179370/tumblr_njgvqknRJZ1tsne69o4_1280.png[/IMG]
Vroom: a multiplayer online kart racing game. Behold the majesty of 800x600.
I modified Unity's standard self-illumination shader to support an arbitrary glow color and to let me dynamically adjust how bright the self-illumination is.
I know very little about shaders, so this is super satisfying to see. It's the little things in life.
[vid]http://a.pomf.se/iirpgd.webm[/vid]
In case somebody else was ever frustrated at the Self-Illumination shader's lack of this feature, here's the shader:
[code]Shader "Self-Illumin/Diffuse Variable" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB) Gloss (A)", 2D) = "white" {}
        _IllumAmount ("Illumin Amount", Range (0, 1)) = 1
        _IllumColor ("Illum Color", Color) = (1,1,1,1)
        _Illum ("Illumin (A)", 2D) = "white" {}
        _EmissionLM ("Emission (Lightmapper)", Float) = 0
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        #pragma surface surf Lambert

        sampler2D _MainTex;
        sampler2D _Illum;
        fixed4 _Color;
        fixed4 _IllumColor;
        float _IllumAmount;

        struct Input {
            float2 uv_MainTex;
            float2 uv_Illum;
        };

        void surf (Input IN, inout SurfaceOutput o) {
            fixed4 tex = tex2D(_MainTex, IN.uv_MainTex);
            fixed4 illum = tex2D(_Illum, IN.uv_Illum);
            fixed4 c = tex * _Color;
            fixed4 ce = tex * _IllumColor;
            o.Albedo = c.rgb;
            o.Emission = ce.rgb * illum.a * illum.a * _IllumAmount;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Self-Illumin/VertexLit"
}
[/code]
[media]http://www.youtube.com/watch?v=UE0shkEzW6w[/media]
Sorry for totally ripping off your trails Why485, I promise I'll get my own eventually! :v:
I came across a pretty big gotcha today in Unity.
I was messing around with a floating origin, and the performance hit when moving everything was pretty massive. I thought it would be a heavy hit, but not by that much.
I ran the profiler to see what it was, and ~80ms of time was being taken while PhysX baked something. After a bit of research and head scratching, it turns out that if you move a mesh that has a collider and whose scale is NOT 1, 1, 1, PhysX will rebake it for something and cause a noticeable drop in frametime.
After fixing it, the time it took to re-center the origin dropped from 80ms down to 7ms, and half of that 7ms is just printing debug text to the output window.
TL;DR: If you have a mesh in Unity with a collider on it, make sure that the mesh's scale is set to 1,1,1. If not, PhysX will bake something when you move it, causing a very heavy performance hit.
[editline]8th February 2015[/editline]
Not a problem! I posted it with the intent that you or anybody else would find it useful and build off of it.
[QUOTE=Why485;47103455]I came across a pretty big gotcha today in Unity.
I was messing around with a floating origin, and the performance hit when moving everything was pretty massive. I thought it would be a heavy hit, but not by that much.
I ran the profiler to see what it was, and ~80ms of time was being taken when PhysX was baking something. After a bit of research and head scratching, it turns out that if you set a mesh's position with a collider whose scale is NOT 1, 1, 1, PhysX will rebake it for something and cause a noticeable drop in frametime.
After fixing it, the time it took to re-center the origin dropped from 80ms down to 7ms, and half of that 7ms is just printing debug text to the output window.
TL;DR: If you have a mesh in Unity with a collider on it, make sure that the mesh's scale is set to 1,1,1. If not, PhysX will bake something when you move it, causing a very heavy performance hit.
[editline]8th February 2015[/editline]
Not a problem! I posted it with the intent that you or anybody else would find it useful and build off of it.[/QUOTE]
Only the actual 3D imported mesh I assume? Not the GameObject's scale?
[QUOTE=Asgard;47106224]Only the actual 3D imported mesh I assume? Not the GameObject's scale?[/QUOTE]
Yes. That's something you need to double-check in your modelling program before exporting. In 3DS Max you need to go through all your objects and reset the scale in the Hierarchy tab. In Blender, the equivalent is Object -> Apply -> Scale.
I've noticed in other engines before that things get weird if that scale isn't correct, so it's not surprising that Unity also gets confused. At this point I'd say it's just good modeling practice to make sure that your scale is set to 100% or 1, 1, 1 before exporting for use in another engine.
PhysX also re-bakes collision data, per object, any time you move an object that has a collider but no Rigidbody.
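If you do have to move colliders around a lot (floating origin and such), the usual workaround as far as I know is to give them a kinematic Rigidbody, so PhysX treats them as movable instead of as static geometry:
[code]
// Sketch: a kinematic Rigidbody is moved only by code, and avoids the static-collider rebake.
Rigidbody rb = gameObject.AddComponent<Rigidbody>();
rb.isKinematic = true;
[/code]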
Hey guys, do you have any tips on how to write clean and efficient code?
Thanks!
Name your variables based on their purpose, and be consistent with naming in general. Don't just name your functions and vars things like "asdasdas".
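For example (made-up names, obviously):
[code]
// Hard to follow later:
float a = h * 0.5f;

// Says exactly what it is:
float halfPlayerHeight = playerHeight * 0.5f;
[/code]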
Guys, I've written my first shader and I actually know what I wrote... Srsly guys, I'm almost crying ;-;
[img]http://i.imgur.com/iT9IMAU.png[/img]
[URL="http://pastebin.com/LkneTUf9"]If anybody is interested in alpha map colouring... here's your shader[/URL]
Okay...
[t]http://i.imgur.com/mk7mtmz.png[/t]
I'm writing a custom trail renderer because the built-in one isn't very robust and won't let me override the positions of the points on the trails.
Mine is using the LineRenderer because it's pretty close to what I need and I've got something that works pretty well. It allows me to manually move the trail (important for when the origin resets) and I can impart a velocity on the trail elements because that looks cool.
However, it breaks down when you stop emitting trail elements and then start back up again. Due to the nature of the LineRenderer, it'll draw a line between the "end point" of the last segment and the "start point" of the new segment. This is, unfortunately, working as intended, as it can't tell the difference between the start and the end. It only draws lines between points.
So, I'm at an impasse with this. I don't think the LineRenderer is the way to go, and I'm probably going to have to start drawing and rotating my own quads in order to make a custom trail renderer work the way I'd like it to.
Anybody have any better ideas before I start looking into creating my own meshes on the fly?
[vid]http://a.pomf.se/poaxep.webm[/vid]
What about BetterTrails? It's on the asset store I believe.
-snip, did something else-
[QUOTE=Why485;47110490]I'm writing a custom trail renderer because the built in one isn't very robust, and won't let me override the positions of the points on the trails.
Mine is using the LineRenderer because it's pretty close to what I need and I've got something that works pretty well. It allows me to manually move the trail (important for when the origin resets) and I can impart a velocity on the trail elements because that looks cool.
However, it breaks down when you stop emitting trail elements and then start back up again. Due to the nature of the LineRenderer, it'll draw a line between the "end point" of the last segment and the "start point" of the new segment. This is, unfortunately, working as intended, as it can't tell the difference between the start and the end. It only draws lines between points.
So, I'm at an impasse with this. I don't think the LineRenderer is the way to go, and I'm probably going to have to start drawing and rotating my own quads in order to make a custom trail renderer work the way I'd like it to.
Anybody have any better ideas before I start looking into creating my own meshes on the fly?[/QUOTE]
I could be miles off here because I don't know how your implementation works, but each time you disable the trail, could you not set a flag that tells it the next point will be the start of a new trail?
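To sketch the idea (completely made up, since I haven't seen your code): keep each burst of points in its own list, and only ever feed one list to a LineRenderer, so nothing connects the end of one trail to the start of the next.
[code]
private List<List<Vector3>> segments = new List<List<Vector3>>();
private bool startNewSegment = true;

// Call this when emission stops; the next point then begins a fresh segment.
public void StopEmitting()
{
    startNewSegment = true;
}

public void AddPoint(Vector3 point)
{
    if (startNewSegment)
    {
        segments.Add(new List<Vector3>());
        startNewSegment = false;
    }
    segments[segments.Count - 1].Add(point);
}
[/code]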
I'm thinking I'm doing this in a completely retarded fashion.
I want an animator where every animation can go to every animation.
All these transition arrows going everywhere are already complex as hell as is, so what's a good way around this? This is a terrible state machine.
[img]http://i.imgur.com/reosilB.png[/img]
[QUOTE=war_man333;47112599]I'm thinking I'm doing this in a completely retarded fashion.
I want an animator where every animation can go to every animation.
All these transition arrows going everywhere are already complex as hell as is, so what's a good way around this? This is a terrible state machine.
[img]http://i.imgur.com/reosilB.png[/img][/QUOTE]
There's a blue state barely visible in your image called "Any State"; making a transition from that state means the transition can happen from any state in the state machine.