

While working on a project for one of our clients, I was asked about the difference between sprites (SpriteRenderer) and UI images (CanvasRenderer) in Unity. I didn't find much information about it, so I decided to prepare a presentation at my company to help make it clear. In this post you will find a more detailed version of the original slides I prepared for it. This blog entry is based on Unity 5.3.4f1.

Sprites are basically semi-transparent textures that are imported as sprites in their texture import settings. They are not directly applied to meshes like textures, but rather to rectangles/polygons (in the end, those are meshes too, so it is not such a big difference). Sprites are images that will be rendered either in your 2D/3D scene or in your interface.

It is straightforward to use sprites in Unity. Just drop the desired image (preferably a PNG) into the assets folder and click on it to access the inspector settings. Mark it as a sprite (2D and UI) as shown in the screenshot below.

Comparison: SpriteRenderer vs CanvasRenderer

When it comes to the hierarchy, you can place sprites wherever you want in your scene. UI Images, on the other hand, have to be inside a canvas (a GameObject with a Canvas component). You can position sprites just like any other object through their transform, but images use a RectTransform instead, which helps with positioning them in your interface system.

Sprites are rendered in the transparent geometry queue (unless you use a material other than the default). UI Images are also rendered in the transparent geometry queue (Render.TransparentGeometry), unless you use the overlay rendering mode, in which case they are rendered through the canvas. As you might have guessed, it is relatively expensive to draw them on mobile.

One of the key differences between sprites and images is that sprites support the automatic creation of meshes out of the texture, whereas UI images always consist of rectangles. The reason for creating meshes will be explained in the next section, where we will see how important it is, as it has a significant performance impact. Lastly, both can be used with sprite atlases in order to reduce draw calls.

It might help to see the differences with concrete examples. Likewise, the same happened in example 2, but the mesh looks much more complicated now. Why is that? Unity tries to fit the sprite as well as it can without introducing too many polygons, so that is the result we get. One could argue whether the trade-off is beneficial or not. And what happens now if we import a PNG with islands, e.g. an image containing different figures separated by transparent areas? Very interesting results in the third example: SpriteRenderer creates two submeshes, one per island, whereas UI Image extends the rect so it covers the whole image. You guessed it: this is closely related to performance. Whenever you render a texture, you need to send a command to the GPU driver, and that makes a huge difference if you are rendering many of them (like grass on a terrain, or particles).
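To make the hierarchy and positioning differences concrete, here is a minimal sketch that builds both variants at runtime. The component and asset names (SpriteVsImageSetup, MySprite, MyCanvas, the position values) are hypothetical; the Unity APIs used (SpriteRenderer, Canvas, Image, RectTransform) are the ones discussed above and were available in Unity 5.3:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: a sprite can live anywhere in the scene with a plain
// Transform, while a UI Image must sit under a Canvas and is positioned
// through a RectTransform.
public class SpriteVsImageSetup : MonoBehaviour
{
    public Sprite sprite; // assign a sprite asset in the inspector

    void Start()
    {
        // SpriteRenderer: free placement in the scene via Transform.
        var spriteGo = new GameObject("MySprite");
        spriteGo.AddComponent<SpriteRenderer>().sprite = sprite;
        spriteGo.transform.position = new Vector3(1f, 2f, 0f);

        // UI Image: must be parented under a Canvas, positioned via RectTransform.
        var canvasGo = new GameObject("MyCanvas");
        var canvas = canvasGo.AddComponent<Canvas>();
        canvas.renderMode = RenderMode.ScreenSpaceOverlay;

        var imageGo = new GameObject("MyImage");
        imageGo.transform.SetParent(canvasGo.transform, false);
        var image = imageGo.AddComponent<Image>();
        image.sprite = sprite;
        image.rectTransform.anchoredPosition = new Vector2(100f, 50f);
    }
}
```

Note that the Image is parented with worldPositionStays set to false, so its RectTransform values are interpreted relative to the canvas rather than in world space.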

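If you want to verify the mesh generation yourself, the vertex data of the auto-generated sprite mesh is exposed on the Sprite class. The following sketch (component name SpriteMeshInfo is hypothetical) logs how many vertices and triangles Unity produced for a sprite; a UI Image would always use a simple four-vertex quad instead:

```csharp
using UnityEngine;

// Logs the complexity of the mesh Unity generated for a sprite.
// Attach to a GameObject that has a SpriteRenderer with a sprite assigned.
public class SpriteMeshInfo : MonoBehaviour
{
    void Start()
    {
        var sr = GetComponent<SpriteRenderer>();
        if (sr == null || sr.sprite == null)
            return;

        Vector2[] verts = sr.sprite.vertices;  // mesh vertices in sprite space
        ushort[] tris = sr.sprite.triangles;   // index buffer, 3 indices per triangle

        Debug.Log(string.Format("{0}: {1} vertices, {2} triangles",
            sr.sprite.name, verts.Length, tris.Length / 3));
    }
}
```

Running this on a sprite with transparent islands should report more than one quad's worth of geometry, matching the submesh behaviour described in the third example.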