Sunday, August 14, 2011


About six years ago I wrote a post about Evoak. Reading it again, I noticed how poor my posting abilities were (not that I'm good now either): there was no explanation of what my goal was, it was just a status report, and I suppose it is difficult to value a milestone when you don't know the final goal. At that time I was really fascinated by the Evoak concept, but I was scared by the amount of work it needed given the instability of Evas and the emergence of Edje. How do you code such a great idea when you know your work will be useless two months later?

I used Evoak in my final project at university. It was a real success, and for me it envisioned a different approach to the classic client-server system we had at that time (and still have now), X11. That topic deserves a post of its own.

Six years later, with a completely different set of libraries and tools, I'm able to code that idea again. What makes this possible is Ender. When I started creating Enesim, Emage and several other libraries, my main goal was to replace some bits of the current EFL stack, especially Evas, in order to improve and advance with new and good technical ideas. Given the current state of EFL, I would say that both projects are riding their own roads for now; who knows later... The situation was that I was trying to create some kind of object system, or better, a description system which would allow me to serialize properties and values easily. If I could create a toolkit/canvas library using only properties and values, then the next step, sending those properties through the network, would be easy and the Evoak concept could rise again.

That moment has arrived. Everything is still in alpha status (everything is just a proof of concept) but the idea is working. Right now, with Eon being a toolkit system whose objects I can create and manipulate in a generic way (through properties, values and events), I was able to create a network bridge that serializes the widgets' data through the network, and a server which shows those applications without any change. If you take a look at the ecore_remote backend of Eon and the eon_server application in the Enesim repository you'll see the actual implementation. This is a very good milestone for the idea I wanted to accomplish: bring Evoak back! :)

Friday, August 12, 2011

Enesim Renderer Compound Tutorial

Now that you know how to create renderers and draw them into a surface, as seen in the previous post, we will see how to achieve the image below using only Enesim renderers; no images are allowed! Thanks Jose for the example code.

I'll update this post as soon as I have time ...

The complete code can be found here

Enesim Tutorials

This is the first post in a series of tutorials covering every component of the Enesim project, from Enesim itself to Eon. In this first tutorial we'll learn how to draw this sample image:

The first step is to create the renderer we want to draw. A renderer is just an abstraction of something that draws, nothing else. It can be a rectangle, a circle, a gradient or even a bubble; as long as it draws something we are fine. This is the main difference between Enesim and other graphics libraries: in Enesim there's no concept of graphics primitives or texturizing, every renderer is both a primitive and a texture.

Let's begin by creating our rectangle renderer

Enesim_Renderer *r;
r = enesim_renderer_rectangle_new();

The Enesim_Renderer design follows an object-oriented approach. Every renderer inherits from the base Enesim_Renderer type, so you can use the generic renderer API to set generic renderer properties. Also note that a rectangle renderer inherits from a shape, so you can use the whole shape API as well. More on this later.

Now let's set some common properties
enesim_renderer_origin_set(r, 15.0, 15.0);
enesim_renderer_rop_set(r, ENESIM_BLEND);

In the above code we are setting the renderer origin to 15 on the X axis and 15 on the Y axis. The important thing here is to understand that every renderer has its own coordinate system, and by setting its origin you are actually translating its center by 15 units.

The rop property is the raster operation. Setting it tells the system how the renderer should be drawn onto a target surface. In this case we are saying that we want to blend the renderer with whatever is already on the surface.
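As an illustration (this is not Enesim's actual implementation, just the standard math behind a blend rop), blending premultiplied ARGB pixels is the Porter-Duff "over" operator, computed per component:

```c
#include <stdint.h>

/* Multiply two 8-bit values treating them as fractions of 255,
 * with rounding: approximately (a * b) / 255 */
static inline uint8_t mul255(uint8_t a, uint8_t b)
{
	uint16_t t = (uint16_t)a * b + 128;
	return (uint8_t)((t + (t >> 8)) >> 8);
}

/* "Over" blend of a premultiplied ARGB source pixel onto a
 * premultiplied ARGB destination pixel:
 *   dst' = src + dst * (255 - src_alpha) / 255, per component */
uint32_t argb_blend(uint32_t dst, uint32_t src)
{
	uint8_t ia = 255 - (uint8_t)(src >> 24);
	uint8_t a = (uint8_t)(src >> 24) + mul255(dst >> 24, ia);
	uint8_t r = (uint8_t)(src >> 16) + mul255(dst >> 16, ia);
	uint8_t g = (uint8_t)(src >> 8) + mul255(dst >> 8, ia);
	uint8_t b = (uint8_t)src + mul255(dst, ia);
	return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
	       ((uint32_t)g << 8) | b;
}
```

An opaque source completely replaces the destination, while a fully transparent source leaves it untouched; anything in between mixes the two proportionally to the source alpha.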

Now let's set some rectangle properties
enesim_renderer_rectangle_width_set(r, 128);
enesim_renderer_rectangle_height_set(r, 128);
enesim_renderer_rectangle_corner_radius_set(r, 15);
enesim_renderer_rectangle_corners_set(r, EINA_TRUE, EINA_FALSE, EINA_FALSE, EINA_TRUE);

The rectangle renderer can have rounded corners; with the above code we are setting the top left corner and bottom right corner to be rounded with a radius of 15. Note that the rectangle renderer uses the origin property as the start of the rectangle, which means that right now we will draw a rectangle of width 128 and height 128 starting at 15 on the X axis and 15 on the Y axis. But what about filling? And stroking?

Given that a rectangle inherits from a shape, you can use all the shape API to set more properties.
enesim_renderer_shape_fill_color_set(r, 0xff00ffff);
enesim_renderer_shape_stroke_color_set(r, 0xffff0000);
enesim_renderer_shape_stroke_weight_set(r, 3.0);
enesim_renderer_shape_draw_mode_set(r, ENESIM_SHAPE_DRAW_MODE_STROKE_FILL);

So now we are setting the fill color to ARGB(255, 0, 255, 255), the stroke color to ARGB(255, 255, 0, 0) and the width, or weight, of the stroke to 3. Even if those properties are set, it does not mean we will actually draw with that fill/stroke configuration: we also need to set the drawing mode, be it fill only, stroke only or both. In this case we will use stroke and fill. Note that we are not using any alpha other than 255 (i.e. fully opaque) to simplify this description. Every color used in Enesim has to be premultiplied by its alpha, but no worries about that: there are helper functions that transform an ARGB value into a premultiplied ARGB value.
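To make the premultiplication explicit, here is a sketch of the underlying math (Enesim ships its own helpers for this; `argb_premul` is a hypothetical name, not part of the Enesim API): every color component is scaled by the alpha.

```c
#include <stdint.h>

/* Convert a straight (non-premultiplied) ARGB32 color into its
 * premultiplied form: each of R, G and B is multiplied by A/255,
 * the alpha itself stays untouched. */
uint32_t argb_premul(uint32_t argb)
{
	uint8_t a = argb >> 24;
	uint8_t r = ((argb >> 16) & 0xff) * a / 255;
	uint8_t g = ((argb >> 8) & 0xff) * a / 255;
	uint8_t b = (argb & 0xff) * a / 255;
	return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
	       ((uint32_t)g << 8) | b;
}
```

With alpha 255, as in the tutorial, the color is unchanged, which is why we can pass values like 0xff00ffff directly.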

Now that we have the renderer fully set up, we want to actually draw it. For that we create an Enesim_Surface. A surface is just a pixel area where your renderer can draw. On some systems this is called a Drawable, on others a Pixmap or a Texture.

Enesim_Surface *s;
s = enesim_surface_new(ENESIM_FORMAT_ARGB8888, 256, 256);

So, you have created an ARGB surface with 8 bits per component and a size of 256 x 256. Now the real drawing:

enesim_renderer_draw(r, s, NULL, 0, 0);

Note that for simplicity we haven't set all the parameters, but I will explain them too. The first argument is of course the renderer you want to draw and the second is the target surface you want to draw on; those were easy. The third one defines the clipping area on the destination surface, and the last two are the origin of the destination surface. Remember that every renderer has its own coordinate space? A surface has one too: it always starts at 0 on the X axis and 0 on the Y axis, and its width and height are those of the surface. With the last two arguments you can translate the surface origin, but to keep this initial tutorial simple we won't describe the whole functionality, yet! :)

The complete code can be found here

Saturday, August 6, 2011

Enesim does OpenCL

This was something I had in mind for a long time: not necessarily OpenCL, but a usable way to make a "backend" system for renderers and surfaces/buffers. That moment has arrived. Right now the Enesim code has a well defined backend system and incipient OpenCL support. Given the nature of a renderer, each one should provide its own OpenCL algorithm, but at least Enesim provides a good abstraction to define such algorithms and interact with them.

The pool subsystem has been refactored so that a pool creates surfaces or buffers and defines its backend type. Given that, once a renderer wants to draw into a surface, we can call the correct functions based on the backend and the associated data of the surface.

There are still things to do in Enesim, specifically in other subsystems of the library (like making the converters also use the backend system, for instance having a YUV422 -> ARGB converter in OpenCL), but this is a very good step on the road to the 1.0 release. The only OpenCL renderer implemented so far is the "background" renderer, given its simplicity, but it is a good start for understanding how OpenCL works and what common arguments every renderer needs.
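For the curious, here is a sketch of the per-pixel math such a YUV -> ARGB converter would perform. This is plain C rather than an OpenCL kernel, assumes full-range BT.601 coefficients, and uses a hypothetical function name; it is not code from the Enesim converters.

```c
#include <stdint.h>

/* Clamp an intermediate value into the 0..255 byte range */
static uint8_t clamp255(int v)
{
	return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v);
}

/* Convert one full-range BT.601 YUV sample into an opaque ARGB32
 * pixel. A YUV422 converter would run this for every pixel pair,
 * sharing one U/V pair between two Y samples. */
uint32_t yuv_to_argb(uint8_t y, uint8_t u, uint8_t v)
{
	int d = u - 128;
	int e = v - 128;
	uint8_t r = clamp255(y + (int)(1.402 * e));
	uint8_t g = clamp255(y - (int)(0.344136 * d) - (int)(0.714136 * e));
	uint8_t b = clamp255(y + (int)(1.772 * d));
	return 0xff000000u | ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
}
```

The same arithmetic maps naturally onto an OpenCL kernel, with each work-item converting one pixel (or one YUV422 pair) independently.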

Later, with this approach, we can easily add an OpenGL shader backend for renderers and a texture-based software pool :)