The history of Cocos2d in a glimpse

Cocos2d (Python)

In February 2008, at Los Cocos, Córdoba, Argentina, we started the “Los Cocos” Python game engine. We later renamed it Cocos2d. The idea was to create a game engine for the games we were creating for PyWeek.

PyCamp 2008. Centro Allen Gardiner, Los Cocos, Córdoba, Argentina

Lucio Torre, Daniel Moisset, Rayentray Tappa, and I started the game engine, with the help of Alejandro Cura and other members of PyAr.


Integrating LiquidFun with Cocos2d-x: Part II

In Part I, I described how to integrate LiquidFun with Cocos2d-x.
In this part (part II), I’ll describe how to render the particles using a basic water effect.

LiquidFun + Cocos2d-x
LiquidFun + Cocos2d-x, using the render-to-texture technique to simulate water.

Part I uses just one glDrawArrays(GL_POINTS, 0, total); call to draw the particles. And although that works to draw “particles”, it is not enough to draw “water”.

Drawing “water” requires a more complex rendering algorithm, like the one used in this example. And implementing an algorithm similar to that one is what this article describes.

The algorithm works more or less like this:

  • Choose a white circle and blur it.
    • You can blur the circle at runtime
    • Or you can blur it off-line.
  • Create a new frame-buffer (think of it as a clean off-screen buffer where you can render whatever you want)
  • Render the particles into the newly created frame-buffer using the blurred circle
  • Now render the frame-buffer into the main color-buffer using a threshold. The threshold could be something like this:
    • If pixel.r < 0.1, discard the pixel (the pixel won’t be drawn)
    • If pixel.r < 0.2, draw a blue pixel (for the border, although this is optional)
    • else draw a white pixel (the inner part of the water)
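The threshold step above can be sketched as a small pure function. This is C++ for illustration only; the function name is made up, and the 0.15 / 0.3 cut-offs mirror the values used later in the article:

```cpp
#include <array>

// Maps a blurred-circle intensity (the red channel, 0..1) to an output
// color, mirroring the threshold logic described above.
// The default cut-off values are illustrative.
std::array<float, 4> thresholdPixel(float r,
                                    float discardBelow = 0.15f,
                                    float borderBelow  = 0.3f)
{
    if (r < discardBelow)
        return {0, 0, 0, 0};            // transparent: not water
    if (r < borderBelow)
        return {0.2f, 0.2f, 0.9f, 1};   // blue border
    return {1, 1, 1, 1};                // white interior
}
```

The fragment shader applies this same per-pixel mapping to the off-screen buffer.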

How to do it using Cocos2d-x and LiquidFun

Let’s take the LFParticleSystemNode from Part I, and “evolve” it:

The first thing to do is to add the “off-screen” frame-buffer into the LFParticleSystemNode class. In Cocos2d-x, the “off-screen” buffers are created with the RenderTexture class. Example:

bool LFParticleSystemNode::init(b2ParticleSystem* particleSystem, float ratio)
{
    // create an off-screen frame-buffer with the size of the screen
    auto s = Director::getInstance()->getWinSize();
    _renderTexture = cocos2d::RenderTexture::create(s.width, s.height, Texture2D::PixelFormat::RGBA8888);
    _renderTexture->setPosition(Point(s.width/2, s.height/2));

    // Change the default shader: use the threshold shader instead
    auto program = GLProgram::createWithByteArrays(_renderTextureShaderVert, _renderTextureShaderFrag);
    auto programState = GLProgramState::getOrCreateWithGLProgram(program);
    programState->setUniformFloat("u_threshold_discard", 0.15);
    programState->setUniformFloat("u_threshold_border", 0.3);

    // ... rest of the initialization omitted
}


And, as mentioned earlier, the RenderTexture (the off-screen frame-buffer) needs a shader with a threshold. The threshold shader should look like the following:

varying vec4 v_fragmentColor;
varying vec2 v_texCoord;
uniform float u_threshold_discard;
uniform float u_threshold_border;

void main()
{
    vec4 color = v_fragmentColor * texture2D(CC_Texture0, v_texCoord);
    if (color.r < u_threshold_discard)
        // black or discard
        color = vec4(0,0,0,0);
    else if (color.r < u_threshold_border)
        // blue for the border
        color = vec4(0.2,0.2,0.9,1);
    else
        // white for the center
        color = vec4(1,1,1,1);
    gl_FragColor = color;
}

The values u_threshold_discard and u_threshold_border are defined at runtime. In the example, they are set to 0.15 and 0.3, respectively.

The next thing to do is to render the particles into the RenderTexture.

void LFParticleSystemNode::draw(Renderer *renderer, const Mat4 &transform, uint32_t transformFlags)
{
    // tell RenderTexture to "capture" the particles
    _renderTexture->beginWithClear(0, 0, 0, 0);

    _customCommand.init(_globalZOrder);
    _customCommand.func = CC_CALLBACK_0(LFParticleSystemNode::onDraw, this, transform, transformFlags);
    renderer->addCommand(&_customCommand);

    // tell RenderTexture to stop "capturing" the particles
    _renderTexture->end();
}
The result is the following:

Comparing a simple glDraw() with glDraw() + RenderTexture.
Left: without RenderTexture. Right: RenderTexture + custom shader with threshold.


WWDC 2014 announcements as a game developer

At WWDC 2014, Apple announced new features for iOS and OS X. These are my thoughts:



Swift is a new programming language by Apple.

At first sight, it seems easier to learn and easier to master than Objective-C. Objective-C is not particularly difficult to learn and master, but its syntax looks foreign to C# / C++ / Python developers.

Swift, on the other hand, has a more conventional design. You can read Swift code the same way you can read C# code, even if you are not a Swift or C# developer.

Swift is a compiled language, although it looks like a scripting language. It is strongly typed and object-oriented, with functional features. It does not have garbage collection; it uses ARC instead.

You can call any Objective-C API from Swift (at least Apple’s APIs), and Apple claims it is faster than Objective-C. Objective-C wasn’t the fastest language out there, but it wasn’t particularly slow either.

Perhaps the killer feature for me is Playground, a kind of sandbox for testing ideas / rapid development / rapid prototyping. BTW, Playground seems to be inspired by (or copied from, if you prefer) Bret Victor’s Inventing on Principle talk, which is a must-watch video for everybody.

Also, Swift has pretty much what John Siracusa asked for in his Copland 2010 article.

So, if Swift is easier to learn, easier to master, less error-prone, faster to develop in, performs better than Objective-C, and you can call Objective-C code from it, why should Apple keep adding features to Objective-C?

I expect that:

  • Objective-C code will be supported on iOS / OS X for the foreseeable future.
  • But new APIs will be added in Swift only. Developers will be forced to migrate to Swift to use them (similar to what happened years ago with the Carbon vs. Cocoa APIs).


Some open questions:

  • Can you call any Objective-C library from it, or only Apple’s APIs? Will Apple release the binding generator? [UPDATE]: Yes, it is possible to call 3rd-party Obj-C libraries from Swift.
  • Will Apple open source the language, or at least submit it to a standards committee?
  • Can you call C and/or C++ libraries from it?

For me:

  • Swift is a very attractive language. If Apple decides to open source it, it has the potential to attract a lot of developers from other platforms as well. I would definitely use it, and I would seriously analyze the possibility of porting cocos2d to it.
  • Bret Victor’s Inventing on Principle was very inspiring. Since the day I watched that video, I have wanted to add similar features to cocos2d. Playground showed us that it is possible to do it with a compiled language.


Integrating LiquidFun with Cocos2d-x: Part I

LiquidFun Testbed + Cocos2d-x
LiquidFun Testbed + Cocos2d-x

From LiquidFun’s site:

Based on Box2d, LiquidFun features particle-based fluid simulation. Game developers can use it for new game mechanics and add realistic physics to game play. Designers can use the library to create beautiful fluid interactive experiences.

Basically, LiquidFun is Box2d plus an extension that simulates fluids using a particle system. To test it, download and install the official LiquidFun Testbed and LiquidFun EyeCandy for Android.

Cocos2d-x already has Box2d integration, so in order to integrate Cocos2d-x with LiquidFun, we only need to integrate this new class: b2ParticleSystem.

LiquidFun’s b2ParticleSystem

I’m not going to describe how to use LiquidFun (for that, read its programmers guide). Instead, I’m going to describe how to integrate b2ParticleSystem in Cocos2d-x (also applicable to any other game engine).

For the integration, what we need is a Cocos2d-x node that knows how to render a b2ParticleSystem. And b2ParticleSystem has these 4 useful methods:

class b2ParticleSystem {
  // Get the number of particles.
  int32 GetParticleCount() const;

  // Get the particle radius.
  float32 GetRadius() const;

  // Get the position of each particle in Box2d's coordinate system
  // Array is length GetParticleCount()
  b2Vec2* GetPositionBuffer();

  // Get the color of each particle in RGBA Uint8 format.
  // Array is length GetParticleCount()
  b2ParticleColor* GetColorBuffer();
};

Ideally we should be able to reuse cocos2d::ParticleSystemQuad for the rendering, but we can’t because:

  • cocos2d::ParticleSystemQuad doesn’t support changing the attractor (this is a design bug, we need to fix it). A nil attractor would be needed for this case.
  • ParticleSystemQuad works with Quads, and not Points. And even if Points were supported (like in Cocos2d-x v1), it wouldn’t work because the points and colors should be in an interleaved array.
  • The other issue is the conversion between the Box2d and Cocos2d-x coordinate systems, but that would be easy to fix.
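As a sketch of what that interleaved array could look like, here is a small helper. The b2Vec2 / b2ParticleColor structs below are minimal stand-ins for Box2d’s types, just enough to show the layout:

```cpp
#include <cstdint>
#include <vector>

// Minimal stand-ins for Box2d's types (assumptions, for illustration only).
struct b2Vec2          { float x, y; };
struct b2ParticleColor { uint8_t r, g, b, a; };

// Interleaves the two parallel arrays returned by GetPositionBuffer() and
// GetColorBuffer() into a single x,y,r,g,b,a stream (colors normalized to
// 0..1), the layout a single glDrawArrays(GL_POINTS, ...) call can consume.
std::vector<float> interleaveParticles(const b2Vec2* positions,
                                       const b2ParticleColor* colors,
                                       int count)
{
    std::vector<float> out;
    out.reserve(count * 6);
    for (int i = 0; i < count; ++i) {
        out.push_back(positions[i].x);
        out.push_back(positions[i].y);
        out.push_back(colors[i].r / 255.0f);
        out.push_back(colors[i].g / 255.0f);
        out.push_back(colors[i].b / 255.0f);
        out.push_back(colors[i].a / 255.0f);
    }
    return out;
}
```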


Installing git

So you have Windows 8.1 + Visual Studio 2013 installed. Now you need to install a git client.

My workflow on Mac is:

  • I use the git command line about 70% of the time.
  • In the remaining 30%, I’m using Tower, Kaleidoscope and Xcode.

So, I was looking for something similar for Windows. And so far, this is my current setup:

  • msysgit, for the git command line.
  • SourceTree for the GUI.
  • I couldn’t find a good stand-alone diff viewer, so I’m using SourceTree’s.

What I like about msysgit is that it installs a Unix-like shell with git auto-completion, and you can also see the current branch in the shell prompt. That is very handy.


SourceTree is also a pretty good, advanced GUI client for git. I used it a lot on Mac before switching to Tower.

In order to have both msysgit and SourceTree working at the same time with your own GitHub repositories, you have to:

a) Create an ssh key from the git shell by running: ssh-keygen


b) From SourceTree -> Tools -> Options, import the newly created key. Make sure you select the “OpenSSH” option, and not “PuTTY”.

c) Then add the public key to your GitHub account.
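Put together, steps a) and c) look roughly like this from the git shell. The key path and email comment are examples only; plain `ssh-keygen` with its defaults works just as well:

```shell
# Generate an RSA key pair non-interactively (a sketch; the path and
# comment are examples, adjust to taste)
mkdir -p "$HOME/.ssh"
ssh-keygen -t rsa -N "" -q -C "you@example.com" -f "$HOME/.ssh/github_rsa"

# The public half is what gets pasted into GitHub for step c)
cat "$HOME/.ssh/github_rsa.pub"
```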

An alternative to msysgit + SourceTree is GitHub for Windows, but I didn’t like it; its GUI is too basic for my needs.

It is worth noting that Visual Studio 2013 Pro (not Express) comes with built-in git support. It is similar to Xcode’s git support.

Sapus, cocos2d and chess

Wow… has it already been a month since the last post?
I’ve been hard at work on cocos2d for iPhone. This last month I released 3 versions of cocos2d for iPhone; perhaps the most important thing is that I started the v0.6.x series, which comes with several API improvements and the Parallax Scrolling feature.
I also released Sapus Tongue Lite, the free version of Sapus Tongue, which has all the features except uploading scores to the server.
And last, but not least, I went back to the chess board. I’m playing the CSM tournament… I’ve half forgotten the openings, but the rest is more or less fresh.

Dessert Dilemma: a new little game

Besides working on cocos2d and Sapus Tongue, these last few weeks I’ve been busy making another game for Cocoa Touch Games: Dessert Dilemma.

The game consists of moving the dessert to one part of the tablecloth while dodging different objects such as forks, knives, spoons, and plates.

The genre is a sort of puzzle + arcade, since the game measures time rather than the number of moves made.

I hope you like it. Today or tomorrow David (from Cocoa Touch Games) will be uploading it to the App Store, and I guess it will be available in 5 or 6 days.

Obviously, I made it with cocos2d for iPhone.

Tools used in the development of Sapus Tongue

  • a: MacBook, Xcode, cocos2d, pgu tile editor, Simulator, Firefox, Gmail, Gimp, iMovie, Audacity, afconvert, iTunes, Adium, and more.
  • b: a squared notebook and a red rollerball pen: prototypes, ideas, calculations, sketches and more prototypes: about 6 pages of drawings.
  • c: iPod Touch 1g, iPod Touch 2g, iPhone 1g: testing, and more testing
  • d: a camcorder (borrowed, thanks Mana and Jose): for the instructions video
  • e: iPod 3g: listening to Mona Jimenez was essential to make the bug-fixing and tuning work more enjoyable
  • f: OpenGL and physics books. I didn’t use the AI book for this game.
  • g: bills I have to pay