Unicycling at Water Dog with the guys
Dragón, Tom, Ned and I on unicycles, and Mateo on a bike. It was a nice ride.
VChar64 v0.0.4 released! And thanks to Henning Bekel, VChar64 now supports xlink. Changes since v0.0.2:
- Added xlink support. For more info: http://henning-bekel.de/xlink/
- Added Export As… Export now exports the current project using the values from the previous export
- Added Tile support
What was interesting about the Keynote:
- Brillo: the OS for the IoT… but, but, but, there were no talks about it, almost no information about it, nothing.
- Android Studio with C++ support: finally :) The NDK really needs love, and having an IDE that supports it is great. Hey, even Microsoft is supporting the NDK now in VS2015.
- Photos is decoupled from Google+ with free unlimited storage: this is great. I've been using Picasa since day one and I never used Google+ to store my photos. So basically Photos is the same as Picasa, but with unlimited storage.
- Offline maps: yeah.
- Chrome Custom Tabs: an interesting alternative for presenting 3rd-party views with the benefits of both the web and native worlds.

What was not-that-interesting about the Keynote:
- The rest. I don't care whether Android has better permissions or not (yeah, the old permissions model sucked, but I don't find that news interesting). The Family section for Google Play is good, but not interesting. Android Pay, meh. Etc.

One thing that I liked, but was not announced in the Keynote, was Project Jacquard. They are using conductive threads and other techniques to create "smart" clothes. Something that Adafruit and Sparkfun have been doing for a while, BTW.
I'm good at software engineering, but in electronics I'm a newbie. Nonetheless, I find electronics fascinating. Last year I did my first basic tutorials with Arduino, then I played a little bit with the Raspberry Pi and the CI20 (technically not really electronics). And this year, I built a very simple circuit to connect the RGBI output of my Commodore 128 to VGA… I have been using a breadboard, so no soldering, no PCB, nothing like that. ...
Who has time to maintain two personal blogs? It is difficult enough to maintain just one. So what I did was merge my unicycle blog (monociclo.com.ar) into this one (towp8.com). I divided the posts into categories: the unicycle blog uses the "unicycle" category, and this blog uses the "programming" category.
I started to code a unicycle game for the Commodore 64. The first thing that I realized was that I needed a game editor, so I started to code one: VChar64. Today I released v0.0.1, which has basic functionality, but the functionality that it has just works. You can download it from here:
Download: https://github.com/ricardoquesada/vchar64
Features:
- Imports raw, PRG and CharPad file formats
- Exports to raw and PRG
- Basic editing functionality: Rotate, Invert, Clear, Shift left/right/up/down
- Multiplatform: Win, Linux and Mac
Screenshot: ...
I stopped developing for the Commodore 64 in 1993. Since then a lot has happened. Back in the late 80's ~ beginning of the 90's:
- I did all my coding using the Commodore 128's MONITOR command. That means no text editor, no compiler, no linker. Similar to the debug.com command that used to be in DOS.
- Since I didn't use a text editor, I put all my comments in a notepad (I still have that notepad somewhere).
- I used the Commodore 128's SPRDEF as the sprite editor.
- I used my own character editor called vchar… (later I created a similar one for DOS and Linux).
- I did some basic graphics using a graphics editor… but I can't remember which one.
- I didn't know any other C64 developer, so I did everything kind of isolated.
- My sources of information were Commodore Magazine, Tu Micro Commodore and some books.
- I reverse-engineered some games / demos in order to learn tricks.
- I had a 300 bps modem, but I didn't find any good C64 BBS.
- I did some cracks for a local company that was "publishing" (AKA pirating) games. In exchange, they provided me with games. To put things into perspective, it was impossible (I mean IMPOSSIBLE) to get original games in Argentina back then.
- I knew some basic tricks, like how to use more than 8 sprites, how to open the top and bottom borders, some raster effects… but nothing very advanced.
- I loaded all my programs / games using the disk drive, which was much faster than the datasette, but still very slow.
- I had a fast-loader cartridge to accelerate the disk drive loading times. It also had a rudimentary MONITOR.
- Although Argentina was using the PAL-N standard, I had an NTSC Commodore 128. In Argentina we also had the Argentinean Commodore, called Drean Commodore, which was a PAL-N machine assembled in Argentina.
For the past 5 or 6 years, I used iPhones as my default phones. But a few months ago I decided to switch to Android. I had tried Android devices before, but never as my default phone. At the beginning I used a Samsung Galaxy S4 (a 2013 5" device), but later I switched to a Xiaomi MI4 (a 2014 5" device with better specs). Without further ado, this is my feedback:

Launcher
In case you don't know what the launcher is, think of it as the shell that allows you to launch applications. It is the first thing that appears when you turn on the phone. In a way, it defines the UX. Each phone maker customizes the Android launcher according to its needs, and you can also download 3rd-party launchers. This is both a good thing and a bad thing. The bad thing is that every phone maker has a different launcher, making it difficult to switch to other Android devices, since the UX is different. Google has its own launcher, called Google Now Launcher, and tries hard to make sure that Android makers don't differ too much from it, although that is not always the case. Samsung, as an example, ships its phones with a launcher called TouchWiz. It doesn't differ that much from Google Now Launcher, but its changes make it a worse phone, not a better one. Xiaomi, on the other hand, ships its phones with a completely different launcher that makes your Android device behave like an iOS device. The good thing is that you can use a different launcher if you don't like the default one (or create your own). This is a good thing because everything that can be configured, changed or replaced is an opportunity for innovation (see below).
The first thing that happened is that in October 2014 I became a father, and that means, besides bringing enormous happiness, that I no longer have as much time as before. Which means I had less time to ride the unicycle and less time to write on the blog. But to summarize, this is what happened:
- I went to Unicon XVII in Canada in August 2014, and competed in Basketball "A", Hockey "B", Muni Downhill "Advanced", Muni Uphill, and Muni Cross-Country "Advanced". Lots and lots of fun.
- I did the San Francisco Uni Tour in September 2014.
- I went to the California Muni Weekend in Los Angeles in October 2014.
- And I kept playing basketball every Tuesday in Berkeley.
- And I tried to ride Muni once a week, although sometimes it wasn't possible.
Photos, here. ...
In Part I I described how to integrate LiquidFun with Cocos2d-x. In this part (Part II) I'll describe how to render the particles using a basic water effect.

Part I uses just one glDrawArrays(GL_POINTS, 0, total); to draw the particles. And although that works to draw "particles", it is not enough to draw "water". Drawing "water" requires a more complex rendering algorithm, like the one used in this example. And implementing an algorithm similar to that one is what this article describes. The algorithm works more or less like this:
- Choose a white circle and blur it. You can blur the circle at runtime, or you can blur it off-line.
- Create a new frame-buffer (think of a clean off-screen buffer where you can render whatever you want).
- Render the particles into the newly created frame-buffer using the blurred circle.
- Now render the frame-buffer into the main color-buffer using a threshold. The threshold could be something like this:
  - If pixel.r < 0.1, discard the pixel (the pixel won't be drawn)
  - If pixel.r < 0.2, draw a blue pixel (for the border, although this is optional)
  - Else, draw a white pixel (the inner part of the water)

How to do it using Cocos2d-x and LiquidFun

Let's take the LFParticleSystemNode from Part I, and "evolve" it. The first thing to do is to add the "off-screen" frame-buffer to the LFParticleSystemNode class. In Cocos2d-x, the "off-screen" buffers are created with the RenderTexture class. Example:

[code language="cpp"]
bool LFParticleSystemNode::init(b2ParticleSystem* particleSystem, float ratio)
{
    // …

    // create an off-screen frame-buffer with the size of the screen
    auto s = Director::getInstance()->getWinSize();
    _renderTexture = cocos2d::RenderTexture::create(s.width, s.height, Texture2D::PixelFormat::RGBA8888);
    this->addChild(_renderTexture);
    _renderTexture->setAnchorPoint(Point::ANCHOR_MIDDLE);
    _renderTexture->setPosition(Point(s.width/2, s.height/2));

    // Change the default shader. Use the threshold shader instead
    auto program = GLProgram::createWithByteArrays(_renderTextureShaderVert, _renderTextureShaderFrag);
    auto programState = GLProgramState::getOrCreateWithGLProgram(program);
    programState->setUniformFloat("u_threshold_discard", 0.15);
    programState->setUniformFloat("u_threshold_border", 0.3);

    // …
}
[/code]

And, as mentioned earlier, the RenderTexture (the off-screen frame-buffer) needs a shader with a threshold. The threshold shader should look like the following:

[code language="cpp"]
varying vec4 v_fragmentColor;
varying vec2 v_texCoord;

uniform float u_threshold_discard;
uniform float u_threshold_border;

void main()
{
    vec4 color = v_fragmentColor * texture2D(CC_Texture0, v_texCoord);

    if (color.r < u_threshold_discard)
        color = vec4(0, 0, 0, 0);        // black or discard
    else if (color.r < u_threshold_border)
        color = vec4(0.2, 0.2, 0.9, 1);  // blue for the border
    else
        color = vec4(1, 1, 1, 1);        // white for the center

    gl_FragColor = color;
}
[/code]

The values u_threshold_discard and u_threshold_border are defined at runtime. In the example, they are set to 0.15 and 0.3 respectively.

The next thing to do is to render the particles into the RenderTexture:
[code language="cpp"]
void LFParticleSystemNode::draw(Renderer *renderer, const Mat4 &transform, uint32_t transformFlags)
{
    // tell RenderTexture to "capture" the particles
    _renderTexture->beginWithClear(0, 0, 0, 0);

    _customCommand.init(_globalZOrder);
    _customCommand.func = CC_CALLBACK_0(LFParticleSystemNode::onDraw, this, transform, transformFlags);
    renderer->addCommand(&_customCommand);

    // tell RenderTexture to stop "capturing" the particles
    _renderTexture->end();
}
[/code]

The result is the following:
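As a side note, the onDraw() callback scheduled by the CustomCommand above is not included in this excerpt. Below is a minimal sketch of what it might look like, not the article's actual code: it assumes that init() also stored the LiquidFun particle system, the meters-to-pixels ratio and the blurred-circle texture in hypothetical members _particleSystem, _ratio and _texture, and that the node's GLProgramState holds a point-sprite shader (one that sets gl_PointSize and samples the blurred circle through gl_PointCoord), as in Part I:

[code language="cpp"]
// Hypothetical sketch, not the article's actual code. Assumes _particleSystem,
// _ratio and _texture were stored in init(), and that getGLProgramState() holds
// a point-sprite shader (sets gl_PointSize, samples the blurred circle via
// gl_PointCoord). Needs <vector> plus the cocos2d-x and LiquidFun headers
// already used in Part I.
void LFParticleSystemNode::onDraw(const Mat4 &transform, uint32_t /*transformFlags*/)
{
    // particle positions as reported by LiquidFun, in Box2D meters
    const b2Vec2* positions = _particleSystem->GetPositionBuffer();
    int total = _particleSystem->GetParticleCount();

    // convert meters to pixels using the ratio passed to init()
    std::vector<cocos2d::Vec2> vertices(total);
    for (int i = 0; i < total; ++i)
        vertices[i].set(positions[i].x * _ratio, positions[i].y * _ratio);

    // bind the blurred-circle texture, feed the positions to the shader and
    // draw one point sprite per particle
    cocos2d::GL::bindTexture2D(_texture->getName());
    auto glProgramState = getGLProgramState();
    glProgramState->setVertexAttribPointer("a_position", 2, GL_FLOAT, GL_FALSE, 0, vertices.data());
    glProgramState->apply(transform);
    glDrawArrays(GL_POINTS, 0, total);
}
[/code]

Since the blurred circles of neighboring particles overlap inside the RenderTexture, the threshold shader shown earlier can turn that soft blob field into a solid water surface with an optional blue border.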