Lifeforce Development Journal
by Navis of ASD

I'm Navis, the coder of ASD, and this is my day-to-day diary of my progress with our latest demo (Lifeforce), destined for Assembly 2007. I started writing it a few months into the project so as not to bore you with the details of the first, lonely days of the demo. After all, nothing too important was happening. The following 7 weeks that I describe represent the core of the development.

A couple of things about me: I have a full time job and recently (almost two months ago) became a father, which means that free time is very difficult to find. I usually try to code the demo before going to work or before going to bed at night.

So, to answer the obvious question (why are you doing this?): I really enjoy making the odd demo now and then, mostly because it is creative and I get to work with some really cool and talented guys in ASD. I like to claim that I've made this demo with my bare hands, and this is actually quite true. ASD demos are as handmade as they get. This means that I don't use a scripting tool or editor of some sort: I write hardwired code for the effects, cameras, animations, shaders, everything. I feel that this gives me extra freedom. The drawback is that development times might be a bit longer than usual.

"Lifeforce" has been under development since probably December. First, I drafted some ideas - no coding yet - and then updated my graphics libraries. I have added some support for vector graphics / image processing, all taken almost directly from our demo "Evolution of Vision".

I first started coding parts in February. For a demo the size of "Lifeforce" (which will probably be almost as big as "Iconoclast") I normally split the work into several parts and worry about transitions later. "Lifeforce" will end up having something like 8-10 different parts, each one lasting from 30 seconds to one minute. In theory, I need to code one part every two weeks, otherwise we might run out of time. Thankfully it is still late May and I'm more than halfway there...

20-May

One part down, many still left. There are probably 5 different big tasks left now, 4 of which are new parts. The last one, a biggie, is to put everything together. Without further ado I must start the next part. This is going to feature some quite difficult effects. What is needed is realistic water coupled with a giant explosion in the distance. My instinct tells me to emulate the explosion with the Navier-Stokes equations for liquids and see what happens.

21-May

Working on a brand new Navier-Stokes-based particle system. Looks like I need thousands of these particles to give volume to my explosion.

22-May

Disaster. While I was right in thinking that the liquid dynamics would produce a 'mushroom' explosion, the end result looks completely artificial and is quite slow. I need to start from scratch and do it the normal way, using pre-determined particle systems. Cloud and fire should move up, swirl, and fade out. There must also be some sort of 'lighting' of the cloud - if there is a sun above, then the smoke should be bright at the top and dark at the bottom. Time to download 'ParticleIllusions 3.0' and study their particle systems. YouTube is another place to find video clips of large-scale explosions.
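The basic idea, in rough C++-style code (all names and constants here are illustrative, not the demo's actual code): particles rise, swirl around the column, fade out with age, and are lit brighter towards the top of the cloud.

#include <algorithm>
#include <cmath>
#include <vector>

// One smoke/fire particle of the explosion column.
struct Particle {
    float x, y, z;      // position, y is up
    float age, life;    // seconds lived / total lifetime
    float phase;        // per-particle offset so the swirl is not uniform
};

// Pre-determined motion: drift upwards, swirl around the vertical axis.
void UpdateExplosion(std::vector<Particle>& ps, float dt, float time) {
    for (Particle& p : ps) {
        p.age += dt;
        p.x += 0.8f * std::sin(time + p.phase) * dt;   // swirl
        p.z += 0.8f * std::cos(time + p.phase) * dt;
        p.y += (2.0f + 0.5f * p.phase) * dt;           // rise
    }
}

// Brightness of a particle's billboard: fade with age, and darken towards
// the bottom so the cloud looks sun-lit from above.
float ParticleBrightness(const Particle& p, float cloudTop) {
    float fade  = std::max(0.0f, 1.0f - p.age / p.life);
    float light = std::min(1.0f, 0.3f + 0.7f * (p.y / cloudTop));
    return fade * light;
}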



23-May

The explosion effect looks quite realistic now. I added some volumetric light coming out of the smoke to improve the end result. This whole effect needs to be over moving water - the sea. So there are two obvious problems:

- Making waves move realistically.

- Rendering the water so that it looks good enough for the demo.

The first problem will take some time to solve. I could use Perlin noise - would that be good enough? The next step is to write a Perlin noise generator.
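For reference, a minimal 2D gradient-noise generator, plus a few octaves summed into a wave heightfield, looks roughly like this; this is a generic sketch, not the generator that ended up in the demo.

#include <cmath>

// Hash a lattice point to a pseudo-random gradient direction and return
// the dot product with the offset vector (dx, dy).
static float Grad(int ix, int iy, float dx, float dy) {
    unsigned h = (unsigned)ix * 374761393u + (unsigned)iy * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    float angle = (h & 0xffffu) * (6.2831853f / 65536.0f);
    return std::cos(angle) * dx + std::sin(angle) * dy;
}

static float Fade(float t) { return t * t * t * (t * (t * 6 - 15) + 10); }
static float Lerp(float a, float b, float t) { return a + t * (b - a); }

// 2D Perlin-style gradient noise, roughly in [-1, 1].
float Perlin2D(float x, float y) {
    int x0 = (int)std::floor(x), y0 = (int)std::floor(y);
    float fx = x - x0, fy = y - y0;
    float u = Fade(fx), v = Fade(fy);
    float n00 = Grad(x0,     y0,     fx,        fy);
    float n10 = Grad(x0 + 1, y0,     fx - 1.0f, fy);
    float n01 = Grad(x0,     y0 + 1, fx,        fy - 1.0f);
    float n11 = Grad(x0 + 1, y0 + 1, fx - 1.0f, fy - 1.0f);
    return Lerp(Lerp(n00, n10, u), Lerp(n01, n11, u), v);
}

// A sea heightfield is then just a few octaves summed together and scrolled.
float WaveHeight(float x, float z, float time) {
    float h = 0.0f, amp = 1.0f, freq = 1.0f;
    for (int octave = 0; octave < 4; ++octave) {
        h += amp * Perlin2D(x * freq + time * 0.3f, z * freq);
        amp *= 0.5f;
        freq *= 2.0f;
    }
    return h;
}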

24-May

Perlin noise and parallax environmental mapping with reflection work like a charm! I added all that into a shader. It is not very fast on my old GeForce FX5600 but hopefully it will run at full framerate at Asm. Now I need to write a tsunami/tidal wave/smoke coming into your face effect. A good starting point is to watch "Second Reality" again - there is a good explosion in the first scene. It looks like it needs a brute force particle system for smoke and fire: the more particles I use the better. Of course, by adding more and more particles I demand more fillrate and the framerate drops. It needs to be optimized to find the best balance. I find that the addition of a 'blooming' filter on top helps a lot. However, there is a big dilemma: should the scene end with a fade to white or a fade to black? For that I should ask the other guys in ASD to get some extra opinions.

25-May

The effects of this part are all done. I'll spend the day working on the cameras. Camera programming is quite tedious, as I combine many camera tracks (up to 20 for every minute of content) by means of interpolation to acquire a single, perpetually moving viewpoint that highlights my effects. All tracks are hardcoded as pre-defined splines or linear functions over time. The process is soul-consuming, so I am planning a break for tomorrow.
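The kind of code this involves, sketched in C++ (the real tracks are hand-tuned, and all names here are made up): each track is a Catmull-Rom spline through hardcoded keyframes, and neighbouring tracks are cross-faded to get one continuous viewpoint.

#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Evaluate a Catmull-Rom spline through hardcoded keyframes, t in [0,1].
Vec3 EvalTrack(const std::vector<Vec3>& keys, float t) {
    int n = (int)keys.size();
    float f = std::max(0.0f, std::min(1.0f, t)) * (n - 1);
    int i = std::min((int)f, n - 2);
    float u = f - i;
    Vec3 p0 = keys[std::max(i - 1, 0)], p1 = keys[i];
    Vec3 p2 = keys[i + 1], p3 = keys[std::min(i + 2, n - 1)];
    return (p1 * 2.0f
          + (p2 + p0 * -1.0f) * u
          + (p0 * 2.0f + p1 * -5.0f + p2 * 4.0f + p3 * -1.0f) * (u * u)
          + (p0 * -1.0f + p1 * 3.0f + p2 * -3.0f + p3) * (u * u * u)) * 0.5f;
}

// Cross-fade two tracks; chaining many of these gives the single,
// perpetually moving viewpoint.
Vec3 BlendTracks(const std::vector<Vec3>& a, const std::vector<Vec3>& b,
                 float t, float weight) {
    return EvalTrack(a, t) * (1.0f - weight) + EvalTrack(b, t) * weight;
}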



26-May

I spent the day with my family and some friends. I cooked a nice dinner to go with the film on TV. Before bed I spent some time studying my motion graphics books.

27-May

Back to that 'almost finished' part. It now plays from start to finish, with cameras all set up perfectly. It lasts 23 seconds. I optimize some of my code by using my own "profiler" -- a simple tool that records the framerate for each frame. In every iteration I remove some particles and see how it visually affects the scene and the framerate. After a few hours of work there is a 15% increase in median framerate. Not much gain for all that trouble but it was a dirty job that somebody had to do.
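The "profiler" amounts to little more than something like this (an illustrative sketch, not the actual tool):

#include <algorithm>
#include <cstdio>
#include <vector>

// Record the instantaneous framerate of every frame and report the median
// at the end of a run, so optimizations can be compared between iterations.
class FrameProfiler {
public:
    void OnFrame(float frameSeconds) {
        if (frameSeconds > 0.0f) fps_.push_back(1.0f / frameSeconds);
    }
    float MedianFps() const {
        if (fps_.empty()) return 0.0f;
        std::vector<float> sorted(fps_);
        std::sort(sorted.begin(), sorted.end());
        return sorted[sorted.size() / 2];
    }
    void Report() const { std::printf("median fps: %.1f\n", MedianFps()); }
private:
    std::vector<float> fps_;
};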

28-May

It is a holiday today, which means I have time to go back and revise some of the visuals before calling it a day with this scene and moving to the next one. I adjust some colors and textures. Unfortunately, ASD's own 2D artist, Amoivikos, is out of commission (he is doing his military service) so I also have to do the 2D work. It is quite enjoyable. Today I can improve the look of the fluffy clouds I use all over the place. Since I don't have much time for more development, I kkapture the new scene and send it in haste to the rest of the group so they can see my work and approve it. They should be quite happy with the end result. It fits very well with the rest of the demo in terms of continuity.

The color scheme (conceptual, like Iconoclast) is still forming. The demo has a cyclic narrative: there is the element of re-occurrence of visuals in a universe that is evolving. So, for example, the scene I just finished is also visited at the beginning of the demo. It looks very different there, but certain elements are present on both occasions. What happens in between, and how we get from point A back to point A with a twist, is described in the intermediate scenes. The concept is still quite abstract and we all need to work on finalizing it. There is also some connection to Iconoclast, but I'll expand more on that when I reach that point in the future.

In the evening I have a long bath and feel like the warrior coming back from the battle. Another part down, more to come for tomorrow. Problem is, it is difficult to make up my mind on what part to follow with. Maybe finish the really hard stuff first.

29-May

Time for some research on flocking systems, how they work, how they can be optimized. I'm looking for a good solution for emulating the behaviour of a large (2k+) number of elements interacting with each other, supposedly under water. This system must also interact with external objects. I imagine the external objects being the credits and this part being the last one in the whole demo. It's a calm part with low saturation colors, easy music... something like the end of Planet Risk. But the end of Planet Risk was lazy: just an image with a fancy liquid displacement shader on top. This one has to be a real 3D scene with lots happening, like objects swirling and moving and rotating and our names falling in the distance -- debris after a giant explosion.
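A classic boids-style update is the obvious starting point; roughly like this (illustrative names and weights, and for 2000+ elements the brute-force inner loop would be replaced by a uniform spatial grid):

#include <cmath>
#include <vector>

struct V3 { float x, y, z; };
static V3 operator+(V3 a, V3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static V3 operator-(V3 a, V3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static V3 operator*(V3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float Len(V3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

struct Boid { V3 pos, vel; };

// One flocking step (separation, alignment, cohesion) plus repulsion from
// external objects such as the falling credit letters.
void UpdateFlock(std::vector<Boid>& boids, const std::vector<V3>& obstacles,
                 float radius, float dt) {
    for (Boid& b : boids) {
        V3 sep{0, 0, 0}, avgVel{0, 0, 0}, center{0, 0, 0};
        int neighbours = 0;
        for (const Boid& other : boids) {
            V3 d = b.pos - other.pos;
            float dist = Len(d);
            if (&other == &b || dist > radius) continue;
            sep = sep + d * (1.0f / (dist * dist + 0.01f));       // push apart
            avgVel = avgVel + other.vel;
            center = center + other.pos;
            ++neighbours;
        }
        if (neighbours > 0) {
            float inv = 1.0f / neighbours;
            b.vel = b.vel + sep * 0.05f
                          + (avgVel * inv - b.vel) * 0.03f        // alignment
                          + (center * inv - b.pos) * 0.01f;       // cohesion
        }
        for (const V3& o : obstacles) {                           // avoid the credits
            V3 d = b.pos - o;
            float dist = Len(d);
            if (dist < radius) b.vel = b.vel + d * (0.1f / (dist + 0.01f));
        }
        b.pos = b.pos + b.vel * dt;
    }
}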

30-May

The problem now is to emulate a water surface as viewed from inside the sea, looking up to the sky. Early in the morning I predict that it should be a simple case of the water effect (only viewed upside down) with lots of godrays, rendered with triangles, and some radial blur/volume lights. The fine tuning of the right amount of bump and refraction, and the selection of colors (blue, mostly, but some speculars of white/pink are also nice), takes a couple of hours of my evening. I think that eventually all this work on the surface shader will be in vain, as most of it will be obscured by objects and fogging.

31-May

A single camera should do for this effect: a long backtracking shot implying that the viewer is falling in water while looking at the surface. The last thing to add before playing with the colours is the credits. The credits are binary images that are internally vectorized and cast into splines. The splines start off at random points and interpolate to their end positions, all in phase. The effect of this is quite interesting: it appears as if thin strands are moving and swirling and eventually construct a message, only to dissolve again. I already have the code for that from Lithography (which features a similar effect at the very beginning). The major problem here is space: there are simply too many names and titles to draw: code, graphics, 3D scanning, music. I think I will let Amoivikos decide on how we are going to do the graphics on this one. He can also decide on the typography - my personal preference is to go for a calligraphic font.
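The core of the credits effect is very little code; something along these lines (a simplified sketch, not the Lithography code itself): every control point of a vectorized letter starts at a random position and eases, in phase with all the others, towards its final position on the glyph.

#include <cstdlib>
#include <vector>

struct P2 { float x, y; };

// One control point of a vectorised letter: a random start position and
// the final position extracted from the binary credits image.
struct StrandPoint { P2 start, target; };

static float Ease(float t) { return t * t * (3.0f - 2.0f * t); }   // smoothstep

// All points move in phase: at t=0 the strands are random swirls,
// at t=1 they spell out the credit line.
std::vector<P2> StrandShape(const std::vector<StrandPoint>& pts, float t) {
    float e = Ease(t < 0 ? 0 : t > 1 ? 1 : t);
    std::vector<P2> out;
    out.reserve(pts.size());
    for (const StrandPoint& p : pts)
        out.push_back({ p.start.x + (p.target.x - p.start.x) * e,
                        p.start.y + (p.target.y - p.start.y) * e });
    return out;   // these points are then rendered as a spline/ribbon
}

// Helper used when the scene is set up: scatter the start positions.
StrandPoint MakePoint(P2 target, float spread) {
    auto r = [spread] { return (std::rand() / (float)RAND_MAX - 0.5f) * spread; };
    return { { target.x + r(), target.y + r() }, target };
}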

1-June

After a very long night (I went to bed at 3.30 am) I decide that this part is now done. I am quite surprised to find that adding black dot particles has added a lot to the realism of the scene. Indeed it now looks quite gritty and dirty rather than plastic -- perfect. I think that the music for this part should be something mellow to help the viewer wind down and enjoy the credits as they roll. A new addition is a number of objects that appear to float to the surface: a hand holding a broken sphere, which is the broken moon from the previous scene. These objects are rendered in pure black to indicate that they obscure the only light source, which is the sun beyond the water surface.

2-June

Time to move on to another part then. It is a slow day and I have lots to do away from the PC so I don't do much more than preparing the classes for the next part. This will feature a collection of high-poly effects rendered with (a first for ASD) environmental shading!

3-June

I have a long discussion with Amoivikos who shows me his latest OpenGL effect. He is currently doing his military service at the top of a mountain on a Greek island and his only companions are his laptop, the internet and good ol' OpenGL. These days he is writing a cloth engine which I might use in the demo at some point. I imagine an effect with banners moving as the camera moves through them.

Later in the afternoon I start the modelling work for the new part. The usual pipeline is: Wings3D for modelling and UV mapping, then Blender for ambient occlusion baking and finally 3D Studio Max for exporting materials and animations. It is quite frustrating doing all this with one arm (the other holding a baby) and having to cope with the abysmal interface of Blender. The end result -- a man climbing a mountain -- looks quite convincing. It is all down to the ambient occlusion of course... I still don't know how I lived without it all these years!

No coding for today, too tired from point-and-click in Wings3D.

4-June

Another waste of a day. I try ambient occlusion on a model with a very peculiar manifold which breaks Blender time after time. After many attempts to reconstruct the manifold I finally give up. The model in question is a room with a hole that leads to an exterior surface. This model will surround my effect. Eventually I realise that I have to either forget about the hole and the exterior or combine two completely different models (one for the interior and one for the exterior) and hope that they blend well. Eventually I go to bed only to realise that the problem is actually not that difficult (after all, many games mix interior and exterior surfaces) and I must be very tired in general or getting old fast.

5-June

Woke up at 7am to continue working on the models. When it gets to 8:30am I ride to work, where I install Blender to continue what I was doing at home. While everybody is looking and probably wondering what the hell I am doing, I sit back and enjoy watching the ambient occlusion textures getting rendered to perfection. I plan to continue with some demo coding while at the office (and to hell with work!) till it is time to go home around 5pm.

6-June

The scene I am currently working on now has the following elements: a man climbing a mountain, hand stretched, splines coming out of the hand and circling a beating heart. The obvious problem is how to model a convincing heart. I get the idea of using my academic connections to acquire a model of a proper heart which has been triangulated using the marching cubes algorithm. The model is about 20k vertices and the good people who did the work for me are credited in the demo. Since it is going to be rendered using environmental shading I don't need to add a layer of ambient occlusion -- normal shading will do just as well.

I spend a few hours at work adjusting the model and correcting the few triangles that have strange normals. In the evening I set the cameras so that the viewer flies through the mountains and over the man with the stretched hand. The fine tuning of cameras is always a laborious task. I chuckle at the idea that such camera movements might have been almost impossible had I resorted to a 'demo editor'. I firmly believe that a demo like this (and like our previous big productions) simply cannot be created using a demo editor; it can only be done with lots of manual work. Eventually I go to bed at 3am, which is mad because I have to wake up at 7am to look after the baby before going to work (where I will continue demo coding/modelling).



7-June

I keep having these terrible anxiety dreams about demos that crash, parties that go wrong and school exams. I must get some extra sleep because today I feel utterly drained and my head is spinning. I get some work done though: I model some chain links which might be added into the scene at some point in the future. I also write a vertex shader to emulate the heart beat. It might sound like an easy task but it is anything but. The heart muscle movement is very complex indeed. But I have good old YouTube to help me: there are plenty of medical training videos and captures of CG beating hearts over there. Another 5 seconds of camera movement are added into the scene. Now I have to find where the main arteries are in space in order to draw my splines. Maybe a good idea would be to use rope physics for the valves and render some as tubes and others as chains.
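The heartbeat displacement boils down to something like the following, here written CPU-side for clarity (the real thing lives in the vertex shader, and all constants are illustrative): an asymmetric pulse, fast contraction and slow relaxation, pushed along the vertex normal with a small phase delay so the beat travels through the muscle.

#include <cmath>

// Asymmetric pulse: fast contraction (systole), slow relaxation, repeating
// every 'period' seconds.
float HeartPulse(float time, float period) {
    float t = std::fmod(time, period);
    if (t < 0.0f) t += period;
    t /= period;                                         // 0..1 within the beat
    float contraction = std::exp(-t * 18.0f);            // sharp initial squeeze
    float rebound = 0.35f * std::exp(-(t - 0.3f) * (t - 0.3f) * 60.0f);
    return contraction + rebound;
}

// Push a vertex along its normal; 'height' (0 at the apex, 1 at the top)
// delays the pulse slightly so the contraction travels through the muscle.
void DisplaceVertex(float pos[3], const float normal[3], float height, float time) {
    float amount = 0.08f * HeartPulse(time - 0.15f * height, 1.0f);  // ~60 bpm
    pos[0] += normal[0] * amount;
    pos[1] += normal[1] * amount;
    pos[2] += normal[2] * amount;
}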

8-June

Woke up too late for any demo coding before getting to work, but I had a pleasant surprise waiting for me there: The first minute of music from Amusic. All I can say is that I am very pleased with the direction he chose -- it blends really well with the visuals at the beginning of the demo. My plan for today is actually to do some work during the day and later return home to concentrate on some new camera paths and animation for the heart.

9-June

Weekend. Time for either relaxation or intense demo coding. I choose the second. I start with implementing the chains and veins/arteries for the heart. This is very straightforward since I have my own rope physics engine. The veins/arteries have, naturally, two attachments. One will be on the heart itself and the other on the walls of the cubemap. In order to show that the veins/arteries are not static I change the acceleration of gravity as a function of time, using sines/cosines. Now it looks as if wind is blowing through the scene. Another addition to the scene is some growing tubes which come out of the hand of the man and swirl around the heart. I position the camera so that the growth can be seen rather than the end result. In my book this is more important: the journey rather than the destination. I call it a day after mending my cameras. The scene is now 40 seconds but it feels like 10. And this can only be a good thing if your demo is over 7 minutes long.
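For reference, this kind of rope/vein reduces to a standard Verlet rope with a wobbling gravity vector, roughly like this (a generic sketch with assumed constants, not ASD's actual engine):

#include <cmath>
#include <vector>

struct Vec { float x, y, z; };

// Verlet-integrated rope: the first node is pinned to the heart, the last
// to the surrounding cubemap wall, and the "gravity" vector is wobbled with
// sines/cosines so the strands appear to drift in a current.
struct Rope {
    std::vector<Vec> pos, prev;     // current and previous node positions
    float segment;                  // rest length between neighbouring nodes

    void Step(float dt, float time) {
        // Time-varying pseudo-gravity, as described above.
        Vec g = { 0.6f * std::sin(time * 0.7f), -1.0f, 0.6f * std::cos(time * 0.5f) };
        for (size_t i = 1; i + 1 < pos.size(); ++i) {    // endpoints stay pinned
            Vec p = pos[i];
            pos[i].x += (p.x - prev[i].x) + g.x * dt * dt;
            pos[i].y += (p.y - prev[i].y) + g.y * dt * dt;
            pos[i].z += (p.z - prev[i].z) + g.z * dt * dt;
            prev[i] = p;
        }
        // A few relaxation passes keep neighbouring nodes at 'segment' distance.
        for (int pass = 0; pass < 4; ++pass)
            for (size_t i = 0; i + 1 < pos.size(); ++i) {
                Vec d = { pos[i+1].x - pos[i].x, pos[i+1].y - pos[i].y, pos[i+1].z - pos[i].z };
                float len = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z) + 1e-6f;
                float corr = 0.5f * (len - segment) / len;
                bool pin0 = (i == 0), pin1 = (i + 1 == pos.size() - 1);
                if (!pin0) { pos[i].x += d.x*corr; pos[i].y += d.y*corr; pos[i].z += d.z*corr; }
                if (!pin1) { pos[i+1].x -= d.x*corr; pos[i+1].y -= d.y*corr; pos[i+1].z -= d.z*corr; }
            }
    }
};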



10-June

I must have spent 5 hours today just fine tuning details such as the colour scheme and the cameras. These cameras seem to take so much time to make from scratch but I firmly believe there is no way around that. Around midday I have a revelation as to how I can introduce the greetings in the demo. What if I have the greetings as strips of text moving over the veins that come out of the heart? I try this and it doesn't work. The text is virtually unreadable. I adjust the projection so that the text is halfway between proper 3D and billboard and suddenly it all clicks together. There is space for exactly 9 names, so if your demo didn't make it, blame it on the design - this effect is the definition of form over function! Looks like this part is also heading for completion. I make an AVI in haste to show the rest of the group what has been achieved over the weekend.
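In code terms, one way to get that 'halfway between proper 3D and billboard' look is to blend the rotation part of the strip's modelview matrix towards the identity before drawing (a sketch of the general idea, not necessarily the exact adjustment used in the demo):

// blend = 0 gives proper 3D text lying on the vein, blend = 1 gives a
// screen-aligned billboard; around 0.5 the text stays readable but still
// follows the geometry. Column-major, OpenGL-style 4x4 matrix.
void SemiBillboard(float m[16], float blend) {
    const float identity[9] = { 1, 0, 0,  0, 1, 0,  0, 0, 1 };
    for (int col = 0; col < 3; ++col)
        for (int row = 0; row < 3; ++row) {
            int i = col * 4 + row;                 // index into the modelview
            m[i] = m[i] * (1.0f - blend) + identity[col * 3 + row] * blend;
        }
    // Note: the blended 3x3 is no longer orthonormal; for moderate blends
    // this only adds a subtle scale, which is acceptable for a text strip.
}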

11-June

AMusic watches the video. He likes it but suggests that I put some translucency into the model of the heart. Technically not difficult, but I am afraid that it will affect the framerate just when I need to cross-fade into the next scene. I spend all day at work daydreaming and secretly adjusting my heart model in Wings3D. I also investigate the potential of using the .ai (Adobe Illustrator) file format for reading/rendering vector data. But it seems to be too much of a hassle at this stage of the project so I abandon the idea. According to my calculations I should have a first draft of all parts of the demo in exactly 4 weeks from now - on the 9th of July, which is also my birthday. It will be a nice birthday present to myself. AMusic's recommendations mean that I will have to spend at least another day working on the heart part.

12-June

The translucent beating heart now looks much better and the framerate hit is not as massive as I thought it would be. I spend the day away from my computer as I go to Manchester to attend a conference. I get back home at 9:00 pm, then shove some food down my throat and sleep.

13-June

An early start to the day finds me working on the next scene, which will feature a lot of 3D modeling, some L-system fractals and quite a lot of camerawork. The model I want to construct is a bathroom. It features a bath, a sink, a toilet, walls and floor, and a man inside the bath who appears to have died from unnatural causes. The purpose of this scene is to enhance the narrative. The man in the bath (who apparently has committed suicide) relates to all the organic and inorganic elements that have passed before our eyes so far in the demo: the recurring elements of blood (used in many scenes before) and animals/trees swirling over stone-cold marble statues. It is his life that passes in front of his eyes, a universe full of antitheses. It is interesting to note that all the scenes with happy colours and movement are cut short by scenes with cold shades and large, static man-made constructions (for example the beating heart is followed by the statue of the praying men).

It is interesting to find out that I can actually undertake the enormous task of carefully modeling a bathroom (with textures and everything) all by myself in one day. Wings3D almost reads my mind and, with the help of some reference materials, I construct a rather realistic-looking bathroom in less than 3 hours from scratch. I spend the rest of the day in Blender doing ambient occlusion baking and playing backgammon.

14-June

With the textures in place I start the coding for the scene. First is an idea that Amoivikos had, to introduce the scene by opening two 'curtains'. Lots of work for only 3 seconds of content, but it is good to add little details that viewers will only appreciate with repeated viewings. I try to add some cameras so as to show what is happening in the room: I identify a place to put a message that supposedly was written by the dead man in the bath. More texture work for Amoivikos then, he will be quite pleased.

15-June

A small pause from demo coding to organize one very important effect that will follow: An infinite image zoomer similar to the one used in our old demo Cadence and Cascade. I speak to Archmage on MSN about it and try to agree on content and dimensions of images. In ASD we are all very excited by his contribution -- his graphics are nothing short of spectacular and I cannot wait to see what he has to offer. His zoomer part will be placed just before the polaroids/nuclear explosion part and just after the bath scene. Better organize it while we have time: only 47 days left!

16-June

The scene is static so I need to add some action. My original thought was to have some snakes growing on the floorboards. Instead, a better idea is to have a few black, untextured splines moving in from outside the window into the bathroom. I construct the rules of motion for the splines with extra care: they should move, wriggle and avoid obstacles such as the floor and bath, all of which requires a fair amount of coding.

Another thing I try is making the floor reflective: this is a technical challenge, as I have to render the scene twice and keep the framerate high. After toiling over the keyboard for two hours I realise that the end result is actually not that much better than how it was before. To make things worse, my framerate has dropped quite considerably. I decide to just drop the idea of a reflective floor altogether.

Meanwhile I keep having these weird anxiety dreams about demo parties. In my dreams our demo never quite works and it either crashes or has parts missing! In reality this has only happened once, back in 1996. There and then I learned a good lesson: never wait till the last minute to finish a production. It is more important to spend an extra month working on the little details and making sure that it will run on the supported hardware than getting overly stressed at the party.

17-June

The bathroom scene is now finished. There are small corrections to be made, such as optimizing the splines and fixing some intersection glitches. Amoivikos notices that the shading probably needs some attention as the objects do not reflect any light at all. I spend the rest of the day trying solution after solution, from bump mapping to projected flare textures. The latter seems to work like a charm: the texture map of the surfaces alters the projection matrix in order to give a more 'diffused' reflection. I like that effect so much that I replace some other shaders scattered around the demo with this 'diffused reflection' version. Before going to bed I briefly start coding on the next and final part, which will be merged with the bath scene. The next scene (code named 'animation', because it will feature heavy use of videos) should take some time to finish. It will only be 40 seconds but I'm willing to add as much detail as possible -- as painful and expensive as that might be.

18-June

Back to work after the weekend. While at my desk I cannot think of anything other than Lifeforce. I briefly do some modelling in Wings3D: a satellite dish that I think will fit in the animation scene at some point in the future. When I go back home I decide that the time has come to record my first animation. My configuration consists of:

- A Logitech STX web camera linked through USB to my PC

- Some bluetack to attach the camera to my camera dolly

- The camera dolly, which is the baby's pushchair - the baby is inside, sleeping

- A white wall

- A collection of lamps to light the wall

I become the actor, director, cinematographer, cameraman, light engineer and editor. After about 50 takes I have a very presentable animation which I convert with my own tool into a vectorized animation similar to the ones in the old Amiga demo '9 Fingers'. I then spend a couple of hours adding the animation into the scene. This scene consists of a long horizontal pan similar to the ones found in Planet Risk, Iconoclast and EON, only this one has much faster cameras and is a lot darker and, I believe, more beautiful. My only concern is how it is going to look on the big projector. I have to think about gamma correction at some point.







19-June

6am and I'm adding some black splines into the animation scene. There is me extending my arms, and out of the arms come these splines that wrap around objects. I am not quite sure yet whether these lines should have any kind of shading or not. Eventually, after a lot of experimentation, I choose to make them mainly black but appear to get red 'hot' when entering a crack on the plane where the pan takes place. The task of defining these splines by hand is quite laborious. The movement is mainly procedural, through the use of sin/cos and random values, but the main turns and twists have to be fine-tuned by hand. My vision for this part is that the splines eventually float over the city (a vector graphic which appears at the beginning of the demo) and shoot off to some clouds that hold the satellite dishes. The dishes, in turn, will beam some rays which will do something else, also involving vector animation, but this time layered in 3D. That should produce some very exciting visuals. I look forward to that.

20-June

I must take it easy now. There is plenty of time to organize this last scene, and because of its weight I must be careful with my steps. At my office I finish the model of the satellite dish in Wings3D. Because it is going to be rendered in black there is no need for light baking or other textures. I add the satellites to the scene and, in order to make them look more interesting, I rotate them as soon as the camera closes in. Their movement is very 'robotic' and reminds me of those pesky robots in Impossible Mission for the C64.

21-June

I make the decision to revisit the effect of 'strips coming out of objects' once again. This time the strips are white and come out of the satellite dish. I manually tune the splines so that they rotate about 180 degrees around the satellite and shoot off. The camera up-vector keeps changing all the time, which gives me a bit of a headache after a few hours of fiddling with this effect.

22-June

While reading my design notes I realize that there is an interesting effect that I haven't used yet and it would be ideal for this part. The effect is rendering the profile of a hand with extended fingers producing more splines. The question now is: How is it possible to link the hand with what came previously, that is, the splines out of the satellite? Easy. Make the satellite splines unite into one thick horizontal line which, in turn, becomes an arm. The camera tracks the arm and reveals that there is a hand at the end. I record my own hand with my webcam, vectorize it and add it to the scene. There is a great deal of per-frame fine tuning in order to fit my wrist (which is not perfectly still) over the horizontal line. To mask the misalignment, I add some sprites of birds on top. Technically and artistically, this addition makes perfect sense!

23-June

Saturday is a day when I have to do lots of housework and other activities inside and outside of the house, so I think I'll do as little programming as possible. Those splines look rather bland without the addition of something else: a touch of 'softening' filter to reduce the 'jaggies' and give the appearance of volume. Technically this is pretty trivial: I could do a full screen filter, but instead I choose to selectively filter based on splines that run parallel to the original ones. It takes less than half an hour to do and it already looks a lot more impressive.

24-June

Decision time. What do we do after the hand has produced more splines? Where do these go? While looking for ideas I stumble upon a scan that I made months ago of my palm with extended fingers. I could have each finger producing some more splines. This way the very beautiful Traction-esque effect can keep rolling, morphing from one representation of a 'hand' to another. I add some test splines in haste. I also add some billboarded blobs for lighting effects and fix the cameras. The latter takes most of the time, as usual.

25-June

The evening comes and my plan for today is to do some optimization rather than add more content. Optimization, in this context, is the removal of effects/meshes/particles that the camera has passed already. When I construct a scene I usually add everything together. Eventually the framerate drops to a point where I cannot continue work. I then try to find the elements in the scene that become invisible as time progresses and viewpoints change and remove them. For example:

if (time < 10) ShowBlobs();

In this case I manually find that after time > 10 the Blobs are not visible and, hence, can be removed. This approach is quite crude but it works. What works even better is this:

ShowBlobs(1 - max(0, min(1, (time-10)*5)));

ShowBlobs takes a value between 0 and 1 that determines what fraction of the blobs to render. When time gets over 10, the parameter drops from 1 to 0 within 0.2 seconds. This makes for a much smoother transition between the two states (0 and 1). As a result, the framerate goes up and down smoothly rather than in big steps. There is quite a lot of code dedicated to keeping the framerate as constant as possible. I believe it is very important, and a sign of a good demo, to respect the framerate.

26-June

Early day today (5:00am) and I'm off to add some content to the scene. First of all I need a dangling lamp to light the first 3 seconds of the scene. This means I need to model a lamp and code some refraction shaders for the glass. All this work for only 3 seconds! By the time I have to leave for work at 9am, those 3 seconds feel totally worth the extra effort. I also add a 'flickering' effect on the lamp, which is probably an old cliche but looks great. Later, I spend my afternoon writing a small set of functions to retrieve the supported video resolutions on a given system. This demo will be a first for ASD in that it can run in really high resolutions. I do regret now that Iconoclast, Planet Risk and all of our other demos are limited to 800x600. My only excuse is that it looked good and fast on my 17" CRT monitor, so I thought that it would look acceptable everywhere else. How wrong I was!
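The resolution query itself is simple Win32 plumbing; roughly like this (a sketch, assuming a Windows-only target):

#include <windows.h>
#include <set>
#include <utility>
#include <vector>

// Ask Windows which display modes are available so the demo can offer real
// high resolutions instead of a hardcoded 800x600.
std::vector<std::pair<int, int>> SupportedResolutions() {
    std::set<std::pair<int, int>> unique;
    DEVMODE mode = {};
    mode.dmSize = sizeof(mode);
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &mode); ++i)
        if (mode.dmBitsPerPel >= 32)
            unique.insert({ (int)mode.dmPelsWidth, (int)mode.dmPelsHeight });
    return std::vector<std::pair<int, int>>(unique.begin(), unique.end());
}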

27-June

What is left now is an interlude section before the major zoom-out effect that will be designed with the help of Archmage. I am still not sure how to approach this. Originally my plan was to put a vector animation in layers, with the camera moving past the layers at an angle so that it looks pseudo-3D. I have to try it and see what that would look like by sitting in front of my web camera and moving my head around while, at the same time, the image is vectorized and rendered. The results are inconclusive. I can safely say that I will come back to this part of the demo once the graphics from Archmage have been finalized. Only then can I see the broad picture. Maybe I'll just skip that interlude section altogether.

28-June

An idea! What if the vector graphics in that part are not scanned but are part of the original images prepared by Archmage? If he has in his scene, for example, a horse and a clown, then he can also make a stylized 2D version in pure black and white, maybe even using some dithering to depict grayscale shading. So, then, the interlude would prepare the viewer for what is about to acquire color and better-defined shape. The demo will be giving away the secret of the zoom-out little by little! Besides thinking about this section, I compile a list of all the pending work that I need to do before my deadline, the 9th of July. Lots to do and not much time left. This weekend I have to finish another ASD demo for the Intel competition and next week I'm off to Cologne for work.

29-June

I got massive amounts of work done today. I wake up, as usual, around 6:00am and, without drinking any coffee or tea, I start work. I slowly put all the demo parts together. It is a time-consuming task because each time I compile and run I then have to wait for the demo to load from scratch. I write a small plugin for my loading libraries to speed up the loading of jpegs: instead of loading from the disk, my loaders simply create a small texture with some random noise. I can still see what is going on in the scene (the geometry is still there) and I can work a lot faster. The loading time drops from 15-20 seconds to just a couple of seconds. By the end of the day I manage to link 3 parts together. I worked sequentially, meaning that these are the first three parts of the demo. Linking the first and second part was quite difficult as there is a period of time when both parts are playing at the same time. This is quite dangerous as it can drop the framerate or produce stalls as the texture cache is being updated. I write a small function to do cache warming on the fly and everything now flows without stops or framerate spikes. I have seen the first 2 minutes of our demo and that makes me happy.
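The loader stand-in is only a few lines; something like this (an illustrative sketch of the idea, not the actual plugin):

#include <GL/gl.h>
#include <cstdlib>
#include <vector>

// Development-time stand-in for the JPEG loader: instead of decoding the
// real image from disk, upload a tiny noise texture under the same texture
// id, so the geometry is still visible but loading takes a moment instead
// of 15-20 seconds.
void LoadPlaceholderTexture(GLuint textureId, int size = 64) {
    std::vector<unsigned char> noise(size * size * 3);
    for (unsigned char& c : noise) c = (unsigned char)(std::rand() & 0xff);
    glBindTexture(GL_TEXTURE_2D, textureId);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, size, size, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, noise.data());
}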

30-June

Very early in the morning I finish 'Beyond the walls of Eryx', submit it to scene.org and pouet, and continue with linking parts together. There is another part where I have to consider running two scenes at the same time. I waste a lot of effort trying to make the second scene appear through a stencil window, only to find out that I can just alpha blend the rendered-to-texture outputs of the two scenes together; the end result is much better looking, and probably as fast as what I would get through stencil trickery. I also compile an executable version for Amusic with 80 seconds of music fully synchronized with the visuals. Synchronization is relatively easy but can get a bit tedious with nearly 8 minutes of content; Iconoclast tired me considerably, as there were tons of synchronization points to be taken into account.
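The alpha-blend approach is as simple as drawing the two render-to-texture results as fullscreen quads, the second one faded in on top; roughly like this (a fixed-function sketch, assuming an identity projection and the default modulate texture environment):

#include <GL/gl.h>

// Draws a textured quad covering the screen (identity projection assumed).
static void DrawFullscreenQuad() {
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();
}

// Cross-fade two scenes that were each rendered to a texture.
void CrossFadeScenes(GLuint sceneA, GLuint sceneB, float fade /* 0..1 */) {
    glDisable(GL_DEPTH_TEST);
    glEnable(GL_TEXTURE_2D);

    glBindTexture(GL_TEXTURE_2D, sceneA);        // first scene, fully opaque
    glDisable(GL_BLEND);
    glColor4f(1, 1, 1, 1);
    DrawFullscreenQuad();

    glBindTexture(GL_TEXTURE_2D, sceneB);        // second scene faded in on top
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor4f(1, 1, 1, fade);
    DrawFullscreenQuad();
}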

1-July

My goal for today is to have all but two of the scenes linked together. Bugs appear all the time. Sometimes, one scene doesn't seem to particularly like the one that it is linked to. This is usually caused by changing some parameters of the OpenGL state machine in one scene, a change that then propagates into the next one with unpredicted consequences. But once I find the bugs it is very simple to fix them. I need to consult my old videos of all parts recorded separately because I have forgotten exactly how each scene is supposed to look. "Did this cloud have this color?" I find myself wondering rather often.

2-July

All of the demo is there with the exception of the last two scenes (nuclear explosion and underwater credits), which will be linked once the graphics have been finalized. I am still working on fixing transitions -- not their content but how they affect the framerate. What I usually do is remove particles/objects/calculations towards the end of a scene and do the reverse (that is, introduce elements) in the scene that follows. The theory is that, as they overlap, the CPU and GPU demands should ideally stay the same. This is hard to do while keeping the content fixed, but that is the only option I have at the moment. I spend about 3 hours of my morning fixing a 1.5-second gap. Well worth it if you ask me.

3-7 July

No time for demos as I'm in Germany. Old TRSI demo people, Cologne cathedral, currywursts, I've seen and done them all.



8 July

Back in Britain, and I realize that the demo is now eating 500MB of RAM and VRAM, which is completely unacceptable. Clearly I need to employ some sort of compression for the textures, or even reduce the size of the textures before loading. The latter is easy: indeed, most of the lightmaps used have an enormous resolution. Cutting that in half reduces memory use by 50MB and the footprint of the zipped demo by a further 3MB. I write a wrapper for OpenGL texture compression. I can see that the quality of a couple of textures has dropped, but on the other hand the demo now needs less than 200MB to run. The framerate is also higher (an increase of about 15%) and the stalls between scenes have been completely eradicated. Because of the framerate increase the overall feeling has improved quite a bit. With some great music from Amusic and Leviathan we can make this demo work! What started as a collection of effects is now becoming a demo.
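A compression wrapper of this kind boils down to asking the driver for a compressed internal format at upload time (a sketch using the generic ARB_texture_compression path; the actual wrapper may well use something more specific):

#include <GL/gl.h>

#ifndef GL_COMPRESSED_RGBA
#define GL_COMPRESSED_RGBA   0x84EE   // from ARB_texture_compression
#endif
#ifndef GL_TEXTURE_COMPRESSED
#define GL_TEXTURE_COMPRESSED 0x86A1
#endif

// Same call site as a normal upload, but the driver is asked to store the
// image compressed, roughly 4x less video memory for RGBA data.
void UploadCompressed(const unsigned char* rgba, int width, int height) {
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    GLint compressed = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED, &compressed);
    // If 'compressed' is 0 the driver fell back to an uncompressed format.
}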



9-July

With everything in place I can now sit back and enjoy the rest of the ride. I am waiting for the music and graphics, and I know that the good people in Greece, the UK and Norway are working their asses off to finish what they have started. I wonder what the demo will look like in 2 weeks' time. Will we run out of time or will everything be OK, as usual? Today is my birthday and I have almost reached my goal of having all the content ready and waiting by today. Furthermore, I managed to link all of the parts together. It is now time for some very careful fine tuning, synchronization and final touches. We are three and a half weeks away from the event, and I want to believe that everything is under control.

- Navis/ASD

