Archive for the ‘Uncategorized’ Category

Perforce

Wednesday, January 3rd, 2007

During the primordial days of Stolen Notebook we decided that Subversion would be our source code management system (instead of CVS). Stolen Notebook was all but founded on Subversion. Atomic commits, the freedom to move files and directories, and global version numbers! What more could we want?

It’s hard to know what you want until you’re given it.

During the programming SIG at IGDA Madison in November, Perforce was the hot topic in source control. However, at $800 per client we had some excuse for not having tried it.

Someone at Human Head discussed how they had been testing various source control systems, including Perforce and SVN. He thought Perforce and SVN were both excellent choices, but Perforce won out because of its speed. He mentioned that Perforce is free for two clients, so there was no reason not to try it.

Over the holidays Tony set up our server with Perforce and modified Trac to browse Perforce. I started using Perforce after I got back from the holidays. It’s great. Combined with the Perforce SCC plugin for Visual Studio I can do all my coding, updating, and checking in within the IDE. The Perforce client also has loads of useful information at hand, including what other users have checked out and are working on. It’s a very powerful tool for project management.

Perforce is fast
Subversion used to take 5-10 minutes to check out the Catharsis source code (about 150 MB). With Perforce it takes about 20 seconds. At one point we had source assets in SVN which took nearly an hour to check in and out. Perforce allows us to keep our assets versioned along with source code without long checkout and update times.

Perforce is best on a LAN
The way Perforce operates means that clients have to be in constant contact with the server. When a client opens a file for editing, the Perforce server is notified. This is cool because it allows clients to see what files other users are working on. With SVN most communication with the server happens during checkout, update, or checkin. Subversion is great for use over the Internet, but Perforce dominates the office. Make sure your internal network is secure, though, because Perforce itself isn’t very secure.

Perforce makes developers happy
So far I have been extremely happy with Perforce on the user end. I’ve heard from Tony that, server side, there are a few things that make you wonder what the Perforce guys were thinking (plain text passwords come to mind). Thankfully, they are not impossible to work around, and the boost in productivity makes it well worth it.

Python: Now with less stack

Tuesday, December 19th, 2006

This past week I rediscovered an article about Stackless Python and decided to try it out. Stackless Python implements continuations, which are hugely powerful for game scripting. I wrote up some simple game objects with Stackless and I am thoroughly impressed.

Others have covered Stackless Python better than I could, so I’ll point you to them. There is a great overview of Stackless Python for games on Thoughts Serializer. From there it’s a quick jump to Introduction to Concurrent Programming using Stackless Python. And, of course, there is always the Stackless website.


Auto API
I have always wanted a scripting system that automatically exposes its own API. Years ago I was using Radiant to build geometry for my game engine. Entity properties were defined in an obscure file somewhere, in an arbitrary format. The editor used this file to create widgets for the properties of an entity. With the entities hidden away in compiled C code it would be difficult to generate this information automatically. It could be done with access to the source code and a custom-built parser, but with Python it’s even easier.

Python can interrogate classes for documentation, method names, and even method arguments. But with a little extra work it is possible to specify information about the arguments beyond what can be gleaned from Python’s function definitions.

Information such as the range, type, or simple documentation of an argument cannot be specified in a standard Python function definition. I created a system that uses doxygen-style commands inside method doc strings to add this information. For instance:

def Action_Open(self, rate=1.0):
    """
    @doc   Opens the door over time.
    @param rate [float]
           Rate of speed to open the door.
    """

The documentation system parses this file, grabs the Action functions, and parses doc strings to associate type and documentation with each method parameter. The information can then be used in a tool to create input elements specific to the entity. My goal is to get this information into Maya so that script events can be attached to objects and message handlers written on scene nodes.
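As a minimal sketch of how such a parser might be put together, using Python’s inspect module plus a little doc-string scanning (describe_actions and the regular expressions are my own names, not the real system’s):

import inspect
import re

DOC_RE   = re.compile(r'@doc\s+(.*)')
PARAM_RE = re.compile(r'@param\s+(\w+)\s+\[(\w+)\]')

def describe_actions(cls):
    """Collect @doc/@param information for every Action_* method on a class."""
    api = {}
    for name, method in inspect.getmembers(cls, inspect.isfunction):
        if not name.startswith('Action_'):
            continue
        info = {'doc': '', 'params': {}}
        current = None
        for line in (inspect.getdoc(method) or '').splitlines():
            line = line.strip()
            m = DOC_RE.match(line)
            if m:
                info['doc'] = m.group(1).strip()
                current = None
                continue
            m = PARAM_RE.match(line)
            if m:
                current = m.group(1)
                info['params'][current] = {'type': m.group(2), 'doc': ''}
                continue
            if current and line and not line.startswith('@'):
                # Continuation line: free-form documentation for the last @param.
                info['params'][current]['doc'] = (info['params'][current]['doc'] + ' ' + line).strip()
        # Default values come straight from the function signature.
        for pname, param in inspect.signature(method).parameters.items():
            if pname != 'self' and param.default is not inspect.Parameter.empty:
                info['params'].setdefault(pname, {'type': None, 'doc': ''})['default'] = param.default
        api[name] = info
    return api

For the Action_Open example above this yields the parameter’s name, type, default value, and description, which is everything a tool needs to build an input widget.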

Congratulations

Tuesday, December 19th, 2006

Congratulations to Mick Beaver, who was hired as Raven Software’s new Build Process Engineer. It’s a great opportunity, and I’m sure he’ll gain a lot of great experience from his time at Raven. Amongst his myriad new job duties is to “manage data management systems,” which made me titter.

Congratulations, Mick!

Linux, cmake

Thursday, November 23rd, 2006

One thing that has been lax recently is the Linux build of Catharsis. I do most of my development on Windows with Visual Studio (who doesn’t?), so it’s easy to understand why this has slipped. This has been rectified.

In the beginning there were Makefiles. Writing them sucked. The time and effort required to write Makefiles to build large projects was ridiculous. Then, there was scons, and times were better. However, scons got increasingly sluggish the larger the project became. In addition, scons build files were Python scripts. “The power of Python in a build file,” they said. Soon I found myself writing long, esoteric build scripts that used fancy Python code simply because I could. I now know that this power is unnecessary and ultimately confusing. What was needed was a small set of well structured commands specific to building things. At last, there was cmake.

Tony had heard a lot of praise for cmake, and during the holiday I was able to verify that cmake is in fact great. Cmake is both simpler and faster than scons. It also does a lot of things you’d expect any modern build system to do by default: dependency checking, building in a separate directory, etc. Perhaps the coolest feature is that, instead of being a build system in and of itself, cmake produces platform-specific build files. I haven’t tried the build files produced for Visual C++ yet, but the prospect of having builds for all platforms specified by one set of files is exciting.
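For the curious, a hypothetical CMakeLists.txt shows roughly what that small set of well structured commands looks like; the project and file names here are invented, not Catharsis’ actual build files:

# Hypothetical CMakeLists.txt -- the names are illustrative only.
cmake_minimum_required(VERSION 2.4)
project(ExampleEngine)

# Headers shared by every target.
include_directories(${CMAKE_SOURCE_DIR}/include)

# One line per target; cmake tracks the dependencies itself.
add_library(examplecore src/core.cpp src/material.cpp)
add_executable(examplegame src/main.cpp)
target_link_libraries(examplegame examplecore)

Run cmake from a separate build directory and it emits Makefiles, Visual C++ projects, or whatever generator you ask for, which is exactly the one-set-of-files-for-all-platforms idea mentioned above.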

Tools Update

Friday, November 10th, 2006

At our last Stolen Notebook team meeting I described the workings of snmaterial and its separate UI system. Denrei immediately asked if I had built snmaterial into Maya. An excellent question. I said no, but I wondered why I hadn’t.

Integrating a scriptable command line tool into your content creation tool is simple. If your content creation tool has a scripting language (most do) and can execute system commands, you’re all set.

At Stolen Notebook we use Maya for content creation. Maya’s scripting language, MEL, can be used to do a variety of things, not least of which is UI scripting. After getting acclimated to MEL’s syntax and UI system, I created a MEL UI for snmaterial directly in Maya. It looks similar to the Python interface.

snmaterial in Maya

Creating the UI script in Maya allows direct access to a few useful Maya commands that can be run on the current scene. Most importantly, I can export the current scene to Collada, a feat otherwise possible only by running Maya in batch mode or compiling the Maya API into the tool itself. The Material Exporter is accessible from the ‘Catharsis’ menu, which is created when our custom Maya plugin is loaded. On export the script does the following:

  • Exports the scene to a temporary Collada file
  • Generates the snmaterial system command
  • Runs the snmaterial command on the Collada file

Chalk up another victory for command line tools.

The CLI

Sunday, November 5th, 2006

I thought it might be interesting to discuss some design considerations of Stolen Notebook Tools.


Goals

One of the most important things to keep in mind when making tools is tool automation. Getting all assets from one end of the asset pipeline to the other with a single command is not only cool, it’s a massive time savings. Tools need to be easy to run from scripts. With scriptable tools it becomes very easy to tie tools together into a pipeline, or run a tool on an entire directory of files, etc.

While tools need to be accessible to scripting, they also need to be easy for people to use. If a user is going to use a tool directly, it must have a UI. Users are more comfortable using a tool with a UI, and user error can be reduced by using well-known interfaces such as file browsers. Further, a tool with a well-designed UI makes the tool’s functionality much more understandable.

So, how does one create a tool that is accessible for scripting yet has a fantastic UI? By breaking the tool into two parts:

  1. A command line program containing the tool’s functionality
  2. A GUI program that generates calls to the tool executable

This structure achieves all of my goals. A tool can be used in a script by executing the command with arguments. The tool also has a nice interface for users.

An example is in order.


SN Material

SN Material exports materials contained in a Collada file to Catharsis materials. It also copies all data files (textures, shaders, etc.) used by a material into the proper location in the Catharsis directory.

The command line interface for SN Material has the following parameters:

-r The Catharsis root directory
-m The module (subdirectory of the Catharsis root) to export into
-f The Collada file to export materials from
-n The name of the material to export. If this is not present, all materials are exported.

This tool might be run with:

snmaterial -f example.dae -r "f:/catharsis" -m "test"

This exports all materials in example.dae and copies all material data files to the module ‘test’ in f:/catharsis.

A script can easily assemble an snmaterial command. It would be a simple task to create a script that would export all materials from all collada files in a directory. More complicated functionality is not difficult.
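For instance, a directory-wide export might look something like this (the root and module values are just examples, and snmaterial is assumed to be on the PATH):

import glob
import subprocess

CATHARSIS_ROOT = "f:/catharsis"   # example values, not a real setup
MODULE = "test"

# Run snmaterial over every Collada file in the current directory.
for dae in sorted(glob.glob("*.dae")):
    print("exporting", dae)
    subprocess.check_call(["snmaterial", "-f", dae,
                           "-r", CATHARSIS_ROOT, "-m", MODULE])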

In fact, the UI is just a script. This script creates a nice interface for the tool, assembling arguments to the command line tool and executing it.

I use Python for scripting and wxPython for creating GUIs, although any scripting language and widget library could be used, since the UI is bound to the tool only through text commands. The previous command looks like this in the SN Material GUI.

SN Material

As can be seen, this UI is much easier to understand than the command line tool. The user can just run the UI and will understand the tool much more quickly, without having to look up and memorize command line parameters and flags.
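For reference, a stripped-down sketch of that kind of wxPython front end is below; the layout and widget names are my simplification, not the actual SN Material GUI.

import subprocess
import wx

class SnMaterialFrame(wx.Frame):
    """Minimal GUI that assembles and runs an snmaterial command."""
    def __init__(self):
        wx.Frame.__init__(self, None, title="SN Material (sketch)")
        panel = wx.Panel(self)
        self.collada = wx.TextCtrl(panel, value="example.dae")
        self.root    = wx.TextCtrl(panel, value="f:/catharsis")
        self.module  = wx.TextCtrl(panel, value="test")
        export = wx.Button(panel, label="Export")
        export.Bind(wx.EVT_BUTTON, self.on_export)

        sizer = wx.BoxSizer(wx.VERTICAL)
        for label, ctrl in (("Collada file", self.collada),
                            ("Catharsis root", self.root),
                            ("Module", self.module)):
            sizer.Add(wx.StaticText(panel, label=label), 0, wx.ALL, 4)
            sizer.Add(ctrl, 0, wx.EXPAND | wx.ALL, 4)
        sizer.Add(export, 0, wx.ALL, 4)
        panel.SetSizerAndFit(sizer)

    def on_export(self, event):
        # The GUI's only job: turn widget values into a command line.
        subprocess.call(["snmaterial",
                         "-f", self.collada.GetValue(),
                         "-r", self.root.GetValue(),
                         "-m", self.module.GetValue()])

if __name__ == "__main__":
    app = wx.App(False)
    SnMaterialFrame().Show()
    app.MainLoop()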

An unexpected benefit of this design is that I can modify and recompile the underlying tool while the UI is still running. While working on SN Material I just leave the UI open, recompile snmaterial.exe and test the exporter with a click.

Xbox…360?!

Tuesday, October 31st, 2006

A few weeks ago Tony and I discovered the co-op mode of Splinter Cell: Chaos Theory. It contained four delicious missions of shadow-cloaked tension. Then it was over, all too soon.

I quickly entertained the thought of getting an Xbox 360 and the new Splinter Cell simply to have more co-op missions. My hopes were dashed as I learned that the Xbox 360 version has a paltry three co-op missions, which are actually dressed-up versus missions.

Yesterday the news hit me that Splinter Cell: Double Agent for Xbox is completely different from the Xbox 360 version. Hoping against hope, I discovered that the Xbox version contained what I needed, in abundance: 12 co-op missions.

The new co-op missions tie directly into the single player story, giving objectives much more weight and intensity. The game even costs a mere $40. Compare that to the $60 Xbox 360 version which is missing everything I want.

I have postponed my purchase of an Xbox 360 once again. It won’t be long now, though; I pre-ordered Gears of War. Mind you, it’s the Limited Edition.

I can’t wait to play it in 480i.

Progress: Unabated

Sunday, October 29th, 2006

I’ve been extremely pleased with Collada and the ColladaMaya plugin. It has allowed me to get at all the data I want with much less overhead than using the Maya API to get the same data. I spent a few weeks switching the asset pipeline over to Collada, hoping the time spent would eventually be made up. This last week I was able to add awesome new functionality to the asset pipeline; effectively, I have already made back my minor investment in time.


Materials
The focus of the past week has been supporting material creation directly in Maya. I wanted to see what ColladaFX was all about. Long story short, ColladaFX is an invaluable tool for working with hardware shaders in Maya.

Using ColladaFX is very simple. Briefly it goes something like this:

  1. Create a ColladaFX material in Maya
  2. Select the Cg vertex and fragment shader files
  3. Assign and manipulate shader parameters

I have been using Cg for hardware shaders in the Engine, so it was especially easy to migrate the shaders into a ColladaFX material. I created a simple tool that finds all the materials in a Collada file and writes Catharsis Engine format materials. It has drastically simplified creating and testing materials. If it works in Maya as a ColladaFX material, it is almost guaranteed to work identically in Catharsis. It sure beats the hell out of editing an XML material definition or spending months creating an interactive shader editor.
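The material-gathering half of such a tool is mostly a walk over the Collada XML. A rough sketch of that step (the real snmaterial does considerably more than this):

import xml.etree.ElementTree as ET

# COLLADA 1.4 documents live in this XML namespace.
NS = {"c": "http://www.collada.org/2005/11/COLLADASchema"}

def list_materials(dae_path):
    """Return (material id, effect id) pairs defined in a Collada file."""
    root = ET.parse(dae_path).getroot()
    pairs = []
    for material in root.findall("c:library_materials/c:material", NS):
        effect_url = material.find("c:instance_effect", NS).get("url")  # e.g. "#distortFX"
        pairs.append((material.get("id"), effect_url.lstrip("#")))
    return pairs

for mat_id, effect_id in list_materials("example.dae"):
    print(mat_id, "->", effect_id)   # a real exporter would now write a Catharsis material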

Here is an example of a simple distortion shader in Maya using ColladaFX. Between the two images, the range parameter has been increased, so the distortion increases.

ColladaFX Distortion Material

A really cool feature of ColladaFX materials is that arbitrary vertex data can be bound to shader parameters. In the above example map1, Maya’s default texture coordinate set, is bound to TEXCOORD0 in the shader. In the shader I then specify that TEXCOORD0 binds to the variable TexCoord0, which is used in the shader program. I could create a new vertex data set in Maya called ‘distort_intensity’ that modifies how distorted the texture is over the surface. I could then paint the values of ‘distort_intensity’ onto the mesh in Maya, associate that per-vertex data set with TEXCOORD3, and add the corresponding functionality in the Cg shaders.

This is excellent, but it would have been really cool for the per-vertex data names to be parsed out of the shader file, as is done with Material Parameters. The shaders would then be self-documenting, and it would be much more apparent what the per-vertex data is used for in the shader.
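Nothing stops a tool from doing that parsing itself, though. A rough sketch of what pulling vertex-stream semantics out of a Cg source file might look like (the regular expression and file name are my own guess at a convention, not anything ColladaFX provides):

import re

# Matches Cg entry-point parameters such as "float2 TexCoord0 : TEXCOORD0",
# capturing the type, the variable name, and the binding semantic.
SEMANTIC_RE = re.compile(r'\b(float[234]?)\s+(\w+)\s*:\s*(TEXCOORD\d|POSITION|NORMAL|COLOR\d?)\b')

def vertex_semantics(cg_source):
    """Return {semantic: variable name} for every bound vertex input."""
    return {sem: name for _type, name, sem in SEMANTIC_RE.findall(cg_source)}

with open("distort_vertex.cg") as f:   # hypothetical shader file
    print(vertex_semantics(f.read()))
    # e.g. {'POSITION': 'Position', 'TEXCOORD0': 'TexCoord0', 'TEXCOORD3': 'DistortIntensity'}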


Collision
A couple of weeks ago I began to look at adding collision volumes to the Collada asset pipeline. I thought that the physics functionality was part of the standard ColladaMaya plugin. I quickly learned that Feeling Software had created a completely separate (and MUCH more powerful) Maya plugin for physics called Nima. The Nima plugin offers ways to create and simulate many standard physics objects. It currently supports rigid bodies, cloth, and rag dolls. The physics objects can even be interactively manipulated during simulation (i.e., you can grab a skeleton and it’ll flop around as you move it in Maya).

All of it exports to Collada, taking care of my collision-volume export needs. It will be interesting to see what other physics objects I can support in the future.


Boo!
Last night, Halloween festivities were held on State Street. This year tickets were required for State Street and the entire event was shut down promptly at 1:30am.

Though I was wholly uninterested in the celebrations, someone threw a costume at me. In moments I looked as seen below. I did some programming so attired, delighting Gavin in the process. The costume lasted about two hours, until I nearly passed out because the hat breathed like a plastic freezer bag.

me.jpg

In case you’re wondering, I am supposed to be The Man with No Name. Turns out I inadvertently dressed up as a jackass.

noname.jpg
The Man with No Name

Collada Exporter

Monday, October 16th, 2006

Over the weekend I was able to get the Collada exporter to the same state that the old Maya exporter was in. It can export scenes and geometry for use in the engine.

Getting the Collada exporter up to spec took a little more work than the Maya exporter did. With the Maya exporter I was able to do very high-level operations on geometry; for instance, I used the obj exporter to compute the world space transforms of a set of meshes in one step. There is no such functionality in Collada, but it was simple enough to implement.


Efficient Geometries

While writing the Collada exporter I took the opportunity to modify the mesh architecture in the engine. Previously the exporter collapsed geometry by material. All meshes with the same material would be collapsed into one mesh object. Each mesh stored its own vertex data and index buffer. This is a straightforward approach but is a bit naive. It results in many small vertex buffers and a lot of vertex buffer switching.

An observation I made is that vertex data and material are not explicitly linked. In other words, there is no reason to split up the vertex data by material. On the other hand, the index buffer must be split up by material so that the renderer can stop and bind a new material before drawing the next set of polygons.

The modified architecture splits vertex and index buffers into Models and Meshes. A Model contains all vertex data and a list of Meshes. A Mesh contains a material and an index buffer that indexes into its parent Model’s vertex data. Using this architecture, all vertex data, regardless of material, is collapsed into one vertex buffer. This system prefers larger vertex buffers and requires much less vertex buffer switching.
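A sketch of the layout in Python (the engine itself is C++ and the renderer calls are hypothetical; this just illustrates the split):

class Mesh:
    """One draw batch: a material plus indices into the parent Model's vertex data."""
    def __init__(self, material, indices):
        self.material = material
        self.indices = indices              # index buffer only; no vertex data here

class Model:
    """Owns the single shared vertex buffer and the per-material Meshes."""
    def __init__(self, vertices):
        self.vertices = vertices            # all vertex data, regardless of material
        self.meshes = []

    def draw(self, renderer):
        renderer.bind_vertices(self.vertices)       # bound once per Model...
        for mesh in self.meshes:
            renderer.bind_material(mesh.material)
            renderer.draw_indexed(mesh.indices)     # ...one indexed draw per material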


Back to Collada
In the end, Collada exporting is much faster. The command line Maya exporter had to dynamically link with Maya’s HUGE DLLs. This caused the exporter to stall for 3-5 seconds at startup, and the lag quickly added up when exporting multiple scenes. With the Collada exporter the export is instantaneous.

Things I’m looking forward to:

Collada Refinery
There is a collection of utilities for “conditioning” the data in Collada files. Examples of conditioning include tri-stripping to increase rendering performance. These utilities are collected under the name Collada Refinery.

Collada Physics
Collada handles physics attributes and many collision volumes. Building a physics system into the engine that uses this data would be very interesting.

ColladaFX
Shader parameters and shader code are integrated directly into the Collada files. ColladaMaya even supports rendering with these shaders in Maya views.

Software Craftsman

Friday, October 6th, 2006

I found a great blog post about what motivates software developers and differentiates them from “paycheck programmers”. In summary:

Building software is a very creative and constructive process but the intangible nature of software makes the parallels to traditional engineering difficult… Yet we do still share many of the same feelings and priorities as conventional craftsmen.

  • A tendency towards perfectionism
  • Pride for the end product
  • Strong sense of ownership
  • Criticism of other work
  • Responsibility for flaws
  • Strong affection for tools of the trade (editors, IDEs, utilities, home grown tools)
  • Strong need to use new tools and processes

I couldn’t agree more! Especially with the need to use new tools.

I’d be much less interested in game engine development if the tools and hardware never changed. With this new generation of game consoles, people tend to do one of two things: complain about them, or quietly learn to use them. If it distresses you that everything is always changing in the game industry, why do you want to make games?

I have no complaints, and a lot to look forward to.