Development on Doom Classic iOS

By John Carmack, Technical Director, Id Software (2009)
http://www.idsoftware.com/doom-classic/doomdevelopment.htm

Way back in March when I released the source for Wolfenstein 3D Classic, I said that Doom Classic would be coming “real soon”, and on April 27, I gave a progress report:
http://www.idsoftware.com/iphone-doom-classic-progress/
I spent a while getting the multiplayer functionality up, and I figured I only had to spend a couple days more to polish things up for release.

However, we were finishing up the big iPhone Doom Resurrection project with Escalation Studios, and we didn’t want to have two Doom games released right on top of each other, so I put Doom Classic aside for a while.  After Doom Resurrection had its time in the sun, I was prepared to put the rest of the work into Doom Classic, but we ran into another schedule conflict.  As I related in my Wolf Classic notes (http://www.idsoftware.com/wolfenstein-3d-classic-platinum/wolfdevelopment.htm), Wolfenstein RPG for the iPhone was actually done before Wolfenstein Classic, but EA had decided to sit on it until the release of the big PC / console Wolfenstein game in August.

I really thought I was going to go back and finish things up in September, but I got crushingly busy on other fronts.  In an odd little bit of serendipity, after re-immersing myself in the original Doom for the iPhone, I am now working downstairs at Id with the Doom 4 team.  I’m calling my time a 50/50 split between Rage and Doom 4, but the stress doesn’t divide.  September was also the month that Armadillo Aerospace flew the Level 2 Lunar Lander Challenge.
Finally, in October I SWORE I would finish it, and we aimed for a Halloween release.  We got it submitted in plenty of time, but we ran into a couple of approval hiccups that pushed it to the very last minute.  The first was someone incorrectly thinking that the “Demos” button, which plays back recorded demos from the game, was somehow providing demo content for other commercial products, which is prohibited.  The second issue was the use of an iPhone image in the multiplayer button, which we had to make a last-minute patch for.

Release notes

Ok, the game is finally out (the GPL source code is being packaged up for release today).  Based on some review comments, there are a couple clarifications to be made:

Multiplayer requires a WiFi connection that doesn’t have UDP port 14666 blocked.  I’m quite happy with the simple and fast multiplayer setup, but it seems like many access points just dump the packets in the trash.  If the multiplayer button on the main menu doesn’t start pulsing for additional players after the first player has hit it, you won’t be able to connect.  I have also seen a network where the button would pulse, but the player would never get added to the player list, which meant that somehow the DNS packets were getting through, but the app packets weren’t.  It works fine on a normal AirPort install…  More on networking below.

I took out tilt-to-turn just to free up some interface screen space, because I didn’t know anyone that liked that mode, and my query thread on Touch Arcade didn’t turn up people that would miss it a lot.

Evidently there are a few people that do care a lot, so we will cram that back in on the next update.  The functionality is still there without a user interface, so you can enable it by four-finger-tapping to bring up the keyboard and typing “tiltturn 4000” or some number like that, and it will stay set.  Make sure you have tiltmove pulled down to 0.  I never got around to putting in a real console, but you can change a few parameters like that, as well as enter all the original Doom cheat codes like IDDQD, IDKFA, etc.
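Conceptually, handling a typed line like that is just a tiny name/value parser sitting in front of a cvar table.  A minimal sketch follows; the value ranges are my own illustrative guesses, not the shipping game’s limits:

```python
# Hypothetical cvar table: name -> (min, max).  The names "tiltturn" and
# "tiltmove" are real; the allowed ranges here are invented for the sketch.
KNOWN_CVARS = {"tiltturn": (0, 10000), "tiltmove": (0, 10000)}

def parse_console_command(line, cvars):
    """Parse a typed 'name value' line.  Store the value only if the
    name is known and the value is an integer inside the allowed range.
    Returns True on success, False otherwise."""
    parts = line.split()
    if len(parts) != 2 or parts[0] not in KNOWN_CVARS:
        return False
    lo, hi = KNOWN_CVARS[parts[0]]
    try:
        value = int(parts[1])
    except ValueError:
        return False
    if not (lo <= value <= hi):
        return False
    cvars[parts[0]] = value
    return True
```

Cheat codes like IDDQD would be matched separately against a fixed string list rather than going through a table like this.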

I think that the auto-centering control sticks in Doom Classic are a better control scheme than the fixed sticks from Wolf Classic.  The advice for Wolf was to adjust the stick positions so that your thumbs naturally fell in the center point, so I just made that automatic for Doom.  Effective control always involved sliding your thumbs on the screen, rather than discretely tapping it, and this mode forces you to do that from the beginning.
Still, even if the new mode is some fraction “better”, there are a lot of people who have logged a lot of hours in Wolfenstein Classic, and any change at all will be a negative initially.  In the options->settings menu screen, there is a button labeled “Center sticks: ON” that can be toggled off to keep the sticks fixed in place like in Wolf.

A subtle difference is that the turning sensitivity is now graded so that a given small movement will result in a specific percentage increase in speed, no matter where in the movement range it is.  With linear sensitivity, if you are 10 pixels off from the center and you move your thumb 10 pixels farther, then the speed exactly doubles.  If you are 50 pixels off from the center, the same 10 pixel move only increases your turning rate by 20%.  With ramped sensitivity, you would get a 20% (depending on the sensitivity scale) increase in speed in both cases, which tends to be better for most people.  You can disable this by toggling the “Ramp turn: ON” option off.
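The two response curves can be sketched as follows.  The constants are illustrative tuning values, not the actual numbers from the game; the point is that the ramped curve is exponential in the offset, so equal thumb movements give equal percentage changes:

```python
# Illustrative tuning constants; the shipping game's numbers are not
# reproduced here.
LINEAR_SCALE = 1.0    # linear mode: turn speed per pixel of thumb offset
RAMP_PER_10PX = 1.2   # ramped mode: +20% speed for every 10 pixels

def linear_turn_speed(offset_px):
    """Linear mode: speed is directly proportional to the thumb offset,
    so a 10 pixel move matters less the farther out you already are."""
    return LINEAR_SCALE * offset_px

def ramped_turn_speed(offset_px):
    """Ramped mode: every 10 pixel step multiplies the speed by the same
    fixed factor, giving the same percentage change anywhere in range."""
    return RAMP_PER_10PX ** (offset_px / 10.0)
```

With these definitions, moving from 10 to 20 pixels doubles the linear speed but moving from 50 to 60 only adds 20%, while the ramped speed gains the same 20% in both cases.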

In hindsight, I should have had a nice obvious button on the main options screen that said “Wolfenstein Style” and set the same options, but I have always had difficulty motivating myself to do good backwards compatibility engineering.  Even then, the movement speeds are different between the games, so it wouldn’t have felt exactly the same.

It was a lot of fun to do this project, working on it essentially alone, as a contrast to the big teams on the major internal projects.  I was still quite pleased with how the look and feel of the game holds up after so long, especially the “base style” levels.  The “hell levels” show their age a lot more, where the designers were really reaching beyond what the technology could effectively provide.

Future iPhone work

We do read all the reviews in the App Store, and we do plan on supporting Doom Classic with updates.  Everything is still an experiment for us on the iPhone, and we are learning lessons with each product.  At this point, we do not plan on making free lite versions of future products, since we didn’t notice anything worth the effort with Wolfenstein, and other developers have reported similar findings.

We have two people at Id that are going to be dedicated to iPhone work.  I doubt I will be able to personally open Xcode again for a few months, but I do plan on trying to work out a good touch interface for Quake Classic and the later 6DOF games.  I also very much want to make at least a tech demo that can run media created with a version of our idTech 5 megatexture content creation pipeline.  I’m not sure exactly what game I would like to do with it, so it might be a 500 MB free gee-whiz app…

Wolfenstein Classic Platinum was a break-in opportunity for the new internal iPhone developers.  We were originally planning on making the Spear of Destiny levels available as in-app purchased content.  Then we decided to make it a separate “Platinum Edition” application at a reasonable price.  Finally, we decided that we would just make it a free update, but something has gone wrong during this process — people who buy the app for the first time get everything working properly, but many people who upgrade the app from a previous purchase are seeing lots of things horribly broken.  We are working with Apple to try to debug and fix this, but the workaround is to uninstall the app completely, then reload it.  The exciting thing about Wolf Platinum is the support for downloadable levels, which is the beta test for future game capabilities.  Using a URL to specify downloadable content for apps is a very clean way to interface to the game through a web page or email message.

The idMobile team is finishing up the last of the BREW versions of Doom 2 RPG, and work has started on an iPhone specific version, similar to the Wolfenstein RPG release.  The real-time FPS games are never going to be enjoyable for a lot of people, and the turn based RPG games are pretty neat in many regards.  If they are well received, we will probably bring over the Orcs&Elves games as well.

I want to work on a Rage themed game to coincide with Rage’s release, but we don’t have a firm direction or team chosen for it.  I was very excited about doing a really-designed-for-the-iPhone first person shooter, but at this point I am positive that I don’t have the time available for it.

Networking techie stuff

I doubt one customer in ten will actually play a network game of Doom Classic, but it was interesting working on it.

Way back in March when I was first starting the work, I didn’t want the game to require 3.0 to run, and I generally try to work with the lowest level interfaces possible for performance critical systems, so I wasn’t looking at GameKit for multiplayer.  I was hoping that it was possible to use BSD sockets to allow both WiFi networking on 2.0 devices and WiFi or ad-hoc bluetooth on 3.0 devices.  It turns out that it is possible, but it wasn’t documented as such anywhere I could find.

I very much approve of Apple’s strategy of layering Obj-C frameworks on top of Unix style C interfaces.  Bonjour is a layer over DNS, and GameKit uses sockets internally.  The only bit of obscure magic that goes on is that the Bluetooth IP interface only comes into existence after you have asked DNS to resolve a service that was reported for it.  Given this, there is no getting around using DNS for initial setup.

With WiFi, you could still use your own broadcast packets to do player finding and stay completely within the base sockets interfaces, and this might even make some sense, considering that there appear to be some WiFi access points that will report a DNS service’s existence that your app can’t actually talk to.
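A minimal sketch of that kind of sockets-only player finding follows, using Python’s wrapper over the same BSD calls.  The packet contents are invented, and the sketch uses loopback addressing so it can run anywhere; a real LAN version would set SO_BROADCAST on the probing socket and send the probe to the subnet broadcast address instead of a known unicast address:

```python
import socket

# Made-up wire format for illustration; the shipping game's discovery
# packets are not documented here.
DISCOVER_PROBE = b"DOOM_DISCOVER"
HOST_REPLY = b"DOOM_HOST"

def make_host_socket(port=0):
    """The hosting player listens for discovery probes on UDP.  Port 0
    lets the sketch run anywhere; the notes above say the real game
    uses UDP port 14666."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind(("127.0.0.1", port))
    return s

def make_probe_socket(timeout=2.0):
    """A joining player's socket, with a timeout so an unreachable host
    (e.g. an access point eating the packets) fails fast."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind(("127.0.0.1", 0))
    s.settimeout(timeout)
    return s

def answer_one_probe(host_sock):
    """Host side: read one datagram and, if it is a discovery probe,
    reply directly to the sender's address."""
    data, addr = host_sock.recvfrom(64)
    if data == DISCOVER_PROBE:
        host_sock.sendto(HOST_REPLY, addr)
```

Because the reply is addressed to whatever source address the probe arrived from, this works the same whether the probe came in by unicast or broadcast.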

For every platform I have done networking on previously, you could pretty much just assume that you had the loopback interface and an Ethernet interface, and you could just use INADDR_ANY for pretty much everything.  Multiple interfaces used to just be an issue for big servers, but the iPhone can have a lot of active interfaces — loopback, WiFi Ethernet, Bluetooth Ethernet, and several point to point interfaces for the cellular data networks.
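Just enumerating which interfaces exist is a useful first step before deciding what to bind.  A sketch using the Python stdlib, which only exposes interface names; the C-level getifaddrs() call you would use on the iPhone also reports each interface’s addresses:

```python
import socket

def list_interfaces():
    """Enumerate the machine's network interfaces by name.  On a phone
    this list can include loopback, WiFi, Bluetooth, and cellular
    point-to-point interfaces; on a typical PC it is just loopback plus
    Ethernet, which is why INADDR_ANY used to be good enough."""
    return [name for _, name in socket.if_nameindex()]
```

Binding a socket per interface (rather than one INADDR_ANY socket) is what lets you control which link a game packet actually goes out on.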

At first, I was excited about the possibility of multiplayer over 3G.  I had been told by someone at Intel that they were seeing ping times of 180 ms on 3G devices, which could certainly be made to work for gaming.

Unfortunately, my tests, here in Dallas at least, show about twice that, which isn’t worth fighting.  I’m a bit curious whether they were mistaking one-way times, or if the infrastructure in California is really that much better.  In any case, that made my implementation choice clear — local link networking only.

A historical curiosity: the very first release of the original Doom game on the PC used broadcast IPX packets for LAN networking.  This seemed logical, because broadcast packets for a network game of N players has a packet count of just N packets on the network each tic, since everyone hears each packet.  The night after we released the game, I was woken up by a call from a college sysadmin yelling at me for crippling their entire network.  I didn’t have an unlisted number at the time.  When I had decided to implement network gaming, I bought and read a few books, but I didn’t have any practical experience, so I had thought that large networks were done like the books explained, with routers connecting independent segments.  I had no idea that there were many networks with thousands of nodes connected solely by bridges that forwarded all broadcast packets over lower bit rate links.  I quickly changed the networking to have each peer send addressed packets to the other peers.  More traffic on the local segment, but no chance of doing horrible things to bridged networks.

WiFi is different from wired Ethernet in a few ways.  WiFi clients don’t actually talk directly to each other, they talk to the access point, which rebroadcasts the packet to the destination, so every packet sent between two WiFi devices is actually at least two packets over the air.

An ad-hoc WiFi network would have twice the available peer to peer bandwidth and half the packet drop rate that an access point based one does.  Another point is that unlike wired Ethernet, the WiFi link level actually does packet retransmits if the destination doesn’t acknowledge receipt.  They won’t be retransmitted forever, and the buffer spaces are limited, so it can’t be relied upon the way you rely on TCP, but packet drops are rarer than you would expect.  This also means that there are lots of tiny little ACK packets flying around, which contributes to reduced throughput.  Broadcast packets are in-between — the broadcast packet is sent from the source to the access point with acknowledgment and retransmits, but since the access point can’t know who it is going to, it just fires it out blind a single time.

I experimentally brought the iPhone networking up initially using UDP broadcast packets, but the delivery was incredibly poor.  Very few packets were dropped, but hundreds of milliseconds could sometimes go by with no deliveries, then a huge batch would be delivered all at once.  I thought it might be a policy decision on our congested office access point, but it behaved the same way at my house on a quiet LAN, so I suspect there is an iPhone system software issue.  If I had a bit more time, I would have done comparisons with a WiFi laptop.  I had pretty much planned to use addressed packets anyway, but the broadcast behavior was interesting.

Doom PC was truly peer to peer, and each client transmitted to every other client, for N * (N-1) packets every tic.  It also stalled until valid data had arrived from every other player, so adding more players hurts in two different ways: more packets mean more congestion and a greater chance that each individual packet is dropped, and any one player’s late packet stalls everyone.  The plus side of an arrangement like this is that it is truly fair: no client has any advantage over any other, even if one or more players are connected by a lower quality link.  Everyone gets the worst common denominator behavior.

I settled on a packet server approach for the iPhone, since someone had to be designated a “server” anyway for DNS discovery, and it has the minimum non-broadcast packet count of 2N packets every tic.  Each client sends a command packet to the server each tic, the server combines all of them, then sends an addressed packet back to each client.  The remaining question was what the server should do when it hasn’t received an updated command from a client.  When the server refused to send out a packet until it had received data from all clients, there was a lot more variability in the arrival rate.  It could be masked by intentionally adding some latency on each client side, but I found that it plays better to just have the server repeat the last valid command when it hasn’t gotten an update.  This does mean that there is a slight performance advantage to being the server, because you will never drop an internal packet.
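The per-tic server logic described above can be sketched like this; the data structures are invented for illustration:

```python
def combine_tic(arrived, last_valid, clients):
    """Build the combined command set for one tic.

    arrived:    {client_id: command} received this tic (some may be missing)
    last_valid: {client_id: command} most recent command per client (mutated)
    clients:    list of client ids in the game

    Rather than stalling everyone until every client has reported in,
    the server repeats a client's last valid command when no update
    arrived, then sends the combined set back to each client.
    """
    tic = {}
    for cid in clients:
        if cid in arrived:
            last_valid[cid] = arrived[cid]
        tic[cid] = last_valid[cid]
    return tic

def packets_per_tic(n_players):
    """Addressed-packet counts per tic: the old peer-to-peer Doom scheme
    (everyone sends to everyone) versus the packet server (one command
    packet up, one combined packet down, per client)."""
    peer_to_peer = n_players * (n_players - 1)
    packet_server = 2 * n_players
    return peer_to_peer, packet_server
```

For a four player game that is 12 packets per tic peer to peer versus 8 through the server, and the gap widens quadratically with more players.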

The client always stalls until it receives a server packet; there was no way I had the time to develop any latency reducing / drop mitigating prediction mechanisms.  There are a couple full client / server, internet capable versions of Doom available on the PC, but I wanted to work from a more traditional codebase for this project.

So, I had the game playing well over WiFi, but communication over the Bluetooth interface was significantly worse.  There was an entire frame of additional latency versus WiFi, and the user mode Bluetooth daemon was also sucking up 10% of the CPU.  That would have been livable, but there were regular surges in the packet delivery rate that made it basically unplayable.

Surging implies a buffer somewhere backing up and then draining, and I had seen something similar but less damaging occasionally on WiFi as well, so I wondered if there might be some iPhone system communication going on.  I spent a little while with Wireshark trying to see if the occasional latency pileup was due to actual congestion, and what was in the packets, but I couldn’t get my MacBook into promiscuous WiFi mode, and I didn’t have the time to configure a completely new system.

In the end, I decided to just cut out the Bluetooth networking and leave it with WiFi.  There was a geek-neatness to having a net game with one client on WiFi and another on Bluetooth, but I wasn’t going to have time to wring it all out.

John Carmack on DOOM Classic Development, Fan Questions

05.11.2009 via Bethblog.com

Now that DOOM Classic has been released on iTunes, John Carmack has written his development notes on the project which you can read here.

This week we took some questions for John from the id Software twitter feed, and you will find his answers after the jump.

Khct: What was the most fun and/or rewarding part of developing DOOM Classic?
Porting a game is a very different experience from developing a new game. With a port, after working for a while, there is a moment when BANG — the game is there. After that, it is mostly a matter of fixing problems, optimizing, and polishing the experience. This project was especially rewarding because I felt that I was being extremely productive on the days that I was working on it — I was probably the best person in the world to do that work at that time, and I was knocking out issues left and right.

Matthaigh: Did you encounter any problems when porting the code to iPhone/touch? Such as APIs you used?
It went very smoothly. The prBoom codebase that I based it on already compiled for OS X, so there wasn’t much grunt work, and I had all the device specific IO code that I developed for Wolfenstein Classic. Being able to take advantage of the GPL code that other people have maintained and improved over the years has been very satisfying for me. I always argued that we got worthwhile intangible benefits from my policy of releasing the source code to the older games, but with Wolfenstein Classic and DOOM Classic I can now point to significant amounts of labor that I was personally saved. In fact, the products probably never would have existed at all if my only option was to work from the original “dusty deck” source code for the games. If we were even able to find the original code at all. Hooray for open source!

Seventhcycle: What sort of synth did you use for DOOM Classic’s music? It sounds like AWE32 or AWE64. Why not use something like a SDD4?

The following answer is from Christian Antkow, Aural Assault Technician, id Software

Where possible, I pulled Redbook audio off Bobby Prince’s archival CDs, and I can only assume it was created with a Creative AWE32 or some other comparable relic back in the day. Where I did not have original Redbook audio to work with, I went back to the original MIDI files and ran them through a modern Creative X-Fi General MIDI module in an attempt to match the original Redbook audio as closely as possible. My original intent was to use the East/West Colossus virtual instrument in my DAW, and run all the MIDI through their GM instruments to generate completely new high quality renderings. When I did so with the first couple tracks, it just sounded way too “real” and all sense of nostalgia was lost.

So, that’s really the main reason I didn’t redo everything in a modern instrument. All sense of nostalgia would be lost and I felt that needed to be retained.

Kevinquilen: After all these years, how do you feel that people are still interested in DOOM and Wolf 3-D up against newer games?
Nostalgia undoubtedly plays a part, but I do think that the games hold their own and then some for playability. The iPhone market is full of titles that are heavy on aesthetics and light on quality gameplay. There is a LOT of gameplay in the old titles, and I spent a significant amount of effort to make sure that they play well on the new platform.

Thanks to all of the Twitter followers who submitted their questions, and of course John and Christian for answering them!

Orcs & Elves – John Carmack’s Blog

May 2nd, 2006 – via armadilloaerospace.com

I’m not managing to make regular updates here, but I’ll keep this around just in case. I have a bunch of things that I want to talk about — some thoughts on programming style and reliability, OpenGL, Xbox 360, etc, but we have a timely topic with the release of our second mobile game, Orcs & Elves, that has spurred me into making this update.

DoomRPG, our (Id Software’s and Fountainhead Entertainment’s) first mobile title, has been very successful, both in sales and in awards. I predict that the interpolated turn based style of 3D gaming will be widely adopted on the mobile platform, because it plays very naturally on a conventional cell phone. Gaming will be a lot better when there is a mass market of phones that can be played more like a gamepad, but you need to make do with what you actually have.

One of the interesting things about mobile games is that the sales curve is not at all like the drastically front loaded curve of a PC or console game. DoomRPG is selling better now than when it was initially released, and the numbers are promising for supporting additional development work. However, unless I am pleasantly surprised, the hardware capabilities are going to advance much faster than the market in the next couple years, leading to an unusual situation where you can only afford to develop fairly crude games on incredibly powerful hardware. Perhaps “elegantly simple” would be the better way of looking at it, but it will still wind up being like developing an Xbox title for $500,000. That will wind up being great for many small game companies that just want to explore an idea, but having resources far in excess of your demands does minimize the value of being a hot shot programmer. :-)

To some degree this is already the case on high end BREW phones today. I have a pretty clear idea what a maxed out software renderer would look like for that class of phones, and it wouldn’t be the PlayStation-esque 3D graphics that seem to be the standard direction. When I was doing the graphics engine upgrades for BREW, I started along those lines, but after putting in a couple days at it I realized that I just couldn’t afford to spend the time to finish the work. “A clear vision” doesn’t mean I can necessarily implement it in a very small integral number of days. I wound up going with a less efficient and less flexible approach that was simple and robust enough to not likely need any more support from me after I handed it over (it didn’t).

During the development of DoomRPG, I had commented that it seemed obvious that it should be followed up with a “traditional, Orcs&Elves sort of fantasy game”. A couple of people independently commented that “Orcs&Elves” wasn’t a bad name for a game, so since we didn’t run into any obstacles, Orcs&Elves it was. Naming new projects is a lot harder than most people think, because of trademark issues.

In hindsight, we made a strategic mistake at the start of O&E development. We were fresh off the high end BREW version of DoomRPG, and we all liked developing on BREW a lot better than Java. It isn’t that BREW is inherently brilliant, it just avoids the deep sucking nature of Java for resource constrained platforms (however, note the above about many mobile games not being resource constrained in the future), and allows you to work inside Visual Studio. O&E development was started high-end first with the low-end versions done afterwards. I should have known better (Anna was certainly suspicious), because it is always easier to add flashy features without introducing any negatives than it is to chop things out without damaging the core value of a game. The high end version is really wonderful, with all the graphics, sound, and gameplay we aimed for, but when we went to do the low end versions, we found that even after cutting the media as we planned, we were still a long way over the 280k Java application limit. Rather than just butchering it, we went for pain, suffering, and schedule slippage, eventually delivering a game that still maintained high quality after the de-scoping (the low end platforms still represent the majority of the market). It would have been much easier to go the other way, but the high end phone users will be happy with our mistake.

DoomRPG had three base platforms that were customized for different phones — Java, low end BREW, and high end BREW. O&E added a high end java version that kept most of the quality of the high end BREW version on phones fast enough to support it from carriers willing to allow the larger download. The download size limits are probably the most significant restriction for gaming on the high end phones. I don’t really understand why the carriers encourage streaming video traffic, but balk at a couple megs of game media.

I am really looking forward to the response to Orcs&Elves, because I think it is one of the best product evolutions I have been involved in. The core game play mechanics that were laid out in DoomRPG have proven strong and versatile (again, I bet we have a stable genre here), but now we have a big bag of tricks and a year of polishing the experience behind us, along with a world of some depth. I found it a very good indicator that play testers almost always lost track of time while playing.

This project was doubly nostalgic for me — the technology was over a decade old for me, but the content took me back twenty years. All the computer games I wrote in high school were adventure games, and my first two commercial sales were Ultima style games for the Apple II, but Id Software never got around to doing one. Old timers may recall that we were going to do a fantasy game called “The Fight For Justice” (starring a hero called Quake…) after Commander Keen, but Wolfenstein 3D and the birth of the FPS sort of got in the way. :-)

John Carmack

Cell phone adventures – John Carmack’s Blog

March 27th, 2005 – via armadilloaerospace.com

I’m not a cell phone guy. I resisted getting one at all for years, and even now I rarely carry it. To a first approximation, I don’t really like talking to most people, so I don’t go out of my way to enable people to call me. However, a little while ago I misplaced the old phone I usually take to Armadillo, and my wife picked up a more modern one for me. It had a nice color screen and a bunch of bad java game demos on it. The bad java games did it.

I am a big proponent of temporarily changing programming scope every once in a while to reset some assumptions and habits. After Quake 3, I spent some time writing driver code for the Utah-GLX project to give myself more empathy for the various hardware vendors and get back to some low-level register programming. This time, I decided I was going to work on a cell phone game.

I wrote a couple java programs several years ago, and I was left with a generally favorable impression of the language. I dug out my old “java in a nutshell” and started browsing around on the web for information on programming for cell phones. After working my way through the alphabet soup of J2ME, CLDC, and MIDP, I’ve found that writing for the platform is pretty easy.

In fact, I think it would be an interesting environment for beginning programmers to learn on. I started programming on an Apple II a long time ago, when you could just do an “hgr” and start drawing to the screen, which was rewarding. For years, I’ve had misgivings about people learning programming on Win32 (unix / X would be even worse), where it takes a lot of arcane crap just to get to the point of drawing something on the screen and responding to input. I assume most beginners wind up with a lot of block copied code that they don’t really understand.

All the documentation and tools needed are free off the web, and there is an inherent neatness to being able to put the program on your phone and walk away from the computer. I wound up using the latest release of NetBeans with the mobility module, which works pretty well. It certainly isn’t MSDev, but for a free IDE it seems very capable. On the downside, MIDP debugging sessions are very flaky, and there is something deeply wrong when text editing on a 3.6 GHz processor is anything but instantaneous.

I spent a while thinking about what would actually make a good game for the platform, which is a very different design space than PCs or consoles. The program and data sizes are tiny, under 200k for java jar files. A single texture is larger than that in our mainstream games. The data sizes to screen ratios are also far out of the range we are used to. A 128x128x16+ bit color screen can display some very nice graphics, but you could only store a half dozen uncompressed screens in your entire size budget. Contrast with PCs, which may be up to a few megabytes of display data, but the total game data may be five hundred times that.

You aren’t going to be able to make an immersive experience on a 2” screen, no matter what the graphics look like. Moody and atmospheric are pretty much out. Stylish and fun is about the best you can do.

The standard cell phone style discrete button direction pad with a center action button is a good interface for one handed navigation and selection, but it sucks for games, where you really want a Game Boy style rocking direction pad for one thumb, and a couple separate action buttons for the other thumb. These styles of input are in conflict with each other, so it may never get any better. The majority of traditional action games just don’t work well with cell phone style input.

Network packet latency is bad, and not expected to be improving in the foreseeable future, so multiplayer action games are pretty much out (but see below).

I have a small list of games that I think would work out well, but what I decided to work on is DoomRPG – sort of Bard’s Tale meets Doom. Step based smooth sliding/turning tile movement and combat works out well for the phone input buttons, and exploring a 3D world through the cell phone window is pretty neat. We talked to Jamdat about the business side of things, and hired Fountainhead Entertainment to turn my proof-of-concept demo and game plans into a full-featured game.

So, for the past month or so I have been spending about a day a week on cell phone development. Somewhat to my surprise, there is very little internal conflict switching off from the high end work during the day with gigs of data and multi-hundred instruction fragment shaders down to texture mapping in java at night with one table lookup per pixel and 100k of graphics. It’s all just programming and design work.

It turns out that I’m a lot less fond of Java for resource-constrained work. I remember all the little gripes I had with the Java language, like no unsigned bytes, and the consequences of strong typing, like no memset, and the inability to read resources into anything but a char array, but the frustrating issues are details down close to the hardware.

The biggest problem is that Java is really slow. On a pure CPU / memory / display / communications level, most modern cell phones should be considerably better gaming platforms than a Game Boy Advance. With Java, on most phones you are left with about the CPU power of an original 4.77 MHz IBM PC, and lousy control over everything.

I spent a fair amount of time looking at java byte code disassembly while optimizing my little rendering engine. This is interesting fun like any other optimization problem, but it alternates with a bleak knowledge that even the most inspired java code is going to be a fraction the performance of pedestrian native C code.

Even compiled to completely native code, Java semantic requirements like range checking on every array access hobble it. One of the phones (the Motorola i730) has an option that does some load-time compiling to improve performance, which does help a lot, but you have no idea what it is doing, and innocuous code changes can cause the compilation heuristic to fail.
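A minimal demonstration (my own example, not engine code) of that semantic requirement: Java must check every array index, so an out-of-range access throws an exception instead of silently reading past the end as C would — and the check is paid on every access, even in a per-pixel inner loop.

```java
public class RangeCheck {
    // Returns the element, or "checked" if the mandatory bounds check fires.
    static String probe(int[] span, int index) {
        try {
            return "value " + span[index];
        } catch (ArrayIndexOutOfBoundsException e) {
            return "checked";
        }
    }

    public static void main(String[] args) {
        int[] span = new int[256];
        System.out.println(probe(span, 0));    // in range: "value 0"
        System.out.println(probe(span, 256));  // one past the end: "checked"
    }
}
```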

Write-once-run-anywhere. Ha. Hahahahaha. We are only testing on four platforms right now, and not a single pair has the exact same quirks. All the commercial games are tweaked and compiled individually for each (often 100+) platform. Portability is not a justification for the awful performance.

Security on a cell phone is justification for doing something, but an interpreter isn’t a requirement – memory management units can do just as well. I suspect this did have something to do with Java’s adoption early on. A simple embedded processor with no MMU could run arbitrary programs securely with Java, which might make it the only practical option. However, once you start using blazingly fast processors to make up for the awful performance, an MMU with a classic OS model looks a whole lot better.

Even saddled with very low computing performance, a tighter implementation of the platform interface could help out a lot. I’m not seeing very conscientious work on the platforms so far. For instance, there is just no excuse for having 10+ millisecond granularity in timing. Given that the Java paradigm is sort of thread-happy anyway, having a real scheduler that Does The Right Thing with priorities and hardware interfacing would be an obvious thing. Pressing a key should generate a hardware interrupt, which should immediately activate the key listening thread, which should be able to immediately kill an in-progress rendering and restart another one if desired. The attitude seems to be 15 msec here, 20 there, stick it on a queue, finish up a timeslice, who cares, right?
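The desired behavior can be sketched with ordinary desktop Java threads — MIDP offers no such guarantees, and every name below is illustrative: a maximum-priority input thread flags an in-progress render so it can bail out mid-frame instead of the key press sitting in a queue behind it.

```java
public class InputPriority {
    static volatile boolean abortRender = false;

    // Render until the frame is done or the input thread kills it.
    // Returns true if the frame completed, false if it was abandoned.
    static boolean renderFrame(int scanlines) {
        for (int y = 0; y < scanlines; y++) {
            if (abortRender) return false;   // frame abandoned mid-render
            // ... rasterize scanline y ...
        }
        return true;
    }

    public static void main(String[] args) throws InterruptedException {
        Thread render = new Thread(() -> renderFrame(1_000_000));
        render.setPriority(Thread.MIN_PRIORITY);

        // Stand-in for the key-press interrupt: kill the current frame.
        Thread input = new Thread(() -> abortRender = true);
        input.setPriority(Thread.MAX_PRIORITY);

        render.start();
        input.start();
        input.join();
        render.join();
        System.out.println("input handled without queueing behind the frame");
    }
}
```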

I suspect I will enjoy working with BREW, the competing standard for cell phone games. It lets you use raw C/C++ code, or even, I suppose, assembly language, which completely changes the design options. Unfortunately, they only have a quarter the market share that the J2ME phones have. Also, the relatively open java platform development strategy is what got me into this in the first place – one night I just tried writing a program for my cell phone, which isn’t possible for the more proprietary BREW platform.

I have a serious suggestion for the handset designers to go with my idle bitching. I have been told that fixing data packet latency is apparently not in the cards, and it isn’t even expected to improve much with the change to 3G infrastructure. Packet data communication seems more modern, and has the luster of the web, but it is worth realizing that for network games and many other flashy Internet technologies like streaming audio and video, we use packets to rather inefficiently simulate a switched circuit.

Cell phones already have a very low latency digital data path – the circuit switched channel used for voice. Some phones have included cellular modems that use either the CSD standard (circuit switched data) at 9.6 Kbits or 14.4 Kbits or the HSCSD standard (high speed circuit switched data) at 38.4 Kbits or 57.6 Kbits. Even the 9.6 Kbit speed would be great for networked games. A wide variety of two player peer-to-peer games and multiplayer packet server based games could be implemented over this with excellent performance. Gamers generally have poor memories of playing over even the highest speed analog modems, but most of the problems are due to having far too many buffers and abstractions between the data producers/consumers and the actual wire interface. If you wrote eight bytes to the device and it went in the next damned frame (instead of the OS buffer, which feeds into a serial FIFO, which goes into another serial FIFO, which goes into a data compressor, which goes into an error corrector, and probably a few other things before getting into a wire frame), life would be quite good. If you had a real time scheduler, a single frame buffer would be sufficient, but since that isn’t likely to happen, having an OS buffer with accurate queries of the FIFO positions is probably best. The worst gaming experiences with modems weren’t due to bandwidth or latency, but to buffer pileup.
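The "accurate FIFO query" idea can be sketched as follows. SerialChannel and its methods are hypothetical — no such API existed on these phones — but they capture the point: ask how much is still queued, and drop a stale game snapshot rather than stack it behind old data.

```java
// Hypothetical interface to a circuit-switched data channel.
interface SerialChannel {
    int bytesPending();          // bytes accepted but not yet on the wire
    void write(byte[] data);
}

public class NoPileup {
    // Returns true if the snapshot was sent, false if it was dropped.
    static boolean sendSnapshot(SerialChannel ch, byte[] snapshot) {
        if (ch.bytesPending() > 0) {
            return false;        // old frame still queued: drop, don't pile up
        }
        ch.write(snapshot);
        return true;
    }

    public static void main(String[] args) {
        // Toy in-memory channel: written data stays "pending" forever,
        // standing in for a backed-up modem FIFO.
        SerialChannel ch = new SerialChannel() {
            int pending = 0;
            public int bytesPending() { return pending; }
            public void write(byte[] d) { pending += d.length; }
        };
        System.out.println(sendSnapshot(ch, new byte[8]));  // true: sent
        System.out.println(sendSnapshot(ch, new byte[8]));  // false: dropped
    }
}
```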

Live @ QuakeCon (Update)

Excerpts from John Carmack’s speech:

  • Doom 3 uses a very complex scripting language.
  • There should be even more interactive elements than just buttons… real-time lighting calculation plays a big role in this. Designers no longer use light simply to brighten certain parts of the map; it takes on a dramatic function, which required a rethink on the designers’ part.
  • The most consistent graphics engine of all time. Lighting effects in other games are based on ´cheap tricks´ (Carmack’s own words), which admittedly can look better in certain environments than a complex unified system. By his own account, Carmack avoids these hacks in Doom, producing a consistency across the board that has never been seen before.
  • His next engine will use a high-level shading language, even though some drivers have big problems with it… this should make effects of near-film quality possible. With the next engine, the graphics card drivers will have to support his game, and not his game the drivers, according to Carmack.
  • One of his hardest tasks was finding a good compromise between outstanding graphics and very good gameplay (from a designer’s standpoint).
  • You will probably look in vain for outstanding water effects in Doom 3, not least because water plays practically no role in the game. A few nice reflection effects are integrated into the engine, though, which suggests that a little water will probably be seen…

Unfortunately, the broadcast of Carmack’s speech was interrupted. As a small consolation, Planetquake has a photo of Fred Nilsson (id Software).

Update: John Carmack’s speech, up to the point where TSN Central cut off the broadcast, is now also available in MP3 format. Here is the 15MB download. The whole thing runs a little over 60 minutes.

John Carmack on ATI, Nvidia, and Doom

At slashdot.org you can now read some comments from John Carmack on the current graphics chip situation with regard to Doom:

The standard lighting model in DOOM, with all features enabled, but no custom shaders, takes five passes on a GF1/2 or Radeon, either two or three passes on a GF3, and should be possible in a clear + single pass on ATI’s new part. It is still unclear how the total performance picture will look. Lots of pixels are still rendered with no textures at all (stencil shadows), or only a single texture (blended effects), so the pass advantage will only show up on some subset of all the drawing. If ATI doesn’t do as good of a job with the memory interface, or doesn’t get the clock rate up as high as NVidia, they will still lose. The pixel operations are a step more flexible than Nvidia’s current options, but it is still clearly not where things are going to be going soon in terms of generality. I fully expect the next generation engine after the current DOOM engine will be targeted at the properly general purpose graphics processors that I have been pushing towards over the last several years.