Tuesday, December 30, 2003

IndexBuffer Direct3D Tutorial Available

Looking back, I can see it's been over two months since my last Direct3D tutorial publication. D'oh! Well, it was a busy two months, but don't worry - I haven't given up. My latest installment is finally ready. It covers IndexBuffers, an important part of the basics of Direct3D.

Hopefully I'll be a bit quicker about getting the next one out. As usual, feel free to email me with comments, corrections, complaints, and suggestions about the series.

Monday, December 29, 2003

Privacy Laws Drastically Changed

BitWorking has this piece. Regardless of which side of the issue you're on, US citizens should be aware of this, if for no other reason than to be able to debate it with those that don't share your views.

How Do You Build a Police State?

The same way you eat an elephant. Is it even worth asking where the "professional" media is on this story?

On December 13, when U.S. forces captured Saddam Hussein, President George W. Bush not only celebrated with his national security team, but also pulled out his pen and signed into law a bill that grants the FBI sweeping new powers. A White House spokesperson explained the curious timing of the signing - on a Saturday - as "the President signs bills seven days a week." But the last time Bush signed a bill into law on a Saturday happened more than a year ago - on a spending bill that the President needed to sign to prevent shutting down the federal government the following Monday. [San Antonio Current]


My recent entry about my desire for a Wiki that thinks in XML instead of text/html led me to an interesting application. David Pickett pointed me at WikidPad. I've been trying it out, and I have to say I really like it. It's a desktop application that lets you build a series of linked documents in much the same way that a Wiki does on the web. To me, it's sort of like OneNote, except instead of pages and tabs and sections, you get hyperlinks, which is fundamentally more powerful.

I'm definitely going to be shelling out for this one. I mean, it's only $12, and the last release was just last month, so I know it's still in active development. Of course, there are things about it that I don't like. For example, while the keyboard shortcuts are plentiful, there are a few more I'd like to have. But I imagine the devs will be hearing from me about all the little nits. :) And the app is easily powerful enough to tackle the particular task I have in mind for it, which is to organize the hundred or so pages of notes I have for a role-playing game I'm running. (Vampire, if you must know.)

At any rate, while it's nice that I have a tool that's good enough (and good), I still haven't found what I was originally looking for, which is the fusion of typed information (i.e. XML), the Wiki build-by-accretion model, and a desktop (not web) application. WikidPad has two of those three, but it's still a collection of untyped (i.e. text) information. Fortunately, the storage format - while not XML - is simple and textual, so if I need to parse it into something else, I can.
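Since the format is plain text, lifting a page into XML is mostly a matter of a little script. Here's a rough sketch in Python - note that the WikiWord regex and the element names are my own guesses for illustration, not WikidPad's actual format:

```python
import re
import xml.etree.ElementTree as ET

# CamelCase tokens are treated as wiki links; this pattern and the XML
# shape below are illustrative assumptions, not WikidPad's real format.
WIKI_WORD = re.compile(r"\b([A-Z][a-z0-9]+(?:[A-Z][a-z0-9]+)+)\b")

def page_to_xml(name, text):
    """Wrap a wiki page in an XML element, recording its outbound links."""
    page = ET.Element("page", {"name": name})
    ET.SubElement(page, "body").text = text
    links = ET.SubElement(page, "links")
    for target in sorted(set(WIKI_WORD.findall(text))):
        ET.SubElement(links, "link", {"target": target})
    return page

print(ET.tostring(page_to_xml("GameNotes", "See VampireClans and CityMap."),
                  encoding="unicode"))
```

Ten minutes of this kind of thing and the notes stop being trapped in one tool, which is most of what I wanted from a textual format anyway.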

By the way, if anyone out there is interested in building something like an XMLWiki, give me a shout. While I don't have time to write the darn thing myself, I do have time to share my vision and to provide guidance to anyone that would like to tackle it as a side project. I think it's a fascinating subject, if for no other reason than that it involves figuring out the right balance between too little and too much structure.

Wednesday, December 24, 2003

XML + Wiki = ??

Dear PWSTM (People Way Smarter Than Me), 

As I've gotten used to using my blog over the last year, I've been thinking more about managing my personal information. Lately, I've been pondering Wikis, a topic about which I know little more than the basics. 

On the one hand, I like the idea of a Wiki for storing notes and ideas. In theory, it makes it easy to maintain relationships amongst various nuggets of information. 

But in practice (the other hand), all the Wiki software I've seen has two drawbacks: 

  1. It's presented as a set of web pages, which means (for me) using it sucks compared to something with a rich user interface, accelerator keys, etc.
  2. The native storage mechanism is HTML, meaning that there's no way to have typed information - in the sense that it would be difficult for a machine to determine that one page describes a C# class, and another page describes a Weblog entry.

It seems to me that there's a sweet spot somewhere at the intersection of Wikis and XML. What I envision is an app that lets me specify a mapping between some piece of UI editing code (a form, generally), and an XML type. What I'd want is an app that lets me create new instances of those XML types, editing each one with the appropriate UI and then somehow linking them together, in the same way that a Wiki lets you link together HTML documents. If I could somehow get the version control that many Wikis offer, that would be really nice, too.

The advantage of this XML/Wiki hybrid is that it would lend itself very nicely to use for many purposes. Because of the reliance on XML rather than HTML as the fundamental thing that you're storing and editing, it would be fairly easy to have both a web and a rich interface for editing. Or a rich interface for editing, and a transform to produce static HTML read-only copies. And - done right - the mapping between XML types and UI that knows how to edit that type could make the whole thing pretty darn extensible.
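At its core, the mapping I'm imagining is just a dispatch table from XML type to editing code. A toy sketch in Python (every type name and editor here is hypothetical - the point is the shape, not the details):

```python
import xml.etree.ElementTree as ET

# Invented editors standing in for real UI forms.
def edit_csharp_class(elem):
    return "[class editor] " + elem.get("name")

def edit_weblog_entry(elem):
    return "[entry editor] " + elem.get("title")

# The registry: XML element name -> UI that knows how to edit that type.
EDITORS = {
    "csharpClass": edit_csharp_class,
    "weblogEntry": edit_weblog_entry,
}

def open_node(xml_fragment):
    """Parse a stored node and dispatch it to the editor for its type."""
    elem = ET.fromstring(xml_fragment)
    if elem.tag not in EDITORS:
        raise KeyError("no editor registered for <%s>" % elem.tag)
    return EDITORS[elem.tag](elem)

print(open_node('<csharpClass name="IndexBuffer"/>'))
```

Swap the string-returning stubs for real forms and you have the bones of the typed Wiki; extensibility falls out of the fact that registering a new type is just adding an entry to the table.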

As I've been running this idea through my head, I'm wondering if there's already something that does exactly what I want. A few possibilities that I don't know enough about, but that might help solve the problem, and that I really ought to look into:

1. There's already some Wiki software out there that does exactly this.

2. InfoPath already does exactly this.

3. Some other app already does exactly this.

4. Somehow some app that uses RDF solves this problem.

5. One of the above comes close, and could be made to do what I want with some tweaks to existing software.

6. I write my own when days finally have 26 hours.

I'm planning to look into InfoPath at some point in the near future (as soon as I install it again). But I'm hoping that one of you will be able to shed some light on this for me.

Sunday, December 21, 2003


Years ago, I saw a neat application that would show you your disk drive usage as a series of nested rectangles. This makes it very easy to identify which folders, which subfolders, and even which particular files are eating your hard drive. I've completely forgotten the name of the original application, but today I ran across one that does the same thing: SequoiaView. It's free, and it makes cool pictures like this:


I like it.

Saturday, December 20, 2003

Holiday Surprise

So we went to my biggest client's holiday party last night. First of all, I thought it was pretty cool that they even invited consultants. Second, the food was excellent (steak and lobster, if you must know ;)). But I was a bit surprised to find that the entertainment was the band Chicago - a far cry from the usual company band playing covers of old 50s dance tunes!

Thursday, December 18, 2003

Shadow Copies

Daniel Sinclair, fellow DM instructor, turned me on to this very cool feature yesterday. Under Windows 2003 (and possibly other OSes – I haven’t checked), if you go to the properties for one of your local drives, you will see a tab labeled “Shadow Copies”.  

If you enable the feature, clicking on the Settings button will bring you to a dialog where you can set up a schedule. The basic idea here is that every day/week/month/whatever period you select, the operating system will take a snapshot of your hard drive. Only it’s a smart snapshot – it only records the deltas, making for a much smaller image than if I had simply copied the drive somewhere, and it lets you specify how much space the backups should be allowed to take up.

The way you access the drive is via the network redirector. So you simply open up (say) \\localhost\c$, right-click on any file, and pull up the properties. There’s a “Previous Versions” tab where you can view the state of the file or folder you’re interested in at that point in time.

Obviously, this doesn’t take the place of regular backups, but it sure is a nice safety net, especially when I’m on the road and don’t have access to backup.

One caveat: Dan warned me that there might be a performance penalty associated with enabling this feature. I'm not sure if I've noticed anything yet - my hard drive is a bit busy this morning, but I haven't tracked down the source of that yet; it could be something else. At any rate, subjective performance certainly hasn't suffered much, if at all.

Thanks Dan!

Monday, December 15, 2003

So You Want to Work at the NSA?

Bruce Schneier's latest Crypto-Gram Newsletter (highly recommended), has a link to an amusing and interesting account of one interviewee's experience applying for a job at the National Security Agency (NSA).

Sunday, December 14, 2003

Swiss DevDays Slides and Demos Uploaded

I’ve posted the slides and demos for my presentations at DevDays last week. You can find them here.  Here are the descriptions:

Multithreaded Programming in the CLR: The CLR provides full multithreading capabilities to programmers of all languages. This includes the ability to start threads, to stop them, and to safely communicate with them. In this session, we discussed both how to begin asynchronous processing, and how to safely share data between threads.

Introduction to Managed DirectX:  The DirectX library has for years provided C++ developers with the ability to closely interact with sound, video, input, and networking devices. The advent of Managed DirectX brings these same technologies to the C#/VB.NET/other managed language developer. In this session, we looked specifically at the basics of Managed Direct3D, Managed DirectSound, Managed DirectInput, and Managed DirectPlay, with an emphasis on how these technologies could be used in non-game applications.

If you’ve been reading my Direct3D Tutorial, you’ll probably find the latter set of demos particularly interesting, especially the Collider series of demos. It develops a simple but fairly realistic non-game application that models particle interaction. Through a series of demos, it evolves to use 3D graphics for display, 3D sound to report collisions, joystick input for navigation, and finally DirectPlay for distributed rendering.


Saturday, December 13, 2003

Excellent Game Programming Book

I've lately been reading 3D Game Engine Design: A Practical Approach to Real-Time Computer Graphics. It is excellent. It's exactly the book I've been looking for on my quest to understand Direct3D better - it starts by talking about how a generic render pipeline works and from there moves into things like collision detection, terrain generation, and all sorts of good stuff.

But before you run out and buy it, a few caveats:

· The book is extremely thorough. In graphics, this means there are scads of equations. While I think it's important to understand vector and matrix operations if you want to do 3D game work, I've been able to read the book without actually going through most of the proofs and derivations. "OK, there's a formula for intersecting a triangle and a sphere. I'll come back to the details if I ever need them. Knowing where the formula is suffices for now."

· It is not DirectX-specific. It's fairly technology-agnostic, in fact. So if you're looking for something to teach you the API, don't look here.

If you want some good theory and a big pile of pseudocode, I recommend it.

Friday, December 12, 2003

Site Upgrade Completed

Well, it was sort of painful, but I got the site up and running on dasBlog. Which,
I have to say, is a fairly big improvement on BlogX. If I can convince the powers-that-be
to somehow install CDO on the web server I run the site on, it’ll be even better,
since dasBlog has some nice email-related features.

Sorry if your aggregator blew a bunch of recent entries in your face as being new
– not much I could do about that. At any rate, it shouldn’t happen again
any time soon.

If you have any problems with the site or the feed, let me know by leaving a comment
here or by sending me an email, and I’ll get it fixed.  

Site Maintenance Today

I’m going to be cutting over to dasBlog today, so you may see intermittent availability
of this site. Further, your aggregator may show you a whole bunch of old posts
as being new ones. Apologies on both counts!

At any rate, no change of URL for either HTML or RSS content is expected – I’m
doing a bunch of work to ensure that nothing moves around on you from a URI standpoint.

Thursday, December 11, 2003

Unexpected Repercussions

Here’s a little something I wrote last week that somehow never made it onto
this blog:

I’m here at DevDays in
Zurich, Switzerland. My talks start in about two hours, and I find myself with a few
spare minutes.

I just got done listening to an interesting presentation by Rafal
on IPv6. It was interesting not only because Rafal is a good speaker
and the topic a relevant one, but because of a comment he made that resonated with
me, particularly so since I was (probably) the only American in a European crowd of
300 or so.

Rafal observed that the current version of IP (IPv4) has an incredibly uneven distribution
of addresses. In particular, he noted that some US universities have more addresses
assigned to them than the entire continent of Asia. While I found that surprising
enough, what really caught me was the implicit sense that having a global Internet
infrastructure that depends disproportionately on the US is a bad thing. This makes
logical sense for purely technical reasons, of course, but Rafal’s comments
made it clear that from a social standpoint, the appeal of IPv6 would not be
hurt by the current political climate.

I find the idea that the adoption of IPv6 might be due in part to the current US administration’s
global policies [1] amusing, troubling, and an excellent example of the Law
of Unintended Consequences.

[1] Policies with which I largely disagree, but I'll leave it at that in this post.

Tuesday, December 9, 2003

A Bit of an Earthquake

You may have seen on the news that there was a small earthquake near Richmond, Virginia.
I live near Washington, D.C., about 100 miles from the epicenter. Here’s how
it went for me:

Wife: Did you hear something?

Me: No.

Wife: I’m from the Pacific Rim. That felt like an earthquake.

Me: I didn’t feel anything.

She was right, of course. She usually is.

Sunday, December 7, 2003

New Articles Added

Frederic just shipped me the latest Direct3D translation, so you can click here to read “Rendering Basics” in Danish.

I’ve also started a new article, Craig’s Random Link List. For now, it’s just a few things I’ve thrown together, but I’m hoping to build on it over time to keep track of things I’ve found useful and/or interesting.

You might also have noticed that I’ve moved pretty much all the writing over from my old site. With luck, I can shake some time free this week to get redirects in place so I can have a single source of truth, which will allow me to maintain everything in one place without breaking any existing links.

Back from Switzerland

Well, we made it back from Switzerland. West is the easy direction in terms of jet
lag, but I’m starting to droop pretty badly at the moment. Hopefully that means
I’ll get a solid night’s sleep tonight – I woke up at midnight Zurich-local
time last night, and couldn’t get back to sleep before the alarm went off at
3:15AM so we could catch our flight.

At any rate, the conference was great. Most of the sessions were in German, but there
were a few in English. Based on the ones I did catch, I think the attendees got a
really great deal. And I had a great time delivering my talks on threading and on
Managed DirectX. The Managed DirectX one in particular went quite well – I managed
to cover the absolute basics of Direct3D, DirectSound, DirectInput, and DirectPlay
(including nine demos that all worked!) in 80 minutes. The slides and demos should
be up on the conference website either now or in the near future, but since I can’t
read German, it’s pretty hard for me to figure out which link it is, so I’ll
be posting the content here as well. Just give me a few days to dig through my email
and recover mentally.

The conference was also enjoyable in that I got to see some old friends as well as
make some new ones. I’ll refrain from doing the Great Blog Name Drop and listing
them, but I have to thank Sascha Corti in
particular for acting as the perfect host. My wife and I both felt extremely welcome
and well taken care of.

Now that I’m back and no longer have conference talk prep work to do, I can
devote some time to a few projects that I’ve had on hold. In particular, I want
to get my blog ported to dasBlog 1.5 (just released), to write the next article or articles in my Direct3D series (I've
done a bunch of reading and understand some things better now) and to finally get
a start on one or two games that I’ve been wanting to write. We’ll see
how it all goes, what with the holidays and all the other, normal things that need doing.

Saturday, November 29, 2003

Today I Am 0x20

Today I am 0x20. It's a nice round number. I think I'll waste it playing Splinter Cell. :)

Wednesday, November 26, 2003

Away From Keyboard

This week is the Thanksgiving holiday in the US. Next week I'll be in Zurich, Switzerland, speaking at DevDays. As I have no idea if I'll even be able to get online in Europe, the blogging will be a bit light for a while, and I might be a bit slow on the email uptake.

The good news (unless you think not hearing from me for a bit is the good news) is that with my DevDays slides finally done, I can spend my weekends on something else. I plan to use some of that time to write a bit more in the ol' Direct3D Tutorial Series, which I have sadly neglected. That plus a port to dasBlog should keep me busy.

Monday, November 24, 2003

Test Driven Development Introduction

Peter Provost has a great introduction
to test-driven development here.
 Although I haven’t yet read any of the Extreme Programming canon, I have
been doing a lot more programming over the last year than was previously the
case, and I find myself gravitating naturally towards some of the precepts of XP.
I’d love to give Test-Driven Development a full-blown try, but when you’re
a consultant, you have to go with what the client wants. Maybe I can find a way to
work it in to one of my side projects.

Found Some New Shortcut Keys

I've always been a big fan of shortcut keys. Today I happened
to mistakenly hit a few keys, and discovered a few more I never knew about (this is
on Windows XP).

WinKey + LeftArrow = Minimize Current Window

WinKey + UpArrow  = Maximize Current Window

WinKey + DownArrow = Restore Current Window

Now if I could just find the shortcut key for FixBug.
Or better yet, CureSleep.

Update: I'm a big fat idiot.

So it seems these aren't commands that are built into the shell, but rather just some
of the default mappings added by WinKey.
WinKey totally rocks - I use it so much, I even forget that it's there. Like I did
for this post.

Friday, November 21, 2003

ASP.NET Quirklet - Nested VDirs

I ran into something that took me almost all of yesterday to figure out. I was trying
to set up a series of virtual directories in IIS. The basic idea was that I had a
set of ASPX pages and a set of ASP.NET Web Services, and that they should be organized
under a single URL, like http://localhost/MyApplication.

I couldn’t put everything into the same virtual directory, because we already
had a disk structure defined that located the ASMX and ASPX pages in different places.
So I figured I’d map /MyApplication to the ASPX directory, and just create a
child virtual directory like /MyApplication/WebServices that mapped to the ASMX directory.
(You did know that virtual directories can contain other virtual directories, didn’t
you? And that they can point to any physical directory, even one that isn’t
a child of the parent vdir’s physical directory?)

The problem I ran into immediately with this approach was that some of the settings
in my ASPX web.config were incompatible with my web services. This is a problem because
many web.config settings are inherited by child virtual directories. My solution to
this was to create two children of /MyApplication - /MyApplication/UI and /MyApplication/WebServices.
Both children had to be proper virtual directories, and not just regular directories
underneath a virtual directory. This is because

1) The two physical directories in question don't share a common parent physical directory, and

2) Some of the things in my web.config file only work when the web.config file lives in a virtual directory root.
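Forms authentication is a concrete example of point 2: ASP.NET only honors an &lt;authentication&gt; section at machine or application level, so a web.config like the following (a hypothetical fragment, not my actual config) only works when it sits at a vdir root:

```xml
<!-- Must live at a virtual directory (application) root; in a plain
     subdirectory, ASP.NET rejects the <authentication> section. -->
<configuration>
  <system.web>
    <authentication mode="Forms">
      <forms loginUrl="Login.aspx" />
    </authentication>
    <authorization>
      <deny users="?" /> <!-- send anonymous users to the login page -->
    </authorization>
  </system.web>
</configuration>
```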

It was a bit annoying that I was going to have to type http://localhost/MyApplication/UI
instead of http://localhost/MyApplication to get to the ASPX pages, but it’s
a minor annoyance, and if it really bugs me I can fix it later with a server-side
remapping or a simple client-side redirect. So I mapped the parent virtual directory
off to a random empty directory on my hard drive, mapped the two child vdirs off to
the appropriate place, and went on coding.

Everything was going great until I tried to install the stuff on someone else’s
machine. I’d written a setup script, and although it seemed to run to completion,
the app didn’t work when I was done. It was failing in all sorts of weird ways,
doing things it just shouldn’t. Most troubling was when it would error out because
it couldn’t find the current username…because even though I had ASP.NET
Forms Authentication set up, it was never redirecting to the login page!

Ultimately, I found the answer through trial and error. It turns out that during setup, the check-out from source control wasn't creating the empty directory that the parent virtual directory was mapped to. And for whatever reason, if you have nested virtual directories and the parent directory does not exist, ASP.NET completely ignores the web.config files in the child virtual directories, leading to the highly confusing behavior I mentioned. The solution was simple: just make sure the parent vdir points to a valid physical directory, and everything returns to normal.

Wednesday, November 19, 2003

Keith's Book Feed

Keith Brown has his new security book online. And when I say “online”, I mean he’s actually writing the darn thing right there on his web page. Even better, the book has an RSS feed. Subscribed!

His last book was excellent – I’m sure this will be the same.

Monday, November 17, 2003

I Am the #1 Google Hit for "Build An Army of Giant Nuclear Powered Robots"

How cool is that? Evil supervillains of the world, eat my Google dust! :)

It’s this previous post, by the way.  Kudos to Rob Engberg for both inspiring the original post and pointing out the tasty Google goodness.

Who Needs an Architect?

I think this is a great, short read. It takes a crack at defining a taxonomy of the role of "architect" and does a passable job of explaining why the term is so nebulous to begin with. The dig at AOP at the end didn't hurt my opinion of the piece, either. ;)

The last company I worked at called pretty much everyone
an architect. That always seemed weird. Now I have a better idea why.

Saturday, November 15, 2003

ftpsync Alpha 1

I’ve just created a new GotDotNet workspace for ftpsync,
the tool I’ve been writing this weekend when I should have been doing work.
But you know how it goes – you sit down to update your webpage, you get tired
of trying to figure out which files you’ve changed since the last upload, you
look around for a free tool that does incremental FTP upload, and you don’t
find anything free and good. So you write one yourself…we’ve all been
there, right?

It turned out reasonably well, given that I only worked on it for a few hours, so I figured I'd throw it up on a workspace and see what happens. It needs better error handling, and I'm not super-happy about the FTP library I'm using - it saved me a ton of time, but it has a somewhat clunky interface and weird error handling. Of course, it works and it's free, so I'm not complaining too much.

Basic usage of the tool goes something like this:

ftpsync -s my.ftp.server.com -u myusername -p mypassword -ld C:\local\directory -rd /initial/remote/directory

This would cause the program to upload via FTP anything that lives in C:\local\directory
that either

1. Isn't present on the FTP server, or

2. Is present on the FTP server but is older than what's in the local directory

You can add the -r switch to recurse directories, and the -rd switch to delete anything on the remote server that isn't present in the local directory. I also threw in support for the ignore files (-if) and ignore directories (-id) switches. Oh, and a -debug switch that spews tons of extra info.
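The decision the tool makes per file is simple enough to sketch. This is a loose reconstruction in Python, not ftpsync's actual source (which is C#); `remote_mtimes` stands in for whatever the FTP library reports from a directory listing:

```python
import os

def should_upload(local_path, remote_mtimes):
    """Return True if a local file needs uploading.

    remote_mtimes maps remote file names to modification times; in the
    real tool this would come from an FTP directory listing.
    """
    name = os.path.basename(local_path)
    if name not in remote_mtimes:
        return True  # rule 1: not present on the FTP server
    # rule 2: present, but older than the local copy
    return os.path.getmtime(local_path) > remote_mtimes[name]
```

Everything else - recursion, deletes, the ignore lists - is bookkeeping around that one comparison.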

Anyway, I've already had success integrating this into an NAnt build that I run
to update this website, so I’m getting my money’s worth. It seems stable
enough, but if there are feature requests or bug reports, leave a note on the workspace…or
better yet, join and upgrade/fix it yourself ;) .

Care for a Danish?

The wonderful and amazing Frederic Gos has taken me up on my request for translations. The first few articles of my Direct3D series are now available in Danish. Thanks Frederic!

I've got a few other languages in the pipeline - these articles get a fair few hits a day, so if you're looking for more blog traffic and would like to do a translation, I'm more than happy to include a link to your website from the articles.

Command Shell Liberation

I've been on a bit of a quest lately, looking for a better command line experience. I tried bash for a while (under Cygwin), and that was pretty cool. It took me back to my Unix days, and is clearly a first-rate product. But I kept running into differences between Windows' idea of paths (e.g. C:\data\writing) and Cygwin's idea of a path (e.g. /c/data/writing). Personally, I like forward slashes better, but since I'm stuck with Windows pathnames for a lot of applications, I was sort of screwed.

During the course of thinking about the problem, I actually realized that I really have two requirements that I had been confusing:

1) A nice interactive shell for doing things like executing programs and listing directories.
2) A scripting language for automating repetitive tasks.

Shells like bash and tcsh address both features, but really, they’re two separate things, and a large part of the value comes from all the other little command-line tools that are part of something like Cygwin, rather than from the shell itself. Those are available separately, so I figured if I could find two different products to satisfy both needs, then I’d be set. I’m playing with Python to see what I think of it for requirement #2. C# is an option there, too.

To figure out what to do about my first requirement, I asked around a bit, and some hard-core developers over on the Off-Topic Mailing List pointed me to 4NT as a good replacement for the Windows command shell. I downloaded the trial and had to agree that it is indeed pretty darn cool. But I think I'm going to stick with cmd.exe. The big reason I'm even willing to contemplate that is this registry key:

HKEY_CURRENT_USER\Software\Microsoft\Command Processor\AutoRun

According to the documentation, if you list a file under this key, it will be run at the start of every cmd.exe session. Well, this is awesome. It means that I can set it to something like C:\home\cmdrc.cmd and then populate the file with something like this:

@echo Running cmdrc.cmd
@SET PATH=c:\bin;%PATH%
@SET PATH=c:\bin ant-;%PATH%
@call vsvars32.bat

Now, whenever I fire up cmd.exe, it sets my path appropriately, and executes the vsvars32.bat file, which does all sorts of environmental goodness to enable programs like the C# compiler from the command line. Best of all, when I add a new program that I want to be able to run from the command line, I just have to throw a new line into my cmdrc.cmd file, and restart any command shells I have open. This is waaaay better than having to muck around with the environment variables dialog box and having to kill explorer.exe. Plus, when I reinstall my system, it's just a matter of copying this file over to preserve all my hard-fought configuration.
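Setting the key doesn't require regedit, either - reg.exe (included with XP and 2003) can do it from a script. One line, assuming the same hypothetical C:\home\cmdrc.cmd location:

```bat
reg add "HKCU\Software\Microsoft\Command Processor" /v AutoRun /t REG_SZ /d "C:\home\cmdrc.cmd" /f
```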


Thursday, November 13, 2003

WSDL.EXE Replacement Sought

I'm running smack into two big limitations of WSDL.EXE,
the tool used to generate client-side proxies for web services in .NET. Specifically,
here are the problems:

1) WSDL.EXE relies on the same code as XSD.EXE to map
the XML types into programmatic types. Unfortunately, it generates types with public
fields rather than properties. This prohibits data binding. I'd like to change this.

2) If you run WSDL.EXE against two different WSDL documents with exactly the same
XSD type in them, it generates two programmatic types. That is, <foo/> turns
into NamespaceA.Foo and NamespaceB.Foo. This is a problem if you want to read a Foo
from web service A and pass it to web service B.
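For problem 1, the inelegant workaround I keep circling is to post-process the generated code before compiling it. A rough Python sketch of the idea - the regex assumes the simple "public Type Name;" field declarations the generator emits, and would need hardening before real use:

```python
import re

# Matches the simple public-field declarations the proxy generator emits,
# e.g. "public string CustomerName;" (an assumption about the output shape).
FIELD = re.compile(r"public\s+(?P<type>[\w.\[\]]+)\s+(?P<name>\w+);")

def field_to_property(decl):
    """Rewrite a public field into a private field plus a public property."""
    m = FIELD.search(decl)
    if m is None:
        return decl  # leave anything that isn't a simple field alone
    t, n = m.group("type"), m.group("name")
    return (
        "private %s _%s;\n" % (t, n)
        + "public %s %s {\n" % (t, n)
        + "    get { return _%s; }\n" % n
        + "    set { _%s = value; }\n" % n
        + "}"
    )

print(field_to_property("public string CustomerName;"))
```

Properties make the data-binding machinery happy, but running a regex over generated source every build is exactly the kind of inelegance I mean.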

Neither of these problems is insurmountable. The problem
is the solutions aren't elegant. Better than either would be for someone to tell me,
"Hey, you just need to download SuperWsdl.exe; it does everything you need." Unfortunately,
I'm not sure SuperWsdl.exe exists. I'd prefer not to write it myself.

Anyone got any advice? Leave a comment.

Copy and Paste Broken in IE?

So, is it just me, or is Copy and Paste acting up in IE
lately? I'm seeing this on multiple machines not even running the same OS, but half
the time when I copy something from a web page via Ctrl-C, it doesn't take. I have
to go back and copy it again.

It's enough to make a guy switch to another browser.

Wednesday, November 12, 2003


If you don't subscribe to Rory's blog, you should. I find it hilarious.

Rory, count me as a fan of yours.

Tuesday, November 11, 2003

Favorite "Beyond Fear" Quote So Far

I'm still reading Beyond Fear a bit at a time...it's still excellent. As a species we just don't get security, something that has become even more clear in the US in the last two years. My favorite quote so far:

I have a friend who has, on almost every one of the many flights he has taken since
9/11, presented his homemade “Martian League” photo ID at airport security
checkpoints – an ID that explicitly states that he is a “diplomatic official
of an alien government.” In the few instances when someone notices that he is
not showing an official ID, they simply ask for a valid driver’s license and
allow him to board without a second glance. When he noticed that the gate agents were
scrutinizing expiration dates on IDs, he simply made another version of his Martian
League card that included one.

Aside from being amusing, what’s interesting is that – in the context
of the book – the conclusion we draw isn’t, “Airport screeners are
stupid,” but rather, “Authentication is a hard problem, and is misapplied
and inappropriately used at airports.”

Monday, November 10, 2003

Developer Focus

Chris Brumme posts this interesting piece. My favorite quote:

In V1, ASP.NET hit
a home run by focusing like a laser beam on the developer experience.  Everyone
put so much effort into building apps, questioning why each step was necessary, and
refining the process.  It's great to see that they continue to follow that same
discipline.  In the drill-down sessions, over and over again I saw that focus
resulting in a near perfect experience for developers.  There
are some other teams, like Avalon, that seem to have a similar religion and are obtaining
similar results.  (Though Avalon desperately
needs some tools support.  Notepad is
fine for authoring XAML in demos, but I wouldn't want to build a real application
this way).

This really reinforces what I've been learning over the last year, as the amount of "real
code" I've been writing has skyrocketed from my previous mostly-research-and-sample-code
lifestyle. API design is tough, but when you get it right, it makes a world of difference. Don talks
about this on MSDN, where he speaks at some length about the value and danger of abstractions.
It's one of the main reasons I have hope for Indigo. And it's one of the main reasons
that I worry about WinFS: if the API is anything like the ADSI model (or the OLEDB model),
forget it: no one will ever use it. Those that do will hate it. But I've looked
at neither in depth, so the jury is still out.

Making Sense of Source Control

Most of my development work until recently has been either
without source control (gasp!) or using Visual SourceSafe. Well, for the last year
I've been doing a lot of development, and have unsurprisingly come to really like
source control. At home, I use CVS to do things like track changes to my website,
so I can roll back if I screw something up. At work, I use it for the usual purpose:
coordinating work on the same codebase with other developers.

One of my clients is currently evaluating a new source
code control system - Borland's StarTeam. It looks pretty nice, and there's a reasonable
chance they'll implement it. Since I'm doing prototype work for them, I'm using it
exclusively right now. Having experience with CVS and one or two other non-Visual
SourceSafe products, it wasn't too hard to get used to the model. But it's going to
be a stretch for some of their developers who haven't used a merge model source control
system before, the same way it hurt my brain a bit when I moved off of SourceSafe.

One of the things that I've found helpful is to picture
the various states a file can be in as a grid. The grid tracks the state of the file
in the repository versus the state of the file in the working directory. The file
can be in one of three states in each of these two places: unchanged, changed, or
not present. "Changed" and "unchanged" are relative to the file in the other location,
so "changed" on the repository axis means that the file in the repository has
changes relative to the file in the working directory (i.e. someone else has checked in a new
revision since you checked it out).

Because one of the things that differs between source
control systems is the terminology, the really valuable part of the grid (for me)
is the intersections, where I record the meaning of each of the possible combinations.
Here's what the grid for StarTeam looks like:

                             Working Directory

                         Unchanged      Changed       Not Present
                         -----------    ----------    -----------
             Unchanged | Current        Modified      Missing
Repository   Changed   | Out of date    Merge
             Not Pres  | Not in view    Not in view   N/A

It should be fairly straightforward to substitute the
terms from your source control system for the StarTeam ones. Note that this chart
is a simplification of what's really going on - it doesn't help you figure out branching,
for example - but I found it useful as a starting point, and hopefully this will help
someone avoid one or two of the mistakes I made while getting used to the new systems.

Friday, November 7, 2003

C# Direct3D Sprite Example

David Weller has a short example of using the Direct3DX Sprite class from C#. The example only works if
you have the Summer Update of the DirectX SDK, and even then I have had a few problems. Regardless,
I’m sure David will fix them soon, and the code alone is very handy to demonstrate
technique. You’ll find it particularly useful if you plan to write a 2D game
using Direct3D. Which I will,
when I get some free time.

First, I need to finish writing my slides for Swiss DevDays.

Thursday, November 6, 2003

Measure, then Optimize

I posted yesterday about this article, which talks about the performance of various operations in the CLR. I said it was a good article, and I still think it is. But Ian Griffiths wrote me to take issue with the fact that that's all I said - he felt that the article in and of itself does not actually tell you anything directly useful...and I agree.

Ian and I have both done more than our share of optimization, and we've both arrived at the same set of rules:

1. Don't do anything intentionally stupid when first writing the code, but 
2. Don't spend a lot of time trying to write really fast code up front. Instead,
3. Measure, then optimize the slowest thing.
4. Repeat until performance is good enough.

These rules will hardly be surprising to anyone that's successfully done performance improvement work. But they surprise the hell out of a lot of people nonetheless. "I thought writing fast applications was all about knowing which sorting algorithm to use and which data structure to pick?" Not really.
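As a sketch of rule 3, here's roughly how I go about timing a suspect region before deciding whether it's worth optimizing. (This assumes .NET 2.0's Stopwatch class; on 1.x you'd substitute DateTime.Now deltas or QueryPerformanceCounter via P/Invoke. The method names here are made up for illustration.)

```csharp
using System;
using System.Diagnostics;

class TimingSketch
{
    static void Main()
    {
        // Time the suspected hot spot over many iterations so the
        // measurement dominates the timer overhead.
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < 1000; i++)
        {
            DoSuspectedSlowThing();
        }
        sw.Stop();
        Console.WriteLine("1000 iterations took {0} ms", sw.ElapsedMilliseconds);
    }

    // Hypothetical stand-in for whatever code you suspect is slow.
    static void DoSuspectedSlowThing()
    {
    }
}
```

The point isn't the harness - a real profiler is better - it's that you get a number before you touch anything.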

I used to work at a mortgage company where I sat next to Jon, a good friend of mine. People would often come to me with a C++ program and tell me that it was too slow. The first question I would ask them is, "Have you talked to Jon yet?" When they said, "No," I'd tell them to go away. See, Jon used to work at Oracle, developing bits of the database, so he knew SQL up and down. He would routinely make queries that originally ran in half an hour, run in 30 seconds. As a pretty good C++ programmer, I could expect to decrease execution time by about 10%, or maybe 25% if I was really lucky. A sixty-fold performance improvement was out of my league...but Jon did it all the time.

As another example, while profiling a web service I've been writing, I found that the following line of code was the slowest thing in the app:

RSACryptoServiceProvider rsa = new RSACryptoServiceProvider(cp);

And I mean it was by far the slowest thing in the app - making this call less frequently had a significant impact on throughput. I don't remember what the timing on the call was exactly, but I never would have guessed that a constructor call could take as long as this did.
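The fix, once the measurement pointed at the constructor, was simply to build the provider less often. A hedged sketch of the pattern - the CspParameters setup and container name here are illustrative, not my actual code, and I'm ignoring thread-safety concerns for brevity:

```csharp
using System.Security.Cryptography;

class SigningHelper
{
    // Constructing RSACryptoServiceProvider was the measured hot spot,
    // so construct it once and reuse it rather than once per request.
    private static readonly RSACryptoServiceProvider rsa = CreateProvider();

    private static RSACryptoServiceProvider CreateProvider()
    {
        CspParameters cp = new CspParameters();
        cp.KeyContainerName = "MyAppKeys"; // hypothetical container name
        return new RSACryptoServiceProvider(cp);
    }

    public static byte[] Sign(byte[] data)
    {
        return rsa.SignData(data, new SHA1CryptoServiceProvider());
    }
}
```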

All this goes to explain why I claim that the article is useful, but not directly useful: because when it comes time to optimize, you have to measure the slowest thing and fix that. Anything else is a waste of time. And you'll probably be surprised by what the slowest thing is. And it's likely not going to be slow due to any of the things from the article - at the point where you're worrying about the performance of fields versus properties, you've almost certainly already optimized a whole bunch of other stuff that's going to be the dominant factor. But at the same time, knowing that the C# as operator is twice as slow as the cast operator in some situations might someday save you a few hours.

The life of a performance optimizer is a tough one: you have to know everything about how the app works, down to the silicon, since you never know what the bottleneck is going to be. (This is an outcome of the Law of Leaky Abstractions.) But since this is waaaay too much to keep in your head for even a trivial app these days, we need to just make our best guess when first writing the app, then measure to zoom in on the places where bad things are happening.

Oh, and don't forget step #4: stop optimizing when performance is good enough - you're just wasting money after that. Of course, that assumes that you can get a definition of "good enough" from the customer, but maybe that's a post for another time. ;)

Wednesday, November 5, 2003

Writing Faster Managed Code

Keith Brown pointed out this article, which is a great read. Very technical.

Beyond Fear

I started reading Bruce Schneier's Beyond Fear last night. It looks like it's going to be just great. He's an
excellent writer, and the material is so relevant. More importantly, it says a lot
of totally true stuff that is completely counter to conventional "wisdom".

So far, the biggest theme is "Develop a threat model",
or, if you like, "Know thy enemy." So often, people post questions on the mailing
lists I frequent that go something like, "Should I use encryption?" And the answer
is always, "Who are you trying to protect against?"

Unfortunately, answering the latter question is harder,
probably because it is often out of the programmer's control. If more managers, executives,
and end users would read Schneier's book (as well as the excellent Secrets
and Lies
), talking and making intelligent decisions about security would
become much easier. Order a copy for your boss today. ;)

Tuesday, November 4, 2003

TV's Customer Effect

By way of Tim, I read Ole Eichhorn’s rant about Longhorn. While I find it interesting and in places insightful, I think
he was a bit wide of the mark. The basic mistake he made is an extremely common
one: the assumption that all software is the same. Usually, this is coupled with “and
it’s the same as the software I write.”

I’ve fallen victim to this particular trap myself. One of the major foci in
my career has been the development of large-scale systems. As such, you’ll often
find me saying things like, “Extra database roundtrips are bad.” Or, “Avoid
building objects that map directly to tables and handle their own persistence.”
Both of these are good pieces of advice…if you’re building a system that
needs to scale. But of course, if you’re writing a system that’s going
to be used by ten people ever, you should ignore this and do what’s easy. And
lots of people are indeed writing these sorts of smaller systems, which is a totally
legitimate thing to do.

The keys to being a good architect are 1) knowing what the rules are, and 2) knowing why they
are the rules, so you know when to ignore them.

So, does Ole have a good point about performance? Yes…if it’s going to
impact you. But in point of fact not everyone is writing the next version of Excel.
There are plenty of applications for which developer productivity is more important
than an extra 10% (or whatever) in response time. And of course the jury’s still
out on what the performance penalty will be. And of course it’ll be different
for every application anyway. But in any event I’d argue that there are at least
ten times as many departmental one-off applications being written as there are commercial
shrink-wrap ones. Whether or not these are “important apps” (Ole’s
term) sort of depends on where you sit.

But Ole’s bit does raise the question, “Is Microsoft right to continue to
emphasize developer productivity?” I think it’s obvious that they are
– the CLR’s big win is clearly this, and it looks like the Longhorn technologies
continue largely in this vein. But you have to ask, “Is this good for the users,
who outnumber the developers by some large factor?”

I don’t know the answer to that. What I do know is that it reminds me of another
familiar market: commercial broadcasting, particularly TV. I often wonder at how bad
broadcast TV is. Sure, there are some shows I like, but a lot of it sucks. There are
two conclusions I can draw from this: 1) that most people have taste that differs
from mine, or 2) the viewer’s preferences are not the controlling factor. While
the former is probably true, it’s the latter factor that’s interesting
here, because from a commercial standpoint, it’s demonstrably true. Advertisers,
after all, pay for the programs, not viewers. So the broadcasters are directly beholden
to advertisers, but only indirectly beholden to viewers. And note that for
the networks where this is not true, like HBO and public television, we get a noticeably
different spectrum of programming.

In Microsoft’s case, we actually have the opposite correlation. The revenue
comes from products like the OS and like Office, and not from Visual Studio.NET and
the CLR. Sure, there’s a linkage – developers create the software that
users consume, making the platform more attractive to said users – but
again it’s indirect rather than direct. I’m not sure what
this means…but I find it interesting.

Monday, November 3, 2003

Translations Encouraged

If anyone is interested in translating any of the DirectX
(or other) writing on this website into different languages, please let me know. The
Direct3D tutorial in particular gets a fair amount of traffic, I've noticed referrers
from the Babelfish translation service, and have had requests for the content in other
languages before. This leads me to believe there might be a reasonable demand for
tutorial material in other languages.

I'd be more than happy to let you post the content on
your website if you like, as long as you let me post it here too, and would agree
to link back in your copy.

Email me directly or post a comment if you're interested.

Friday, October 31, 2003

MONAD - the Next Generation Command Line

I’ve been living on the command line for a while now, mostly in Cygwin’s
Bash shell. I like it, but it’s not perfect. Nothing is, of course, but maybe this will
be a bit closer.  

Monday, October 27, 2003

Sometimes It Makes You Proud to Be an Alumni

This CNN article talks about how some MIT students have figured out a legal way to
stream MP3s and cut the record companies out...fantastic! Just because
I don't like it when people copy music without paying for it doesn't mean that I side
with the RIAA. Quite the opposite. So the fact that they figured out how to have it
both ways makes me happy.

A Retraction

Recently, I mentioned that I thought the Microsoft Application Blocks were not mature. I cited a general documentation bug and a Configuration Management Application Block threading bug as key indicators that the software isn’t ready for prime time. And I said that I hadn’t heard back from them after reporting the bug.

Well, I’m going to eat my words on two of those: the threading bug doesn’t exist, and not only did the Patterns and Practices group get back to me, the email was written before I posted the blog entry…I just hadn’t checked that particular email inbox that day. And the team is aware of the documentation bug and plans to fix it.

For the record, the threading bug I thought I saw was related to a Hashtable maintained in the MemoryCache class. I noticed two things about it: First, that they weren’t using the Synchronized wrapper. Second that even though they were taking a lock on writes, they weren’t locking on reads. This is not, in general, safe to do. In particular, it’s not safe to enumerate over a Hashtable at the same time a write is occurring.

As it turns out, the CMAB implementation is safe because they never enumerate over the collection. And since the Hashtable is a private field, no one else can, either. Not using a Synchronized Hashtable is fine in their case because they do the lock on write (which is all the Synchronized wrapper does anyway). Locks on simple reads through the indexer are not necessary with a Hashtable – only on enumeration.
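To illustrate the distinction, here's a sketch of the pattern (this is my illustration, not the CMAB's actual code): a Hashtable is documented as safe for many readers plus one writer, so a cache that never enumerates only needs to serialize its writes.

```csharp
using System.Collections;

class SimpleMemoryCache
{
    // Private field, so no outside code can enumerate it - that's
    // what makes the lock-on-write-only strategy safe here.
    private Hashtable items = new Hashtable();

    public object Get(string key)
    {
        // No lock needed: indexer reads are safe alongside a
        // single writer. Enumerating would NOT be safe this way.
        return items[key];
    }

    public void Put(string key, object value)
    {
        // Writes must be serialized - essentially what the
        // Hashtable.Synchronized wrapper would do for us.
        lock (items.SyncRoot)
        {
            items[key] = value;
        }
    }
}
```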

On top of all this, the response from the Patterns and Practices team was really good. I think I got about four phone calls and twenty or so emails from Microsoft people in response to my feedback. And the dev lead himself walked through the code, took the time to verify that the problem I thought I saw wasn’t there, and let me know…over the weekend. Kudos!

Now I just have to figure out if it’s worth it for me to go in and change my code around to use the Configuration Management Application Block again. I wrote my own version that I’m employing, and of course it’s slightly different, and tailored specifically to my view of the configuration management problem. Microsoft’s stuff is obviously superior in many respects, as it has been more carefully architected and more thoroughly tested…and they’ve clearly got a great team backing it up. The only thing stopping me from leveraging their stuff is that what I’ve got now works. I think. :)

Friday, October 24, 2003

C# 2.0 Spec

Niels reports that the C# 2.0 (Whidbey) spec is now online here. Guess I need to find some time this weekend!

Wednesday, October 22, 2003

Microsoft Application Blocks Not Mature

So, I’ve spent some time over the past couple of weeks looking at Microsoft’s
Application Blocks. If you’re not familiar with these, you should know that
they’re basically software written by the Patterns
& Practices group
that’s aimed at solving common problems. For example,
the Data
Access Application Block
is a set of helper classes aimed at making data access
easier. They’re generally the sort of thing that you write yourself after you’ve
written the same two hundred lines of code three or four times. Only now Microsoft
writes it for you. They even give you the source code.

Unfortunately, as much as I hate to say it, I think these aren’t quite ready
for prime time. I really wanted to like them, not least because I had a nice conversation
with Ron Jacobs from the Patterns & Practices group about them, and he seemed
like a smart, helpful guy. But there are a bunch of little things that just tell me
I’m not likely to get much use from them. Like:

- The Data Access Application Block currently only works with SQL Server. You can get bits from gotdotnet that make it work with Oracle, but it doesn’t do anything to make the differences between the two databases go away. Not that I necessarily think they could – but at that point, what value am I getting from the code?

- The Exception Management Application Block only does some of what the Enterprise Instrumentation Framework does. Granted, it was written sort of simultaneously with EIF, but again, where’s the value if there’s something better available?

The really telling ones, though, are things like:

- The documentation contains factual errors, claiming at one point that the CLR will load the assembly with the highest build and revision number where the major and minor version numbers match the requested version. This was true briefly in one of the betas, but has never been true in the released CLR.

- The Configuration Management Application Block – which at first looked to be one of the most useful of the bunch – contains some fairly rookie-type threading bugs.

These last two just kill me. If I’m really going to use this code in my products,
it has to inspire confidence that it was written by someone who knows the problem
domain at least as well as I do. Demonstrating a misunderstanding of the platform
and of basic threading principles doesn’t achieve that. I emailed the team about
the thread bug, hoping to hear back from them, but it’s been over a week and
nary a word, so here I blog.

That said, I should point out four things. First, that these blocks are the group’s
first attempt, and no one gets anything right on the first try. I expect their next
version will be significantly better. Second, that I didn’t get elbow-deep into
all the blocks, so I might be missing a cool one. Third, that what doesn’t work
for me might work for you. And finally, that the group produces a lot more than just
these blocks, including some fairly decent whitepapers on things like security and team
development with VS.NET
– all stuff I’ve heard my clients beg Microsoft
to do for years. So they’re moving in the right direction.

Friday, October 17, 2003

New Direct3D Tutorial Available - Dealing With Device Loss

 The Direct3D device can become lost due to a variety of events - screen savers kicking in, computers being locked, or machines going into standby. In this tutorial, we'll see how to gracefully deal with these events.

This Weblog Unavailable Saturday and Sunday

Because of a scheduled outage by the power company that services the DevelopMentor offices in California, this weblog (and all other staff.develop.com resources) will be unavailable from 11PM PST, Saturday, October 18th, 2003 through about noon PST on Sunday the 19th.

Note that this outage will not affect the main DevelopMentor website, nor will it affect the discuss.develop.com mailing lists.

DataSet Computed Columns

My client asked me about a slightly tricky problem the other day. I thought I’d share the answer here, since I thought it was fairly clever.

The problem stems from the fact that they have a database that has a table with columns that can hold values like ‘Y’, ‘N’, ‘0’, or ‘1’ to indicate true/false. Yes, they know this was probably not the best way to do it. No, they can’t change it, because there’s still a ton of existing code that both reads and writes this data. So the question was, “When writing new code, how do I get this to show up as a Boolean value in the DataSet?”

My first thought was to simply take care of the conversion in the SQL. If we knew which database we were using, we could fairly easily write our query to simply turn ‘Y’ and ‘1’ into true, and everything else into false. Unfortunately, the application has to work against both SQL Server and Oracle. While I actually have a way to deal with the differences in SQL between the two (more on that another time), I was still hoping to find some magic in the System.Data stuff that would take care of it for me.

Bob Beauchemin – database guru for DevelopMentor and all-around nice guy – turned me on to a nice way to handle it. He pointed out that the DataSet has the ability to create derived columns, where the value of the new column is computed using the value of existing columns and/or some simple operators. Perfect! A quick test showed that the code for creating my Boolean column looks something like this:

// Retrieves a DataSet from the database that has a Y/N/0/1 column called tfCol
DataSet ds = GetDataSet();

DataColumn dc2 = new DataColumn("derivedCol", typeof(bool), "(tfCol = 'Y') OR (tfCol = '1')");

// Attach the computed column to the table so every row evaluates it
ds.Tables[0].Columns.Add(dc2);

That’s really all there is to it – every row will now have a Boolean column that will be true when the original column is ‘Y’ or ‘1’, and false otherwise.


The important part is the expression in the new DataColumn (which I’ve called derivedCol): "(tfCol = 'Y') OR (tfCol = '1')". Notice that I have the ability to use comparison (=) and logical (OR) operators. There are a bunch more things you can do, too. They’re listed under the DataColumn.Expression property in the docs.


One other cool idea that Bob had was that I could suppress the original column (the one with the Y/N values) from the serialized representation of the DataSet by also setting the ColumnMapping property of the DataColumn to MappingType.Hidden, like so:


ds.Tables[0].Columns["tfCol"].ColumnMapping = MappingType.Hidden;


If I do this, when serializing the DataSet to XML – as happens when returning it from a WebMethod call – the tfCol column will not be included in the data at all…but my derived column will. Which means that this is a fairly reasonable and easy way to clean up the data before sending it off to the client.
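For anyone who wants to try this without a database, here's a minimal, self-contained sketch of the whole pattern - table and column names are illustrative, standing in for whatever GetDataSet() would return:

```csharp
using System;
using System.Data;

class DerivedColumnDemo
{
    static void Main()
    {
        // Stand-in for the table that comes back from the database.
        DataTable t = new DataTable("Flags");
        t.Columns.Add("tfCol", typeof(string));
        t.Rows.Add(new object[] { "Y" });
        t.Rows.Add(new object[] { "0" });

        // Computed Boolean column driven by the expression.
        DataColumn derived = new DataColumn(
            "derivedCol", typeof(bool), "(tfCol = 'Y') OR (tfCol = '1')");
        t.Columns.Add(derived);

        // Hide the original column from the serialized XML.
        t.Columns["tfCol"].ColumnMapping = MappingType.Hidden;

        foreach (DataRow row in t.Rows)
        {
            Console.WriteLine(row["derivedCol"]); // True, then False
        }
    }
}
```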


Wednesday, October 15, 2003

Web Of Trust - Blog as Authentication

I ran across Thawte’s Web of Trust page the other day. It’s an interesting idea, and appealing to a cheap bastard like me since it’s free. And as email signing has been suggested as a technology that could help combat spam, it appears to be worth looking into.

The part I don’t like, though, is that I have to appear before a Notary in person. See, I feel I have a fairly strong online identity – most people reading this probably feel pretty confident that I am in fact Craig Andera, who works at DevelopMentor and maintains this blog. So I’d like the opportunity to be awarded a cert based on readers of this web log vouching for my identity. Basically, I’m viewing my blog as an authentication mechanism – a sort of voiceprint if you will.

Now, I certainly understand why they haven’t set it up this way – in general blogs are a pretty weak form of authentication, and anyone could claim to be someone else by setting up a weblog. But since I’m lazy and selfish, I just think about my own particular case. The fact that I’m on the develop.com domain and have been writing about generally the same things for months now could in theory act as a reasonable assurance of my identity.

Monday, October 13, 2003

Where are the Good Books?

I don’t claim to have read all the DirectX books on the market. These days there’s
such a flood of material that reading everything on any topic would be challenging.
But my DirectX bookshelf has a fair number of books on it now…and none of them
are very good. Most of them are barely any good at all. The one I’ve read most
recently - The
Microsoft DirectX 9 Programmable Graphics Pipeline
– was no exception. And
I had such high hopes for it while I was checking it out in the bookstore.

The big problem I have with most books (true particularly for DirectX, but true in
general) is that they go too far in one direction or the other. Either they have almost
no information in them, or they assume that you already know the topic. The worst
are introductory texts that take an academic approach. If I wanted an academic approach,
I’d read the docs or the spec. If I want introductory, then all the formalism
gets in the way. I reviewed a book recently that seemed to have exactly this problem
– it was only a step better than a BNF description of the thing it was talking
about. It just used English instead of { | and }.

Ultimately, I want something that augments the specification/documentation, not replaces
it. A good book has to have the primary goal of setting up a functional model in my
head, so I can figure out on my own why something might work the way it does
when I encounter it in the API. It doesn’t even matter if the model is completely
accurate if the text is introductory – a good analogy that I can ditch later
when I have a more complete understanding of the domain is still extremely helpful
even when it isn’t 100% accurate.

This is how I learn, and therefore this is how I try to teach and how I try to write.
Of course, different people learn different ways, but I just don’t believe that
many are well-served by things that read like my college textbooks.

Saturday, October 11, 2003

Joel on Unicode

Joel posts another great one, this time on the basics of Unicode. He rightly points out that every developer should know this stuff.
But you all know this already, because you all read Joel, right? Right!?!? 

Friday, October 10, 2003

The Wedding Toast

I had heard them talking about doing this, but seeing it is hilarious.

If you’re wondering why this guy rates a video from Chris and Don at his wedding…it’s
just because he asked. :)

Wednesday, October 8, 2003

XmlSerializer as XPath

You've probably read a lot by now about the tension between the OO purist view of the world and the XML purist view of the world. You might also have picked up that I personally find myself more at the XML end of the spectrum, due in no small part to the influence of Tim Ewald.

One day not too long ago, Tim rang me up to tell me about a revelation he had had. He used to hate XmlSerializer because it embodies the fallacy that there's an isomorphism between objects and XML. But he explained to me that if you looked at it in a different way - simply as a convenient mechanism for populating a set of variables - that it wasn't so bad. The idea being that in order to code against the XML, you're going to have to find a way to pull the data out and get it into variables. XPath is one way to do that, but so is XmlSerializer. I generally followed his argument at the time, but it didn't really sink in until recently.
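A quick sketch of what Tim means (the type and element names here are made up for illustration): instead of pulling values out one at a time with XPath, you let XmlSerializer pour the document into a set of variables, and the end state is the same.

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// The class is just a bag of variables shaped like the XML we
// want to consume - no claim of a deeper object/XML isomorphism.
public class ServerInfo
{
    public string Host;
    public int Port;
}

class Demo
{
    static void Main()
    {
        string xml =
            "<ServerInfo><Host>example.com</Host><Port>8080</Port></ServerInfo>";

        XmlSerializer ser = new XmlSerializer(typeof(ServerInfo));
        ServerInfo info = (ServerInfo)ser.Deserialize(new StringReader(xml));

        // Same result as selecting /ServerInfo/Host and
        // /ServerInfo/Port with XPath and converting by hand.
        Console.WriteLine("{0}:{1}", info.Host, info.Port); // example.com:8080
    }
}
```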

I was trying to rewrite some of the configuration code I'd done for one of my clients. The issue was that we have a whole bunch of web projects, and having to make changes in 17 different web.config files was annoying. So I wanted to set it up so we could simply point to a single config file in a well-known directory.

I started off using a modified version of my XmlSerializer configuration section handler. But as I was looking at it, I realized that there was a better way. As it stands, the section handler expected XML in the config file that looks like this:



  <!-- configSections element goes here -->  


  <MyStuff type="SomeNamespace.MyStuff, CraigsConfig">


Saturday, October 4, 2003

Going to Switzerland

It’s great when your hobbies provide you with nice fringe benefits. For example, it’s no surprise to anyone that reads this blog that I’ve been studying DirectX in my spare time and writing up what I learn in the hopes that it’ll help someone else. Well, Sascha Corti saw my writings and has invited me to speak at DevDays in Zurich, Switzerland. I, of course, accepted.

I’m looking forward to meeting any of you that might be there.

BTW, I’m still working on my Managed Direct3D series, but I’ve been busy and the topic I’m currently writing about (how to deal with rude interruptions like screensavers kicking in) is making me do a bit more research than I had planned. I like to try to know what I’m talking about before I commit the ideas to paper. ;)

Wednesday, October 1, 2003

MIT Open CourseWare: Resource for Super-Villains

So MIT has this cool project where they are trying to make as much as possible of
their course material available, for free, on the web. There are even some videotaped
lectures up there. Find it here. They’ve got
over 500 courses up there now, although the quality and amount of material varies
greatly from course to course.

I was talking to Rob Engberg, a friend of mine, and he pointed out the following sequence
of courses that are available:

3.43J Integrated Microelectronic Devices
Fall 2002

22.312 Engineering of Nuclear Reactors Fall

2.75 Precision Machine Design Fall 2001

6.034 Artificial Intelligence Fall 2002


In his words: Yes...Everything a good Super-Villain needs to build an army of giant




Thursday, September 25, 2003

Java as SUV

Via Don
Box's Spoutlet
, I read about Philip Greenspun’s take on how
Java is the SUV of Programming Tools
. I find it interesting for a number of reasons:

1) I’ve taught short courses at about 15 universities around the country, including MIT. There was a huge range of general competence amongst the undergrads I encountered. MIT had the best students I ran into, although Stanford and one or two others were also excellent. To hear Philip saying that the MIT students had a hard time with a technology does not bode well for its general applicability at other schools. You may want to take that with a grain of salt coming from me, though, because:

2) I went to MIT. :)

3) From the outside (of Java), Java looks almost exactly like C#/.NET to me – someone pointing out differences in productivity is a bit surprising.

4) This is far from the first time I’ve heard someone describe LISP as an advanced language. Given that I’ve been doing a bit of Emacs hacking lately, I might have to move really learning LISP well up on my TODO list.


Wednesday, September 24, 2003

Trillian Won't Break, MSN Still Free, and Craig Still Clueless

So. Trillian has been updated - the latest version of Trillian and the patches for version 1.0 fix the issue with MSN. I still think it's an interesting shift in the landscape, but I guess I should research a bit more carefully before posting next time.

MSN IM Still Free, But Trillian May Break

Several readers pointed out to me that the Washington Post article refers to MSN Chat, not MSN IM, a completely separate service. Color me unaware - thanks to those that set me straight. But my memory was correct - there is an unrelated move to block Trillian from the MSN network: http://www.infoworld.com/article/03/08/28/HNmicrosoftim_1.html

MSN IM No Longer Free?

A sharp-eyed friend of mine spotted an article
in the Washington Post (available here). It looks like MSFT will be limiting
use of MSN Instant Messenger. What I can't figure out is how this
affects me - if I have to start paying for it,
I'd rather use AIM or ICQ, which are still free AFAIK. Especially
since I use
Trillian, since that means my UI won't change at
all. But if all
the people I want to talk to are going to be on MSN IM, then that makes it a harder choice...

Does anyone know more about this?

Here's the article text:

Wednesday, September 24, 2003; Page E02

Microsoft said it is shutting down Internet chat services in most countries outside the United States and limiting U.S. service to help reduce criminal solicitations of children through the online discussions. The changes will take effect Oct. 14, Microsoft said. MSN will require U.S. users of its chat service to subscribe to at least one other paid MSN service. That way, the company will have credit card numbers to make it easier to track down users who violate MSN's terms of use.

Monday, September 22, 2003

DirectX SDK Summer Update Available

The DirectX SDK Summer Update is available now here…in
theory. I haven’t been able to download it yet. I’ll keep trying.

Supposedly, the managed docs got an update. They could certainly use it – there’s
basically nothing right now.

Thursday, September 18, 2003

Isabel Passes Us By

I live near Washington DC, so we were right in the path of the hurricane that came
through last night. I wound up staying home from work on Thursday because I had to
help my wife make a few preparations (trimming branches that were close to windows,
for example) and I didn’t want to get caught out at work if things got nasty.

Fortunately for us, things didn’t get nasty at our house. The power bounced
for a few seconds four or five times, but that was about it. I even felt comfortable
driving out later in the evening to return some movies to the video store. Along the
way, I saw a lot of lights out and two huge flashes that I think must have been transformers
exploding – I think close to two million people in the area are without power
right now. Yikes! Guess we were lucky on that count. Hope all of you were similarly
lucky.

Wednesday, September 17, 2003

MSDN TV: Rebuilding MSDN with Angle Brackets, Pt. 1

I just noticed that the following is on MSDN TV. If you want to know what I’ve
been up to over at MSDN, this is it: we’re implementing Tim’s vision.
It’s been a blast, and has fundamentally changed the way I think about XML-based
publishing.

Tim Ewald tours MSDN's new XML pipeline and examines how we use schemas, XPath, and
XSLT to implement logic declaratively. He concludes with a look at our next steps,
including more generic plumbing and exposing XML content through RSS and Web services.
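To give a flavor of what "implementing logic declaratively" with XPath looks like, here's a minimal sketch. The element names, sample document, and rendering rules are all hypothetical, not MSDN's actual schema or pipeline; Python's standard-library ElementTree (which supports a subset of XPath via findall) stands in for a real XSLT processor.

```python
import xml.etree.ElementTree as ET

# Hypothetical source document - illustrative only, not MSDN's real schema.
doc = ET.fromstring("""
<article>
  <title>Rebuilding MSDN with Angle Brackets</title>
  <section><heading>Pipeline</heading><para>Schemas validate input.</para></section>
  <section><heading>Next Steps</heading><para>Expose content via RSS.</para></section>
</article>
""")

# The declarative part: a table of (XPath-style selection, rendering function)
# pairs, rather than imperative traversal code. Adding a new output rule means
# adding a row here, not editing a tree-walking loop.
rules = [
    ("title",           lambda e: "<h1>%s</h1>" % e.text),
    ("section/heading", lambda e: "<h2>%s</h2>" % e.text),
    ("section/para",    lambda e: "<p>%s</p>" % e.text),
]

# Apply each rule to every node its path selects. Note this emits output
# grouped by rule, not in document order - a real XSLT engine matches
# templates against nodes in document order instead.
html = []
for path, render in rules:
    for elem in doc.findall(path):
        html.append(render(elem))

print("\n".join(html))
```

The payoff of this style is the same one the talk describes: the transformation logic lives in data (paths paired with templates), so the pipeline can be reconfigured without rewriting traversal code.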

Just Published