Saturday, December 31, 2005

Looking Forward to 2006, Looking Back on 2005

As is my custom, I like to take a look back at the end of the year and check out what happened. I'm bringing y'all along for the ride.

 

Reviewing my blog, I see that I had one goal for 2005: ship FlexWikiPad 1.0. Well, that didn't happen. In fact, I didn't even do any significant work on it. Why not? Because I managed to push a couple of major projects onto the stack on top of it (bet you've never done that ;). For one thing, it seems I have to write a whole new text editor control (something I also didn't finish this year). But more significantly, I decided to completely overhaul the FlexWiki internal architecture to make it possible to implement security in a sane way.

 

Of course, I didn't finish that, either, but then again I didn't expect to. It's a pretty big job to do in my spare time, and along the way, it morphed into a decision to port FlexWiki to .NET 2.0. That is actually coming along fairly well, especially now that I try to spend an hour every day working on it. With luck, I'll have something at alpha status in another month or two.

 

It wasn't all FlexWiki work this past year, though. I had paying clients, too. My work at Integic redesigning and updating their main product has, after three years, reached code complete status. That's fairly exciting for me, because as a consultant I rarely get to see a project through from the very beginning. Even more interestingly, it looks like I'll be involved in working on the next phase as well, which means I'll get to see if some of our choices wind up helping us out like we hoped they would.

 

My other work in 2005 is still ongoing. I'm developing Pluralsight's new architecture course, for one. The research for that has been very educational, as I knew it would be. Hopefully I'll be able to pull the rest of the class together over the next few months.

 

Of course, my work on MSDN2 continues as well, but I can't say much about what I'm working on just yet. When we get it in place, you'll definitely hear about it, both here and elsewhere. I think it's something that readers of this blog will find interesting.

 

As you can see, 2006 is looking to be every bit as busy as 2005 was…and then some. I'm just glad my wife will be graduating from her MBA program in the spring.

 

I'll close with a list of my favorite/most significant events/posts of the year:

 

January


 

February


 

March


 

April


 

May


 

June


 

July


 

August


  • I became a whore for rent. You should see the Google searches that post gets. :)

 

September


 

October


 

November


  • November was not a great month. Could have been worse, though. A lot worse.

  • I ran into this problem, which would have bitten me several times since if I hadn't gotten help from one of my smarter (specifically: smarter than me) friends.

 

December

Friday, December 16, 2005

sqlcmd.exe -v Rocks

I'm the first to admit that I'm no database guru, but I occasionally have need to do something beyond just a simple SELECT, and I can usually manage to fumble my way through it.

 

The other day, the thing I was fumbling my way through was restoring a database from a backup a client had sent me. While I was doing it, I ran across a neat little option to sqlcmd.exe that I hadn't seen before, but which I definitely want to remember. It's the -v option, and it lets you pass parameters to your SQL script. In my case, I wanted to pass the current directory, so I could restore the database to files in whatever directory I happened to be running. Well, putting this in a .cmd file does the trick:

 

sqlcmd -E -i restoredb.sql -v root="%CD%"

 

Then I can use the root variable in my SQL script. All I have to do is reference it with the $(root) syntax, like this:

 

RESTORE DATABASE MyDB
    FROM DISK = '$(root)\mydb.bak'
    WITH REPLACE,
    MOVE 'mydb_data' TO '$(root)\mydb.mdf'
GO

 

Nifty, eh? I know I'm far from the first one to "discover" this, but maybe it'll help someone who hasn't seen it before.

Thursday, December 15, 2005

Using a Custom WSDL File in ASP.NET Web Services

Some of the work I'm doing right now involves writing a web service that has custom WSDL. By "custom" I mean "written by hand, not generated by the ASP.NET web services infrastructure". In this case, it's because we've got a bunch of external schema types that we'd like to use in a certain way, but you might want to do the same thing for a variety of reasons, including just not liking the way the generated WSDL looks.

 

As it turns out, using an external WSDL is fairly easy. It's just a little obscure - I had a hard time tracking down how to do it using Google, so I'm documenting the process here. Here are the steps required:

 

Write a custom WSDL file.

Use the tool of your choice. The one thing you're going to want to do is omit the <wsdl:service> declaration. This bit will actually be generated by the ASP.NET infrastructure. That makes sense, since it contains a URL, which is more easily determined at runtime.

 

Drop the WSDL in the web directory next to your .asmx file.

Presumably, you still want people to be able to download your custom WSDL, so putting it somewhere web-accessible is a good idea.

 

Mark your web service implementation class with the [WebServiceBinding] attribute.

For instance, you might do something like this:

 

[WebServiceBinding(Name = "MyBinding", Location = "MyCustom.wsdl")]
public class MyServiceImplementation : WebService
{
    // ...
}

 

The most important thing to set here is the Location property, which tells ASP.NET where your custom WSDL file lives. The Name property gives the name of the <wsdl:binding> from your custom WSDL, and is needed by ASP.NET to allow it to dispatch calls onto your implementation correctly.

 

Mark your web method with the [SoapDocumentMethod] attribute.

For example:

 

[WebMethod]
[SoapDocumentMethod(Action = "urn:foo-com:service/DoSomething",
    Binding = "MyBinding")]
public DoSomethingResponse DoSomething(DoSomethingRequest request)
{
    // ...
}

 

The Action property associates this method with the action listed in your WSDL file. It is particularly important because ASP.NET chooses which method to invoke based on the action in the incoming message.

 

The Binding property associates the operation with a particular <wsdl:binding>, in much the same way that the Name property of the [WebServiceBinding] attribute does. It's required to be on the [SoapDocumentMethod] attribute for proper operation, but I still haven't quite figured out why you need both.

 

Observe the generated WSDL.

If you surf to the automatically-generated WSDL page (e.g. http://server/path/to/service.asmx?wsdl), you’ll observe that ASP.NET is still generating a WSDL document for you. However, because of the changes you’ve made, that WSDL consists solely of a <wsdl:import> referencing the WSDL file you wrote, and a <wsdl:service> element containing the URL for the service. It’ll look something like this:

 

<wsdl:definitions targetNamespace="urn:foo-com:service">

  <wsdl:import namespace="" location="MyCustom.wsdl"/>

  <wsdl:types/>

  <wsdl:service name="MyService">
    <wsdl:port name="MyBinding" binding="tns:MyBinding">
      <soap:address location="http://localhost/MyService/MyService.asmx"/>
    </wsdl:port>
    <wsdl:port name="MyBinding1" binding="tns:GetContentBinding">
      <soap12:address location="http://localhost/MyService/MyService.asmx"/>
    </wsdl:port>
  </wsdl:service>

</wsdl:definitions>

 

Luckily, Add Web Reference respects the <wsdl:import> and (assuming you write the WSDL correctly), .NET clients will be able to generate code off of this document successfully. Other toolkits should be able to as well, but YMMV.

 

Particularly when coupled with the flexibility that IXmlSerializable gives you over the serialized XML, taking control of the WSDL is a very powerful technique.

Friday, December 9, 2005

C# Type Conversion Operators Considered Harmful

I see Fritz just posted a code sample that makes use of C# conversion operators. I don't like it. No sir, I don't like it one bit. :)

 

Here's the deal. A conversion operator is just a way of writing a method that you invoke using the C# casting syntax. The thing is, since it's a method, why not just make it a method? If I were writing this code, I'd call it ToUpdateItem(), or perhaps emphasize its factory nature by calling it something like CreateFromUpdateItem.
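
For illustration, here's a rough sketch of the contrast (the Post/UpdateItem types and members are made up for this example, not Fritz's actual code):

// Hypothetical types, for illustration only.
public class UpdateItem
{
    public string Title;
}

public class Post
{
    public string Title;

    // Conversion-operator approach: invoked with cast syntax.
    public static explicit operator UpdateItem(Post post)
    {
        UpdateItem item = new UpdateItem();
        item.Title = post.Title;
        return item;
    }

    // Plain-method approach: the name says exactly what happens.
    public UpdateItem ToUpdateItem()
    {
        UpdateItem item = new UpdateItem();
        item.Title = Title;
        return item;
    }
}

// At the call site:
//   UpdateItem a = (UpdateItem)post;    // looks like a cast, but creates a new object
//   UpdateItem b = post.ToUpdateItem(); // obviously a factory call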

 

There are two problems with the type conversion operator approach as I see it. First, you're providing something that implies type compatibility, when really no such type relationship exists. Second, the conversion operator syntax implies that no new instance is created, when in fact a new object instance is created every time.

 

With a plain ol' method, people who maintain your code can see exactly what's going on. With a type conversion operator, you're just twisting the language syntax to make it look like one thing is happening, when really something else is.

 

A good rule of thumb is that if you see yourself typing "public static operator", you're doing something wrong. :)

Friday, December 2, 2005

FlexWiki 1.8.0.1696 Released

I just posted the latest version of FlexWiki over on SourceForge. You can download it here. Here are the release notes:

 

This release is primarily a bugfix release, although several minor
features have been added. Basically, the fixes were piling up, and we
were telling people "go download the latest interim build" often
enough that we decided just to make it official. Highlights of this
build include:

 

* At long last the death of the "incorrect wiki links" bug, where
  FlexWiki emitted links that corresponded to the first hit after
  startup. This most frequently manifested itself as links pointing to
  "localhost" even when the web server was accessed remotely.
* Null edits are now ignored.
* Slight performance enhancements, including the ability to turn off
  performance counters (which can cause large delays on some
  installations) via the DisablePerformanceCounters switch.
* Form support in WikiTalk

 

And much more.

 

Thanks to everyone that contributed, both those who wrote code and
those who reported bugs!

 

Please direct inquiries about FlexWiki to the FlexWiki users mailing
list at
flexwiki-users@lists.sf.net.

I definitely recommend that you upgrade to this release if you're currently running 1.8.0.1677.

 

I'm guessing this will be the last official release of FlexWiki before 2.0. (You can always get unsupported interim releases from http://builds.flexwiki.com.) I've been working away in my spare (ha!) time on gutting and rearchitecting the content engine, and I've been making slow but steady progress. It'll probably be a few months before I get it to alpha status, but once there I should be able to start making rapid progress on implementing security and some caching improvements.

Sunday, November 27, 2005

Triple Whammy

It's been a bit quiet around here, but there's been good reason. As you know, my friend Alex got hit by a car. The next weekend, I sliced my thumb up with a table saw. Well, the weekend after that, the worst of the lot happened: I got a call from one of my brothers telling me that my dad had lost consciousness while returning from a vacation.

 

They say things happen in threes, and I certainly hope that's true, because after this one I'm ready for a break from hospitals. It turns out that my dad had an abdominal aortic aneurysm (a "triple A"). Briefly, this means that the large artery that leads downward from his heart had weakened and swelled. I saw the CT scan at one point: it was roughly the size of an American football.

 

My dad got lucky on several counts. To start with, he'd just traded places with my mom when he passed out, so he wasn't driving. That could easily have killed them both. On top of that, even though they were in northern Minnesota (which is basically the middle of nowhere), they happened to be near a hospital, so the ambulance was able to get him to a doctor quickly. And the doctor was good enough to get the diagnosis right and immediately put my dad on a helicopter for a hospital in Duluth. And the vascular surgeon in Duluth managed to repair my dad's artery, saving his life. Dad's blood pressure dropped to zero at one point, so it was a near thing.

 

The whole family descended on Duluth, of course. Dad was in the ICU for a week, but has since been transferred to the cardiac recovery unit. He's still in Duluth, but the rest of the family has started returning to their homes now that the largest dangers are past. We're all hoping that he'll be home by Christmas when we all return, although it's no sure thing.

 

One of the worst parts about the whole process was that my dad was so completely looped by the drugs and the shock and the sleep deprivation that he was basically delirious until about nine days after the surgery. And it wasn't until about eight days out that anyone was able to tell us it wasn't a stroke. It was pretty hard on my mom to hear my dad basically babbling nonsense about how he wasn't really in a hospital. Fortunately, sleep and changes of medication fixed that up and dad is now as sharp as ever.  

 

We spent the Thanksgiving holiday there in a hotel. My wife and my brother did a bang-up job of whipping up a first-rate giant meal using only a microwave and what they could find at the grocery store on Thursday afternoon. Still, I can certainly think of better ways to spend the holiday. But I can much more easily think of worse ways.

 

Here's hoping my next post talks about something boring like what I'm working on over at MSDN.

Thursday, November 10, 2005

Don't Argue With Power Tools

I think maybe the hospital I live near should build a tunnel from my basement to the ER. It's only about two blocks anyway, and I've walked over there four times in the last two weeks. The first three times were to visit my friend Alex, who got hit by a car. This latest time, though, was all me. Nothing quite so serious, but nonetheless not an experience I hope ever to repeat.

 

I was in my basement on Sunday, working on building a bench for my new planer. I've been buying a lot of tools lately, and it's starting to get a bit crowded down there, so I figured I'd work on some storage. So I was running a 2x4 through the table saw to cut it down to size.

 

I'm not entirely sure what happened next. My memory is a bit hazy for those few seconds. But I think I was concentrating on keeping one of my hands away from the blade, and didn't pay quite enough attention to the other. What I do remember pretty clearly is jumping away from the saw, yelling loudly, and clutching my thumb.

 

Before I go too much further, let me say that I did not sever any digits. I gave myself a really nasty cut across the pad of my thumb, but I didn't slice anything completely off. I'd say it's about a five or six on a scale of one to ten, where one is a paper cut, and ten is the aforementioned digit removal. The nurse at the ER (where I spent the next four hours) gave me an "A+", though. :p

 

Here's a picture of the rather nifty bandage they gave me.

 


 

They had to put it on with a special tool - it was sort of a cage that stretched a tubular piece of gauze out, and they fed it up and down, creating layers. Very clever, I thought.

 

If you want to see what it looks like underneath the bandage, well, what's wrong with you?!? :) But click here and check it out if you must. Warning to those with a weak stomach: it's gross. Warning to those with a highly developed photoaesthetic sense: the picture is poorly taken.

 

Anyway, it looks like I'll heal up in a few weeks. Right now, I can't feel the tip of my thumb, but I'm hoping that will come back. In the meantime, it hasn't interfered with my typing too much, and that's the important thing, since it's how I earn my bread.

 

The weirdest thing about the whole experience has been observing the mental and physical effects. I've never really hurt myself seriously before, so it was odd to feel myself going through the typical reactions. First was physical shock - I was disoriented and sweating for a few minutes right after. And then of course there was disbelief ("there's no way I just turned my thumb into a tiny pile of hamburger!"), anger (as I was walking to the ER I was super pissed that I'd been so stupid), depression (not the real kind, but I was a little down about it for a bit), etc.

 

All in all I'd say I was dumb but lucky. Which beats dumb and unlucky or even smart and unlucky. In the future I'm hoping to be smart and lucky. I also hope to return to amateur carpentry (sans the amateur surgery), but one thing I'm not sure about yet is whether this whole incident has dampened my enthusiasm to the point where I'll just find something else to do any time it's time to turn on the power tools. But if I do go back to it (as I intend to), I know this: I'm going to incorporate the piece of wood that I was cutting into the final product. Preferably with the bloodstain showing. Just something from my own twisted sense of revenge I guess. Or stubbornness.

 

So, woodworkers: be careful out there. Pay attention, build or buy whatever jig you need to make the cut safely, and think before you act.

 

Oh, and if you see my parents, don't tell them. :)

Friday, November 4, 2005

My New Favorite VS2005 Key Sequence

Another one in the "I might be the last one to figure this out" department. But it's cool, so I want to share it.

 

I'm a keyboard guy. That's why I love discovering new shortcuts. Today, when working in VS2005, I accidentally hit ctrl-alt-down arrow. So I was surprised when up popped a scrollable list of my open windows. You can even type the first few letters of the filename to navigate the list. I like it better than ctrl-tab when I know the name of the window I want to open, because it doesn't make me visually scan a big unordered list. When I know I want the last window, or the one before that, ctrl-tab still wins.

Thursday, November 3, 2005

What Has Craig Been Working On?

The other day I told you about some of the things that have been up in my personal life. I also said I'd post about what I've been doing for work. Ah - the classic blogger conceit: the surety that you care. :)

 

At any rate, right now I have four projects going simultaneously. Balancing them has, at times, been tricky, but so far I've managed to keep all of them moving forward without having to work nights or weekends. (Wife in school plus toddler means I couldn't really do that anyway.) I'll describe each of them briefly.

 



  1. e.POWER 2005. For the last three years, I've been helping my client Integic port part of their e.POWER workflow product to .NET. At the same time we move existing functionality, we've also been adding a web services façade, with the obvious goal of making it easier to integrate e.POWER workflow services into their applications. I've been acting as an architectural consultant, advising on how to design and implement the system, but I also wrote big chunks of the security system, pretty much all of the database access library, and pretty much all of the build system, as well as a bunch of other bits and pieces. We're nearly ready to ship it, and I'm eager to see what happens when the bits go into action at a real, live customer. I think the answer is "good things", but the proof is in the pudding.

  2. FlexWiki. No surprise here - if you read this blog, you know that I've been doing a massive refactoring of the FlexWiki codebase, which includes a port to .NET 2.0. The end goal is to implement good application-level security in FlexWiki, but to do that I'm going to wind up rewriting most of the core of FlexWiki. When I'm done, I hope it'll be much more extensible and quite a bit less tangled than it is now. Since no one's paying me to do this one, it has the lowest priority, but I'm still trying to devote an hour to it every day, with the hope that I can get something back into the community some time in early 2006. We'll see how I do on that one.

  3. Pluralsight's Applied .NET Architecture & Design Course. The sharp-eyed have noticed that Pluralsight have announced several new courses, among them mine. I'm in the middle of writing the slides now, and it has been very challenging, mostly due to the nature of the material. How, exactly, does one teach "architecture" in four days? What does a lab exercise for a lecture entitled "service orientation" look like? Despite the challenges, I'm very pleased with the way the course is shaping up, and I hope to see a few of you in class once we start teaching it.

  4. MSDN2. Microsoft has brought me in to do some more work on the MSDN2 project, which has really matured since the Alpha release I helped write. I've got some really interesting stuff to talk about on this one, but I'm going to wait until a few things come together. Expect a series of posts in the coming months related to my work here.

 

The fact that I generally put time in on at least three of them most days makes for a lot of variety. I particularly like that I've got a mix of design and implementation going - nothing helps you figure out how to build something more than trying to build it (or something similar).

Wednesday, November 2, 2005

Unwanted Stack Trace in Web Services

At one of my clients, I've been helping them port their product to .NET. As part of the process, we've added a web service façade that exposes a good chunk of their functionality. It's been really interesting figuring out how to do that correctly. One of the things I recommended is that they put a catch (Exception) block in every [WebMethod].

 

The idea here is to control completely what goes out on the wire. If uncaught exceptions are allowed to propagate outside the WebMethod, the pipeline will call ToString() on the exception object and stick the result into the soap:fault. That means that we're projecting stuff like our call stack and the exception type onto the wire. This is neither useful nor desirable to the clients of the web service. Even if it was, the format the information is in would be a pain to parse.

 

So instead, we've done something like this:

 

[WebMethod]
public FooResult DoFoo(FooRequest request)
{
    try
    {
        CheckSecurity();
        return ExecuteFoo(request);
    }
    catch (Exception e)
    {
        throw LogAndMapException(e);
    }
}

 

Where LogAndMapException looks something like this:

 

SoapException LogAndMapException(Exception e)
{
    LogException(e);

    CustomException ce = e as CustomException;

    if (ce != null)
    {
        return CreateSoapException(ce.Code, ce.FriendlyMessage);
    }
    else
    {
        return CreateSoapException("generic error message");
    }
}

 

CustomException is an exception that we wrote that is thrown when application-specific error conditions occur. When we get this, we know we have information that might be useful to the client, for example, "You do not have permission to do what you tried to do." In this case, we create a SoapException that contains a code and a convenient human-readable string for that error. In all other cases, we throw a SoapException that just contains a generic error message. So we return as much information as we have, but not more than the client needs.
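
CreateSoapException isn't shown above. A minimal sketch of what such a helper might look like follows; the signatures match the calls above, but treating the code as a string, the fault-code choice, and the message formatting are my assumptions, not the actual implementation:

// Hypothetical helper methods on the web service class.
// Requires System.Web.Services.Protocols (SoapException).

SoapException CreateSoapException(string code, string friendlyMessage)
{
    // Put the application-specific code and friendly text into the faultstring;
    // a real implementation might carry the code in the fault detail instead.
    return new SoapException(
        string.Format("{0}: {1}", code, friendlyMessage),
        SoapException.ClientFaultCode);
}

SoapException CreateSoapException(string friendlyMessage)
{
    return new SoapException(friendlyMessage, SoapException.ServerFaultCode);
}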

 

The key here is that when SoapException is thrown, the pipeline knows that you're trying to make the soap:fault look a particular way, and it'll map the properties of the SoapException into the fault message, rather than simply dumping the result of ToString() into it. Very handy. Except when it stops working.

 

Which is exactly what happened to us the other day. Since we have a UI that displays the human-readable part of the fault in a web page, it was pretty obvious to us that we were getting the stackdump style of message, rather than the nicely formatted one. To add to the mystery, we couldn't recreate the error on any of our dev machines - only on the web farm we deploy to as part of every build.

 

But wait! There's more! There are two nodes in the web farm (we'll call them A and B), and as I was playing around with the setup, I found that the problem only occurred when I pointed the web UI on A to the web service on A. If the web UI on A talked to the web service on B (and vice versa) the problem went away. Even weirder, if I set up tcpTrace as a simple call redirector so that web service calls went from machine A to machine C back to machine A, things also worked correctly. And I tried addressing the calls using the machine's full name, "localhost" and "127.0.0.1", but none of that made a difference. As long as the call didn't leave the machine, the error text was wrong.

 

After much screwing around trying to get remote debugging to work (with little success, as usual with that technology), I asked a few friends if they'd seen this before. Fortunately, the inestimable Henk had. (Henk is one of the smartest guys I know - hire him if you ever get the chance.) It turned out to simply be a matter of adding

 

<customErrors mode="On" />

 

to the web.config for the web service. Poof! Problem gone. I suppose it makes sense in retrospect that this tag would control pipeline mapping of unhandled exceptions for web services the same way it does for ASP.NET web forms, but frankly I found the resulting behavior highly unintuitive. I also can't explain why it only failed on our Windows 2003 web farm, and not on any of the other Windows 2003 or Windows XP machines we tried. But at least we're up and running again.

Tuesday, November 1, 2005

Sony CDs Install a Rootkit on Your Machine

This is one of those "horrible, but sadly unsurprising" stories. I think Bruce Schneier has it right when he suggests that companies should be made liable for security vulnerabilities that they introduce into your system.

Monday, October 31, 2005

Long Weekend

I had a fairly long weekend, in both senses of the word. I can't complain, though: it was much better than others', as I'll explain in a moment.

 

Friday was my daughter Ellen's first birthday. It's hard to believe it's already been a year; my aunt used the expression "Long days, short years" and I have to say I find it apt. Ellen is an absolute riot. I swear she gets more fun to hang out with by the hour.

 

My wife is getting her MBA at Wharton, and Friday was one of her class days. So I planned to take Ellen up to Philadelphia so Alice wouldn't have to miss Ellen's birthday. Additionally, Alice's parents were staying with us, and I needed to drop them off at the airport that same morning. Ellen and I took the train, so already a fair amount of running around was on the schedule, especially given our plans to co-host Ellen's joint birthday party on Sunday with some friends whose daughter was born just a week after Ellen. No big deal, though - just par for the busy parent course.

 

The level of insanity jumped from "parents of toddler" to "Rory" at about 2AM Friday morning, though. That's when we got a call from the police letting us know that our good friend Alex had been hit by a car. He was riding his bike home from helping a friend move, and the dirtbag that did it drove off without stopping, leaving Alex lying in the street.

 

Alex is a triathlete who has competed in several Ironman events just this year, including the big one in Hawaii. If you're not familiar with Ironman (Wikipedia entry), it consists of a 2.4-mile swim, followed by a 112-mile bike, followed by a 26.2-mile run. That's right - it ends with a full marathon.

 

Anyway, all this is to say that Alex has probably biked almost 7000 miles just this year, all of it with a helmet. And of course, this time, on a short one-mile trip home, he wasn't wearing one. Fortunately, Alex was not killed. He didn't even suffer any head injuries. He did get hurt, though, and not in a small way: several ribs broken, two seriously; a broken collarbone; and worst of all two fractured vertebrae, one of them basically crushed.

 

The prognosis for Alex's full recovery is quite good. Even now, just three days later, he can walk, even up and down stairs, albeit very, very slowly and with a cane. He'll need to wear a back brace for several months, and it takes him a long time to get in or out of a chair or a bed right now. We've all wondered where he'd be at if he weren't in essentially perfect physical condition before he got hit.

 

Getting hit by a car will generally always ruin your day, but this already hasn't really been his year. He lost his job after getting screwed by an unscrupulous consulting company, his visa to stay in the US is expiring, his landlord is selling the house he's living in, and perhaps worst of all he has no health insurance. Yikes.

 

He was released from the hospital yesterday. Of course, we insisted that he come to stay with us. It works well because either Alice or I are basically always at home. Should anything happen, he won't have to lie at the bottom of the stairs for six hours waiting for someone to come home. And we're glad to help.

 

Anyway, like I said, a crazy weekend. I had planned to do a post about what I've been up to work-wise, but when you work at home a lot, your work and personal lives tend to blur together a bit. So I thought I'd start here - sometime soon I'll do the complementary post.

Monday, October 24, 2005

Master of Pages

My buddy Jason Whittington wrote this fantastic set of lyrics to the tune of Metallica's "Master of Puppets". Check it out:

 

Master of Pages
==========
 

In your darkest day, Child Controls array
Try to use SqlConnection
Children disappear, code-behind not clear
Too much Stylesheet distraction

 

[chorus:]
Write me and you see
I am all you need
You provide the views
I am calling you

 

Your site is plastered
Obey your master
Your page loads faster
Obey your master
Master

 

Master of pages is pulling your strings
Formatting output, displaying your themes
Without placeholders you can’t see a thing
Just call my name, 'cause I’ll hear you scream
Master!
Master!

 

Just call my name, 'cause I’ll hear you scream
Master!
Master!

 

Try to do your bit, Handle PreInit
Binding to a data table
You've been led astray, data won't display
Worker process grows unstable

 

[chorus]

 

Master, master, where’s the page that I’ve been after?
Master, master, now my server dies

 

Laughter, laughter, all I hear and see is laughter
Laughter, laughter, laughing at my cries

 

Now your viewstate's fat, didn't use the cache
Do you really need more reason?
You're lost in the maze, Come to righteous ways
.NET 2.0's in season

 


I will lead the way
I will make you pay
You will start anew
Now I rule you too

Saturday, October 22, 2005

Galleon + Tivo = Nice

I noticed recently that my Tivo started displaying a new menu item: "Enable Home Network Applications". Encouraged by the implications, I did a little research and ran across Galleon at SourceForge. I downloaded and installed it, and I have to say, it's ever so nice.

 

Despite a clunky interface typical of open source (and of Java apps), I didn't have too much trouble figuring out how to set up true two-way integration between my PC and the Tivo. I can now move shows back and forth, controlling the process from either the Tivo or my PC. This basically allows me to use the PC as both an archive/backup and a second TV. You can even set up rules so that certain shows are automatically downloaded to the PC.

 

Of course, I'm only scratching the surface of Galleon. It's really more a Java-based platform for writing custom Tivo apps than it is a transfer program. I've played around a bit with a few of the plugins that ship with the program, but haven't fully plumbed the possibilities. So far, my favorite is the podcast plugin. I like being able to listen to any episode of Science Friday in the living room on demand - it brings the Tivo experience to radio.

Friday, October 21, 2005

IList&lt;T&gt; Won't XmlSerialize!

So as long as we're talking about XmlSerializer bugs, let me point out a really, really bad one: if your type has members of type IList<T>, it won't serialize! Aagh! What's worse is that Microsoft has decided not to fix this bug until after 2.0. Crap!

 

OK, I can sort of understand that not everyone uses XmlSerializer directly as their XML API of choice (as I do). So maybe if people like me were the only customers who would be affected by this problem they'd be justified in saying "we'll fix it later". But I'm not: XmlSerializer is used by the ASP.NET web services infrastructure. So if you were thinking of using IList<T> properties from your web service parameter types…forget it. Won't work. Guaranteed to be busted in v2.0.

 

Of course, it's not the end of the world. Generated collection classes (I use CodeSmith) have been an option for typesafe collections since v1.0. But generics were supposed to free us from those, not just give us a slightly more efficient way to implement them.

 

This bug has me pretty annoyed right now, but maybe I'm just moving my way through The Circle of (Software) Life.

 

Update: OK, this isn't quite as bad as I thought. It turns out that List<T> will serialize just fine. However, FxCop complains about List<T>, and if you've ever tried to use FxCop, you know that dealing with exclusions is often more trouble than it's worth. Fortunately, the FxCop violation that got triggered suggested using either Collection<T>, ReadOnlyCollection<T>, or KeyedCollection<T> instead. As you know, I've already worked a bit with KeyedCollection<T>…suddenly System.Collections.ObjectModel seems even cooler.

 

Anyway, once I switched over to using Collection<T> (the appropriate choice in this case) everything serializes correctly. Woohoo! I'd be happier with support for IList<T>, but I can totally live with this.
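
As a quick illustration (my own minimal example, with made-up type names), a Collection<T> property round-trips through XmlSerializer just fine, where the equivalent IList<T> property would fail per the bug described above:

using System;
using System.Collections.ObjectModel;
using System.IO;
using System.Xml.Serialization;

public class Order
{
    private Collection<string> _items = new Collection<string>();

    // A read-only Collection<T> property serializes fine;
    // declare it as IList<T> and XmlSerializer refuses to play.
    public Collection<string> Items
    {
        get { return _items; }
    }
}

class Program
{
    static void Main()
    {
        Order order = new Order();
        order.Items.Add("widget");
        order.Items.Add("sprocket");

        // Serialize to a string and dump it to the console.
        XmlSerializer serializer = new XmlSerializer(typeof(Order));
        StringWriter writer = new StringWriter();
        serializer.Serialize(writer, order);
        Console.WriteLine(writer.ToString());
    }
}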

 

I hope you don't mind my constant updates. I was going to be embarrassed about having to constantly correct myself, but then I remembered that this blog is nothing if not a public record of the fact that I still haven't managed to learn everything there is to know. :)

Wednesday, October 19, 2005

Serializing a KeyedCollection

The other day I pointed out System.Collections.ObjectModel.KeyedCollection. Towards the end, I claimed that you could use XmlSerializer to serialize an instance. Well, it turns out this is only true if you use System.String as the key type. For some reason, the serializer barfs with "The given key was not present in the dictionary" when you try to serialize a collection that's keyed off of any other type. Or at least, any of the other types I tried.

 

I'm not sure if this is a serializer bug, an issue with KeyedCollection, or if it's a by-design limitation for some subtle reason. I know I'd sure like it if it started working in the RTM version of Whidbey.

 

Update: Astute readers Aaron and Bart figured out that this is due to a bug in XmlSerializer (a bug which made #1 on the MSDN Product Feedback Center). Apparently XmlSerializer calls the object IList.this[int index] indexer, rather than the type-specific indexer. Fortunately, there's an easy workaround: just add a new indexer that hides the indexer in the base class. Modifying my earlier example, you'd do this:

 

class People : KeyedCollection<string, Person>
{
    protected override string GetKeyForItem(Person item)
    {
        return item.Name;
    }

    public new Person this[int index]
    {
        get { return ((IList<Person>)this)[index]; }
    }
}

 

The new code is the indexer at the end. Of course, it already works when string is the key type, but you get the idea. The key here is the "new" keyword on the indexer, which makes it hide the indexer in the base class, causing it to be used instead.

 

I've included a more complete sample below that uses a DateTime (birthday) as the key type instead of a string. Note that (as far as I can figure so far) you can't use an int as the key type, because that conflicts with the positional indexer already exposed by IList<T>. And it makes sense - if I were representing age as an integer, and my age were 33, how would I know whether people[33] meant "the 33rd person" or "the person with age 33"?

 

using System;
using System.Collections.Generic;
using System.Collections.ObjectModel; 
using System.IO; 
using System.Text;
using System.Xml; 
using System.Xml.Serialization; 

namespace SerializeKeyedCollection {
    class Program {
        static void Main(string[] args) {
            // Set up a collection with two people in it
            People people = new People();

            Person craig = new Person();
            craig.Name = "Craig";
            craig.Birthday = new DateTime(1971, 11, 29);

            people.Add(craig);

            Person alice = new Person();
            alice.Name = "Alice";
            alice.Birthday = new DateTime(1973, 11, 3);

            people.Add(alice);

            // Serialize the collection to a string
            StringBuilder stringBuilder = new StringBuilder();
            StringWriter stringWriter = new StringWriter(stringBuilder); 
            XmlSerializer ser = new XmlSerializer(typeof(People));
            ser.Serialize(stringWriter, people);

            // Print out the serialized collection
            string xml = stringBuilder.ToString();
            Console.WriteLine(xml);

            // Deserialize the collection to show that deserialization works
            StringReader stringReader = new StringReader(xml);
            people = (People) ser.Deserialize(stringReader);

            DateTime birthday = new DateTime(1971, 11, 29);
            Console.WriteLine("Person with key 11/29/1971 is {0}", people[birthday].Name); 
        }
    }

    public class Person {
        private DateTime _birthday;
        private string _name;

        public DateTime Birthday  {
            get { return _birthday; }
            set { _birthday = value; }
        }

        public string Name {
            get { return _name; }
            set { _name = value; }
        }
    }

    public class People : KeyedCollection<DateTime, Person> {
        protected override DateTime GetKeyForItem(Person item) {
            return item.Birthday; 
        }

        public new Person this[int index] {
            get { return ((IList<Person>)this)[index]; }
        }
    }
}


 

Friday, October 14, 2005

Thread.IsBackground

I've been doing some code reviews recently, looking in particular for threading issues. It's particularly important in this case because the code in question is a service that needs to run for a long time under high load. It also needs to shut down gracefully if something wonky happens.

 

One of the things I noticed was some weirdness in the OnStop method of the service, where it iterates through its child threads and tries to get them to shut down gracefully, calling Abort if they won't. Since it turns out that everything important is protected by a database transaction anyway, I suggested a shift to a slightly different model: set it up so it doesn't matter if the threads stop at all.

 

Part of doing this involves changing the threads to run as background threads. A background thread is represented by a System.Threading.Thread object like any other thread in the CLR, but it has had its IsBackground property set to true. When you do this, you're saying, "It's okay if the process shuts down while this thread is still running." You may have encountered this already if you've ever used the .NET thread pool for anything - all of its threads are background threads, so even if one of them is still doing something, as soon as your last non-background thread goes away, the process dies.
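
As a sketch of what that change amounts to (illustrative names, not the client's actual code), the worker threads simply get flagged before they're started:

using System;
using System.Threading;

class Worker
{
    public Thread Start()
    {
        Thread thread = new Thread(new ThreadStart(ProcessQueue));

        // Background thread: the process may exit even while this thread runs,
        // so OnStop no longer has to chase it down and Abort it.
        thread.IsBackground = true;

        thread.Start();
        return thread;
    }

    private void ProcessQueue()
    {
        while (true)
        {
            // Each unit of work is protected by a database transaction,
            // so dying mid-loop when the process exits is safe.
            DoOneUnitOfWork();
        }
    }

    private void DoOneUnitOfWork()
    {
        // Placeholder for the real work.
        Thread.Sleep(1000);
    }
}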

 

Of course, every effort should be made to ensure that the threads exit cleanly, since proper cleanup is better than whatever automatic cleanup you get when a process dies. But this change should at the very least keep the service from hanging on shutdown, which makes it easier to support.

Tuesday, October 11, 2005

System.Collections.ObjectModel.KeyedCollection

I spend a lot of time dealing with collections when I code: should this thing be a list? An enumeration? A hashtable? Decisions about data structures abound. Which is one of the reasons I like Whidbey so much - the new support for generics makes life in the collection world ever so much nicer. But the other day I was poking around and came across System.Collections.ObjectModel - a sister namespace to System.Collections.Generic, and one I hadn't heard much about. But it makes the story even better.
 

There's not a whole lot in the namespace at the moment. Just three classes: Collection<T>, ReadOnlyCollection<T>, and KeyedCollection<T>. Collection<T> and ReadOnlyCollection<T> are pretty obvious - they provide simple collection and read-only collection wrapper functionality. Handy, but my favorite is KeyedCollection<T>.

 

KeyedCollection<T> is an abstract base class that gives you that Hashtable/ArrayList crossover class you've probably written at least once in your life. That is, it gives you list semantics, but adds a non-integer indexer for doing lookups in the collection. So, for example, if you have a Person class that looks like this:

 

class Person
{
    private int _age;
    private string _name;

    public int Age { get { return _age; } set { _age = value; } }
    public string Name { get { return _name; } set { _name = value; } }
}

 

All you need to do is create a collection like this

 

class People : KeyedCollection<string, Person>
{
    protected override string GetKeyForItem(Person item)
    {
        return item.Name;
    }
}

 

where the implementation of GetKeyForItem is whatever makes sense. It's the only method you need to override, because without it, the collection has no idea what the key is. But once you do, you can now access the collection as follows:

 

People instructors = Pluralsight.Instructors; // Acquire a collection from somewhere

 

Person craig = instructors[7];        // Can access by index

Person keith = instructors["Keith"]; // Or by name

 

In addition to being convenient, the documentation also suggests that KeyedCollection was implemented to provide roughly constant-time lookups by either index or key regardless of the size of the collection. Whether that means "consistently fast" or "consistently slow" I haven't measured yet. :) But I think the best part might just be that the class you derive from KeyedCollection is serializable via XmlSerializer, despite having dictionary-like semantics. That's basically the end of the old "Damn! I can't serialize a Hashtable" problem.

Monday, October 3, 2005

.NET 2.0 It Is, Then

Based on the feedback from my question about upgrading FlexWiki, it looks like we're go for throttle-up to a .NET 2.0 deployment. I'll start making the switch today. Of course, it'll be quite a while before I have anything I can go public with (or that even compiles for that matter), but I think this is the right move. If nothing else, it gives me an excuse to spend more time working with the new bits.

Wednesday, September 28, 2005

To .NET 2.0 or to .NOT 2.0

In my spare time (ha!) one of the things I've been doing is redesigning a pretty significant chunk of the FlexWiki engine. I basically had to because I wanted to add security, and the current design isn't extensible enough to do that without causing a major mess.

 

As I've been going through the code, it has occurred to me that I could benefit from some of the features that .NET 2.0 has to offer. So I'm thinking of upgrading. Here's the catch: it would mean that when I ship the code, everyone's going to have to install .NET 2.0 if they want to use the new version. Hence my dilemma.

 

See, the tricky part is figuring out how big a deal it is to say, "You must have .NET 2.0 to use FlexWiki." Eventually, of course, it won't be a big deal at all. But right now, it's actually a showstopper, since (AFAIK) you can't go live on the version that corresponds to VS2005 RC1.

 

Of course, there's basically no chance I'll finish before .NET 2.0 ships. It's too much work, and I'm progressing too slowly for it to be done before November, so I'm not too worried about the licensing situation. But the question remains: how long after November is long enough? Zero days? A year? Obviously, that answer is going to differ from individual to individual, so the challenge is to figure out an answer that's "right enough".

 

At the moment, I'm thinking that the right thing to do is to go ahead and upgrade. There are a number of reasons why I think this will work. For one thing, like I said, it's not going to be done for a while. For another, there's nothing that says people have to upgrade - they can install it whenever .NET 2.0 becomes an option for them. And finally, because FlexWiki is a web application, and because ASP.NET 1.1 and 2.0 coexist peacefully, the upgrade should be possible for most people.

 

I'm not going to do the conversion until at least October 1st. I announced my intention a few days back on the FlexWiki Users Mailing List, but I thought it might be a good idea to mention it here as well. If you have comments or advice, feel free to leave a comment or contact me.

Tuesday, September 20, 2005

IIS + Skype = Oops

My wife and daughter have been in Asia visiting relatives for the last couple of weeks. Fortunately, she has access to a fast Internet connection and a webcam, so I've gotten to see my girls most days. That's particularly nice when your kid is at the age where she's learning new things pretty much every day.

 

We've been using MSN Instant Messenger for our video conversations, and all in all I've been pretty pleased. It works well enough. My only complaint is the fairly high audio latency - that can lead to awkward conversational collisions. We've gotten pretty good at not stepping on each other, but I thought maybe I'd try a separate VOIP application, hoping that they were doing something with QoS that would get the latency down.

 

My first stop was Skype - I don't know much about VOIP applications, they're free, and they're a big enough name that I felt reasonably okay about installing it. Unfortunately, it completely failed to work - we'd get connected, but then something would happen on Alice's parents' computer and her connection would drop. Oh well - we just went back to the high-latency voice of MSN. She'll be back before we have a chance to try anything else out. Maybe when I start traveling to teach we can explore other options.

 

Well, that would have been the last I thought of it, but a few days later, as I was doing some FlexWiki work, I had some IIS problems. Specifically, I was unable to start my website. Oddly, I could still browse to the URLs I was working on, but all I got back was an empty document. By this point, you've probably figured out what I hadn't: another application was listening on port 80, preventing my website from starting, and serving up bogus documents in response to my requests. Obvious to most perhaps, but I had to google this KB article before I figured it out.

 

One "netstat -ano" later, and I'd discovered it was Skype that was hogging the ports. Sure enough, there was a checkbox in the options labeled "use ports 80 and 443 as alternatives". Unchecking it (and restarting Skype) fixed the problem. Good thing, because I was not looking forward to reinstalling.

Thursday, September 8, 2005

Empathy


When my daughter Ellen was about 6 months old, my sister Kristin came to visit. Ellen was very shy at that point, and often wouldn't let Kristin hold her, especially if my wife was anywhere within earshot or eyesight. That week, Kristin got stung by a wasp at a barbeque we were having, and was obviously in pain. I was holding Ellen and trying to help Kristin. Ellen reached out her hand and touched Kristin, for all the world looking like she was trying to make Kristin feel better. We'll never know if it was just a coincidence, but this article suggests that perhaps it was true empathy on Ellen's part. Interesting stuff.

Thursday, September 1, 2005

I Want to Learn Everything (But Can't)

Clemens, I totally sympathize. Even given that I'm about to transition to a more academic mode, I know I'll never even start with all the technologies in which I'd like to become an expert. This really struck me today when I was looking at some VS2005 functionality for a client. The sheer weight of new features that I wanted to check out was crushing: Indigo support, Avalon support, VSTS integration, web project improvements… And the underlying technologies themselves: WinFS, XML and Web Service innovations, generics…and that's not even counting all the other stuff I'm interested in, like Ruby and LISP. It's just not humanly possible to get to it all. (For me, at least: maybe someone smarter could manage it.)

 

Oh well: while I can't possibly learn everything, at least it'll be fun to try. :)

Wednesday, August 24, 2005

Typing Accents


 

I was working on reformatting my résumé a little bit today, and I wanted to change my MSN IM tag to reflect the fact. So I set it to "Resume" and went off to work. Of course, all that did was draw a smartass remark from Ian about me powering my brain back up after suspending it. :)

 

While we were chatting, I finally remembered the key sequence that would have let me type "résumé" instead of "resume". It's control-' followed by e. Similarly, control-` followed by a letter renders the other accent (although not for all letters). And control-~ a gives ã.

 

Two interesting things about this:


  1. Ian's keyboard gave reversed accents from mine for the same key sequences. But he's in the UK - maybe this has something to do with driving on the wrong side of the road. :)

  2. These key sequences don't work in all text controls. So far, from what I've tested, they work in Word and in MSN IM, but not in notepad, notepad2, or InfoPath.

 

At any rate, I'm sure I could have found this on Google, but now I know where to look. Also, hopefully blogging this will remind me to make sure I put support for it into my TextEditor control.

Centering with CSS

After all that "I'm so great" in my post yesterday, I figured today would be a good time to show off how little I know about most things. :)

 

I'm an HTML/CSS neophyte. I can manage a basic page okay, but fancy-pants layout is well beyond me. CSS, in particular, is a topic whose further reaches I've never explored. Heck, I've barely wandered a few feet from its parking lot. (Although the FireFox EditCSS extension has made my HTML look loads better by making it easy to experiment.)  

 

One of the things I always have trouble with is figuring out how to center stuff. I'll slap a text-align: center on things sometimes, but that doesn't always accomplish what I want. So today I ran across this little nugget:

 

.centered80
{
    position: absolute;
    left: 10%;
    right: 10%;
}

 

It gives me the ability to center something in the middle 80% of the page. It worked so well I wanted to record it here for my own future reference. I'm sure anyone that has spent any time with CSS already knows this (or a better version), but at least now I can find it again when I need it.

Tuesday, August 23, 2005

Transitions (Subtitle: Whore for Rent)

Update: fixed minor typo.

 

Keith just outed me, so I'd better come clean. :)

 

I've been an independent contractor for about the last six years. I love it. One of the great things about it is the flexibility: my schedule is flexible, my working arrangements are flexible, and indeed my very career track is flexible. Of course, all that flexibility comes with a price - I have to self-direct, and sometimes you wind up on the bench for a bit. But I don't mind those.

 

One of the very positive outcomes of all this flexibility has been the ability to alternate between what I think of as an "academic" track, and what I think of as a "practical" track. In my "academic" mode, I spend a lot of time sitting around and looking through docs, playing with betas, and generally examining all the corners of a particular piece of technology. In my "practical" mode, I help my clients build real systems.

 

I can't imagine doing just one or the other. If I were purely academic, then there are all sorts of things that I'd never be forced to do, like write a real build script, fix really difficult bugs, or, generally, ship a product. And if I were purely practical, then I'd almost always be constrained by time pressure to only learn the parts of technology that I actually use, as they became needed. That sort of need-driven education tends to leave big holes in your knowledge, and you never know when the stuff you don't know might come in handy.

 

Teaching for DevelopMentor was a great way to live the academic life. The fear induced by the thought of standing in front of an audience and not knowing what the hell you're talking about is an excellent motivator. I learned tons of things I never would have, simply because I wanted to know the answer in case anyone ever asked me.

 

At the same time, not being forced to apply all that theoretical knowledge means that you can't fully appreciate it. You can think all you want about things like the performance of the .NET garbage collector, but there's no substitute for experience to tell you that in the real world you hardly ever care about it.

 

For roughly the last three years, I've been in an almost purely practical mode, and it's been great. I've worked on a bunch of open source projects (most notably FlexWiki), contributed to the MSDN2 rewrite, and helped my client Integic with a massive .NET port of their workflow product. It has all been highly educational. But the pendulum has swung, and now I miss the academic side of things, particularly now that we're so close to the release of the next version of .NET. So I'm making a change.

 

I'm proud to announce that I will once again be taking on the role of itinerant instructor, this time with Pluralsight. It's a first-rate organization, and I'm really looking forward to working with these guys again. I'm not entirely sure what class(es) I'll be teaching yet, but I'm definitely eager to get to it - diving deep into the details of something and then sharing it with my students.

 

The timing on this is pretty good. I'm ramping down with my existing client as their ship date approaches, so it's a good time to change gears. Which is where you (and the subtitle of this post) come in: I'm also looking for work. I'll be teaching, of course, but that's only a fraction of my time, and as I explained, I don't think I can be the best instructor I can be unless I'm continually applying the knowledge I've gained in the real world.

 

Here's the part where I shamelessly promote myself (you can skip to the last paragraph if you find this sort of thing embarrassing):

 

I graduated with a simultaneous BS/MS in 1995 from MIT. I've got about five years experience working with .NET. I consider my specialty to be the design and implementation of large-scale distributed systems. I'm quite good with C#. I know my way around XML, including web services technologies and XSLT. I've written for MSDN Magazine. I'm a mean debugger. I learn very quickly. I helped rewrite a significant chunk of the fourth biggest website on the planet. I've spoken at conferences around the world on topics from Visual Studio to Direct3D. You can find my resume here.

 

So, if you've got a project you think I could help with, please do contact me. And if not, hopefully I'll see you in class!

 

Like I said, I love being an independent contractor; it makes for interesting transitions. We'll see where this set takes me next!

Sunday, August 21, 2005

Decade++

A few weeks ago, I blogged about a problem with SourceForge's bug/feature request tracker export feature. Specifically, they aren't escaping illegal characters in the generated XML. Very bad, particularly because it renders the output useless to conforming XML parsers like the ones in System.Xml.

 

The problem is, during business hours the trackers on SourceForge are so slow as to be nearly useless. I assume this is because they're overloaded. Whatever the reason, if all you want to do is scan through all the open bugs in FlexWiki, it's annoying: I'm not going to update anything, after all, and looking through 200 bugs when page loads take in excess of 30 seconds each is not an option.

 

So I set out to see what I could hack together quickly to fix the SourceForge export. And what I came up with is tracker-tidy. The source is at the bottom of this post. It's an ugly little program that does one thing: escapes & and < within a particular XML tag. It does this by scanning a text file one line at a time, looking for <foo>, where "foo" is whatever you specify on the command line. When it finds it, it replaces all occurrences of & and < with &amp; and &lt; (respectively) between that point and where it finds </foo>.

 

Like I said, it's ugly. It doesn't deal with attributes on the opening tag, it doesn't know jack about XML namespaces, and it doesn't replace the contents of more than one tag at a time (although it will do all tags with that name). Doing all that would have required a lot more than 70 lines of code. But what I have works against the SourceForge feed, although I have to run it several times, once for each tag that might have illegal content. I use wget and tracker-tidy together in this batch file:

 

wget http://sourceforge.net/export/sf_tracker_export.php?group_id=113273^&atid=665396 -O bugs.xml
copy /y bugs.xml input.xml
tracker-tidy summary input.xml > intermediate.xml
copy /y intermediate.xml input.xml
tracker-tidy detail input.xml > intermediate.xml
copy /y intermediate.xml input.xml
tracker-tidy old_value input.xml > intermediate.xml
copy /y intermediate.xml input.xml
tracker-tidy text input.xml > intermediate.xml
copy /y intermediate.xml bugs-tidied.xml

 

wget http://sourceforge.net/export/sf_tracker_export.php?group_id=113273^&atid=665399 -O rfe.xml
copy /y rfe.xml input.xml
tracker-tidy summary input.xml > intermediate.xml
copy /y intermediate.xml input.xml
tracker-tidy detail input.xml > intermediate.xml
copy /y intermediate.xml input.xml
tracker-tidy old_value input.xml > intermediate.xml
copy /y intermediate.xml input.xml
tracker-tidy text input.xml > intermediate.xml
copy /y intermediate.xml rfe-tidied.xml

 

del bugs.xml
del rfe.xml
del input.xml
del intermediate.xml


 

This simply downloads the tracker bug and feature request exports for FlexWiki and tidies up the output with multiple passes of tracker-tidy (one for each problem tag in the export).

 

Once I have the tidied files bugs-tidied.xml and rfe-tidied.xml, it's a simple matter to open them up in Excel or InfoPath, or whatever your favorite XML-based tool is. From there, sorting and searching are dead easy. If I get really ambitious, one of these days I'll write an XSLT stylesheet to generate a nice little web report instead (a quick-and-dirty version of that idea is sketched below). But given how little effort went into getting something that works as well as this does, I sort of doubt it.
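Since I mentioned it, here's a minimal C# sketch of what such a report might look like - not XSLT, just System.Xml and an XPath query. The file name and the idea that each tracker item carries a <summary> element are taken from the batch file and the tag list above; I haven't checked them against any official SourceForge schema, so treat this purely as a starting point.

using System;
using System.Xml;

class QuickBugReport
{
  static void Main()
  {
    // "bugs-tidied.xml" is the output of the batch file above; <summary> is one
    // of the tags tracker-tidy already processes, so it should be present.
    XmlDocument doc = new XmlDocument();
    doc.Load("bugs-tidied.xml");

    Console.WriteLine("<html><body><ul>");
    foreach (XmlNode summary in doc.SelectNodes("//summary"))
    {
      // Re-escape for HTML output - same trick as tracker-tidy's Escape method.
      string text = summary.InnerText.Replace("&", "&amp;").Replace("<", "&lt;");
      Console.WriteLine("  <li>{0}</li>", text);
    }
    Console.WriteLine("</ul></body></html>");
  }
}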

 

Anyway, hopefully this will help someone else. Obviously, you'll need to replace the URLs in the batch file with the group_id and atid for your project, but other than that it should work fine. Just remember to escape any & signs in the URL with ^, the command-line escape character.

 

using System;
using System.IO;
using System.Text;

namespace Wangdera
{
  public class App
  {
    public static void Main(string[] args)
    {
      if (args.Length < 2)
      {
        Console.Error.WriteLine("Usage: tracker-tidy tagname inputfile");
        return;
      }

      string tag = args[0];
      string startTag = string.Format("<{0}>", tag);
      string endTag = string.Format("</{0}>", tag);

      using (StreamReader inputReader = new StreamReader(
        new FileStream(args[1], FileMode.Open, FileAccess.Read, FileShare.None)))
      {
        string line;
        bool inTag = false;
        while ((line = inputReader.ReadLine()) != null)
        {
          int tagStart = line.IndexOf(startTag);
          int tagEnd = line.IndexOf(endTag);

          // By default escape the whole line; this covers lines in the middle
          // of a tag whose content spans multiple lines.
          int escapeStart = 0;
          int escapeEnd = line.Length;

          if (tagStart != -1)
          {
            inTag = true;
            escapeStart = tagStart + startTag.Length; // don't escape the tag itself
          }

          if (tagEnd != -1)
          {
            escapeEnd = tagEnd; // stop escaping where the closing tag begins
          }

          if (!inTag)
          {
            Console.WriteLine(line);
          }
          else
          {
            // Escape only the text between the opening and closing tags.
            string beginning = line.Substring(0, escapeStart);
            string middle = line.Substring(escapeStart, escapeEnd - escapeStart);
            string end = line.Substring(escapeEnd, line.Length - escapeEnd);
            Console.Write(beginning);
            Console.Write(Escape(middle));
            Console.WriteLine(end); // WriteLine, so the original line breaks survive
          }

          if (tagEnd != -1)
          {
            inTag = false;
          }
        }
      }
    }

    public static string Escape(string input)
    {
      // Order matters: escape & first so the entities added for < don't get mangled.
      StringBuilder builder = new StringBuilder(input);
      builder.Replace("&", "&amp;");
      builder.Replace("<", "&lt;");
      return builder.ToString();
    }
  }
}

Tuesday, August 16, 2005

Dropload.com

A friend of mine showed me dropload.com today. Despite the possible scatological interpretations of the name, it's actually a nice little site. Very simply, you can use it to transfer files to a friend by uploading stuff to dropload via a web page and providing your friend's email address. They'll get an email with a link that lets them (and only them) download the file within seven days, and they're only allowed to download it exactly once.


Upload requires free registration, but download does not. And if you believe their privacy statement (I do), they won't share email addresses with anyone. Poking around the FAQ, the site seems legitimate enough.


I'll definitely be using this site again.

Wednesday, August 10, 2005

Ian on Projections

Ian Griffiths' blog is consistently worth reading - I learn something almost every time he writes. I particularly want to note his latest post, though, because it has a direct bearing on one of my favorite subjects: Direct3D. In the post, he talks about the difference between orthographic and perspective transforms, a topic that is highly relevant to anyone who has ever wondered about the APIs Matrix.PerspectiveFovLH, Matrix.OrthoLH, and all their friends. Go read it.
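If you've never used those APIs, here's a minimal Managed Direct3D sketch of the choice Ian is talking about. The device, aspect, width, and height values are placeholders for whatever your app already has; only the two Matrix calls matter:

using System;
using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;

class ProjectionSketch
{
  public static void SetProjection(Device device, bool orthographic,
                                   float aspect, float width, float height)
  {
    if (orthographic)
    {
      // Orthographic: distance from the camera has no effect on apparent size.
      device.Transform.Projection = Matrix.OrthoLH(width, height, 1.0f, 100.0f);
    }
    else
    {
      // Perspective: 45-degree vertical field of view; farther objects shrink.
      device.Transform.Projection =
        Matrix.PerspectiveFovLH((float)Math.PI / 4.0f, aspect, 1.0f, 100.0f);
    }
  }
}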

Monday, August 8, 2005

Talk About Yer Security Vulnerabilities

Bruce O'Dell is a friend of mine from back in my Digital Agility days. He's been very involved in USCountVotes, so he sent me this link. If you follow it, you can find details of the vulnerabilities of the Diebold optical scan equipment increasingly used to count votes here in the US. Regardless of whether or not you think vote tampering has taken place, it certainly seems to me that election machinery ought to be a bit more tamper-proof than your briefcase lock. Paper trail, anyone? After all, ATMs have them - it's not like the technology is way out there or anything.

Thursday, July 28, 2005

How Very 1998 of Them

If you've read my blog for a while, you know that I've long since left GotDotNet for SourceForge. Generally, I've been happier there, particularly because I far prefer CVS to the crazy contraption GDN has, but not all is rosy. To start with, I find that during the middle of the day, SF has lately been slow to the point of unusability. This is particularly painful when I'm trying to use the bug and feature tracking pages, which require several clicks to get what I'm looking for. Each click can take as long as 30 seconds.


At times I've thought that maybe I should just set up my own bug database. But then I'd have to evaluate the options, pick one, and then maintain it. Bleh. Still, it might be better than having to wait half a minute for each page to come back. Then, the other day, I stumbled across this, a feature of SF that lets you get the entire contents of a tracker in “XML” form. (More about the quotes in a second.)


Brilliant! With this, I figured I could suck the bug and feature databases onto my machine, then use Excel or InfoPath (if I ever figure out InfoPath) or whatever to browse the database quickly. I can't do updates, but oftentimes all I want to do is read through a bug description or do a quick search, and a local export is perfect for that. So I sat down to automate the download, got that done in a few seconds (gotta love wget), and then went to load up the FlexWiki bug database in Excel.


XML Parse Error.


WTF? I ran it through one of the XML utilities I wrote, and found out that there was an unescaped “smart quote” in the text. Disturbing that it wound up in the export, but oh well - three seconds with “find and replace” fixed that. Fire it up in Excel again and...


XML Parse Error.


OK, now what? Well, it turns out that the smart quote issue was just the tip of the iceberg. Here's an excerpt from the exported file that shows the problem in all its glaring obviousness:


<detail>Produces this output (first </ol> not needed, second
opening of <ol> not needed).  If that was fixed then the
incorrect double <ul> should also be fixed (single <ul>)
as they would be no longer necessary. BTW:  also seen
here the unnecessary blank lines
</detail>


Note the numerous problems with this. Here it's unescaped HTML tags being reproduced verbatim, but unescaped & and < characters (not to mention the smart quotes) litter the rest of the export as well. This is a classic sign of XML being built via string concatenation, and it renders the export virtually useless for consumption - all the dangling close and open tags completely destroy the ability of any conforming XML parser to work with the document. I would have thought this sort of thing went out with go-go boots, but here it is, right in my face, in 2005.


Sigh. It's a fairly simple fix to process the document with custom code to escape the “XML” well enough to be able to work with it, but I shouldn't have to. The moral of the story here is that if you find yourself doing something like this:


xml = "<foo>" + fooContents + "</foo>";


then you should lose points on your programming license.
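For contrast, here's a minimal sketch of the alternative: let System.Xml do the writing, and the escaping comes for free. (fooContents is just the placeholder from the snippet above.)

using System;
using System.Xml;

class XmlDoneRight
{
  static void Main()
  {
    string fooContents = "stuff with & and <tags> in it";

    // XmlTextWriter escapes the text content for us - no string concatenation required.
    XmlTextWriter writer = new XmlTextWriter(Console.Out);
    writer.WriteStartElement("foo");
    writer.WriteString(fooContents);
    writer.WriteEndElement();
    writer.Flush();
  }
}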

Thursday, July 21, 2005

Beam Him Up

James Doohan, the actor who played Scotty on the original Star Trek series, died Wednesday of complications related to pneumonia. He was 85. I think it's safe to say that the chief engineer on the Enterprise was an iconic figure to a pretty good portion of our profession. I know for my part I quite often employ his trick of multiplying all my estimates by four. It's funny, but even when you tell people this is what you're doing, you still look good when you finish in half the time you said it would take you.


Hats off to Mr. Doohan for helping to create such an unforgettable character.

Wednesday, July 6, 2005

Convergence

As part of my undergraduate education, I spent a fair amount of time programming in Scheme, which is a dialect of LISP. At the time, I didn't really appreciate how truly powerful LISP is, but in the intervening years, as I've seen languages like C# and Java grow up, I've made or heard the comment “Yeah, LISP already has that” enough times to make me consider trying to program exclusively in LISP for a while. As was the case with my self-imposed switch to Test Driven Development, forcing myself to use a new technology is guaranteed to be a great way to learn something. The kicker, of course, is that I'd really miss the .NET libraries - at this stage of my career, writing my own file IO routines is only fun for about ten minutes, and learning a new set is only slightly more fun.


Not that that's going to stop me from learning Ruby [1]. You've probably heard the buzz around it lately - I certainly have. Buzz I can generally ignore, but when smart people like Brad get excited about something, I generally try to get around to looking into it sooner or later. Particularly when Scott raves about Watir, and I need something that looks a lot like that for several projects I'm working on. So yesterday I started to work through a pretty good Ruby tutorial on my way to playing with Watir, and I thought to myself, “Man, this looks a lot like LISP's child by a shell script mother.” Not the syntax - there Ruby and LISP are pretty different - but in a qualitative way that I have a hard time pinning down...they just “feel” similar in how they deal with programming abstractions at a high level. I don't know: it's hard for me to express given how little Ruby I've done and how little LISP I've done lately.


Anyway, this morning I'm flipping through my aggregator entries, and I see this post from Pinku Surana about how it would be relatively easy for someone to knock out a LISP.NET implementation. Combine this with the fact that I'm basically between gigs right now, and I think the planets are aligning - it seems I'm being directed to learn a higher-level language. I just need some flaming letters on the mountainside to be sure. :)






[1] Although it looks like there's a Ruby/.NET bridge - have to check that out at some point.