# You Can't Be an Expert at Everything

It's pretty easy to put myself in the shoes of people who are mystified by computers -- all I have to do is think about cars. It's not that cars are extremely complicated machines; I'm ignorant of car construction simply because I'm not that interested in learning about it.

People have this same disinterest in computer hardware and software construction. Learning it takes time they'd rather spend on other things. They want to get from point A to point B -- just like I do in my car -- without having to learn about stuff that doesn't interest them.

I do think I should learn more about cars though, especially because I'm driving a 13-year-old used vehicle that (of course) will have its share of worn-out parts. I'm trying to learn about front drive axles, suspension and ball joints because that's what's broken at the moment. The Haynes repair manual for my car has helped a lot.

Maybe I'll be able to get away with slowly learning this stuff, but it's nice to have relatives and friends who are knowledgeable about cars so that I don't have to be.

It goes to show that it's good to know experts in different fields: technology, automotive, real estate, law, finance, business, engineering, etc. You don't need to know all of the details of the other field because you trust the expert's judgement. That saves you time, just as you save them time by sharing your expertise. It's wonderfully symbiotic, as long as you're knowledgeable about something and able to help others.

posted at October 30, 2004 at 04:23 PM EST
last updated December 5, 2005

»» permalink | comments (12)

# Jon Stewart on C-SPAN

Jon Stewart, who hosts The Daily Show on Comedy Central, recently made widely reported comments on CNN's Crossfire. He criticised the media, saying they were "hurting America", and even called the show's hosts "hacks".

The day before the Crossfire appearance, on October 14th, he gave an hour-long interview on C-SPAN, where he went deeper into his criticisms of the media -- in front of members of the media. Not only is it funny, he brings up some good points about media responsibility.

Since C-SPAN is a non-profit organisation, I doubt they'd mind if I tell you that the interview is available on BitTorrent. Go check it out.

posted at October 30, 2004 at 01:35 PM EST

»» permalink | comments (1)

# Bell Mobility Billing Problems Update

NOTE: Here's the latest on this issue.

I'm getting a fair number of comments on my recent post about Bell Mobility's billing problems. I thought I would give a brief update on the situation.

After being double charged, I was credited by Bell for the month I was overcharged. Then just recently I received a cheque from Bell Mobility for roughly one month's worth of service. While I don't mind random cheques, now I'm really confused. No letter of explanation or account status was attached to the cheque.

I'm going to have to gather all of my Bell Mobility bills from the past year, match them up to my VISA bills and see where I stand. This is quite a pain, but not the same pain as paying too much. So things are better, but not ideal. I'm not having money problems, but I can imagine that an overcharge like this might hurt some families or people on lower fixed incomes/pensions.

Anyway, I think Bell Mobility really has some customer communication issues. I didn't find out about the billing problems until I tried to contact them. A month later my account was credited but there was no admission that a problem occurred. And now I'm getting a mysterious cheque with no explanation.

Bell, we're buds. Just tell me what's going on.

posted at October 29, 2004 at 12:33 AM EST

»» permalink | comments (32)

# Podcast This

The newest blogging fad is podcasting, a blend of "iPod" and "broadcasting".

Content providers act like radio hosts, recording audio for their listeners. But instead of broadcasting it live over the Internet, the audio is distributed via an RSS feed that supports an email-attachment-like feature called an enclosure.

These feeds can then be downloaded easily to an iPod using software on your PC. You can also just listen to the files on your PC, but one of the selling points is that you can listen to these audio shows on the move ... in the car, on the bus, on the subway, etc.
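For the curious, an enclosure is just an extra element on an RSS 2.0 item. Here's roughly what one looks like -- the show name, URLs and file details are made up for illustration:

```xml
<!-- A minimal RSS 2.0 item carrying an audio enclosure.
     Everything here is a made-up example. -->
<item>
  <title>Episode 12: Metadata Matters</title>
  <link>http://example.com/show/12</link>
  <!-- the enclosure works like an email attachment: the aggregator
       reads the URL, size in bytes and MIME type, then downloads it -->
  <enclosure url="http://example.com/audio/ep12.mp3"
             length="12582912"
             type="audio/mpeg"/>
  <pubDate>Mon, 25 Oct 2004 14:00:00 EST</pubDate>
</item>
```

Podcatching software polls the feed, notices new enclosures and fetches the files for you.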

The same old content-production-is-masturbation arguments are coming up again, just as they did with blogging. I think they have new validity with this format, if only because people have trouble getting to the point when they speak. I can't skip whole paragraphs of audio. I can't skip music tracks. These audio files have to be split up a bit and indexed properly to serve listeners. The indexes could be id3v2 tags, if only the players supported them.

The coolness factor also doesn't compensate for the fact that most people just don't sound good. Writing takes training, and so does audio broadcasting. Podcasters should put the same production effort into their podcasts that they put into editing their written words if they expect people to listen.

All that aside, I think that Rory Blyth has some great points about how this is yet another free/decentralized communication medium. I could see musicians using it very successfully to get their stuff noticed. Lucky for you, dear reader, I won't be doing a podcast any time soon.

posted at October 25, 2004 at 02:17 PM EST

»» permalink | comments (0)

# Project Quality as a Development Bottleneck

What's the bottleneck for software development? More directly: what area is important, takes a lot of time, and would decrease total development time if it were sped up? I'd have to say it's project quality.

If project quality is in check, the project is in a better position to release at any given time. Shipping the software is a primary goal of software development. Even if you aren't following XP's iterative "ship early, ship often" approach, ask yourself: is this task getting me closer to shipping the product? If it's not, question why you're even doing it.

Another reason project quality is important is that it's much easier to refactor and add features to well-tested code, because you'll know if you break anything. This puts you in a position to augment with confidence.

Without unit testing, you are adding code entropy without a safety net every time you touch the code: you are stirring up the codebase and adding defects. I don't care how good a programmer you are, you will add defects when you add features. It may seem much faster to add features this way, but it's a trap. In the long haul, the develop-now-test-later approach takes much longer to add features and bring them up to shippable quality. You're going to have to do the quality work anyway, so why not do it now?

So let's see ... how can we improve software quality? Unit testing is a good way, and if you go even further you can do test driven development. Developers sometimes resist this approach because they see testing as grunt work. This is old school thinking, and developers like this won't help your project -- they will hurt it. They are not thinking of the project, they are thinking of themselves.

Another way to improve quality is to throw people at the problem. This may seem like a stupid idea, but this is why open source projects succeed: the project managers leverage everyone that uses the software. You have to channel this feedback properly, organize it and get the feedback to the people that can make the changes but it seems to work really well. Using feedback is absolutely critical for success.

In fact it works so well in the open source world that it's seen as a viable replacement for unit testing on small projects. I don't completely agree with that sentiment, but having an army of free testers is better than nothing. The interesting side effect is that if you're not unit testing an open source project, you can get new (but initially lower-quality) features into the codebase faster, attracting more users on the bleeding edge, and then the project sometimes just snowballs. Quality is improved by listening to feedback. You walk a fine quality line, but you have to in order to respond to feedback quickly and attract more users, which improves quality. It's very cyclical.

Taking in and using the feedback isn't the hard part. The hard part is finding the people. On open source projects it seems to be a combination of the project's usefulness and the project manager's charisma and communication skills. On closed source projects users are much harder to find.

Number one, they usually cost money. Number two, you can't just use anybody to test your product because it might be a secret, so you need to test in house. Number three, your developers are biased and are poor testers by nature -- they can only do so much. All of those factors narrow your options. Closed source projects can't compete with open source projects' army of free testers, so they need to improve quality in other ways.

Planning a project up front (i.e. waterfall) is a defensive style that can improve project quality at the expense of agility. If your company can live with that, it's a good way to go. Iterative/agile approaches also seem to produce good results: they use customer feedback, iterations and unit testing to allow for high-quality but agile code that can respond to change ... at least that's the theory.

So back to the bottleneck issue now: how can we speed up project quality? I think there's lots of opportunity to improve testing tools, even just improving how we look at the tests themselves so that their effectiveness may be analysed easily. Continuous integration also helps to keep the project on track, and gives the project manager good feedback.

Code coverage tools like EMMA also help the testing effort, pointing out weak areas and making sure important/sensitive areas are well covered.

Couple that with more effective feedback management and defect triage and I think you'd speed up the whole development process a lot.

Someone is going to make a lot of money improving the testing experience. There's a big gap here and lots of opportunity.

posted at October 19, 2004 at 08:13 AM EST

»» permalink | comments (5)

# Bray to Americans: Fire Bush, Please

Tim Bray puts in words what a lot of Canadians and people worldwide are thinking: Americans should fire their CEO. Unfortunately what the world thinks may not affect Americans that much. And why should it? The USA is the only elephant left in a crowd of ants, to use the same analogy Roy used. All of the ants just wait to see where the elephant sits for the next 4 years and try not to get crushed.

Here's a choice quote anyway:

"Let’s completely ignore the subject of whether they’re right or not; maybe Dubya is an enlightened, straight-arrow kind of guy who is just misunderstood. But consider the consequences. If you’re running a company and there’s a general perception that your CEO is an asshole, eventually it won’t matter that much whether he really is or isn’t; the perception will become an obstacle. And right now, the United States of America is facing that obstacle."

More: Also linked by Bray is Russell Beattie telling people to lean into it. I support Kerry too, and not because he's not George W. Bush. Ad hominem attacks are weak. The reason is simple: Republicans are less liberal than Democrats.

I don't see much of a personal difference between the candidates that's unrelated to their party affiliations. They are both wealthy out-of-touch white males. They are both exaggerating(/lying/half-truth telling) politicians desperate to be in the White House for four years. So think about which party's policies you agree with and go with that candidate. Even if it's Bush I'll be happier knowing you did it because it aligns with you, not because Kerry smells bad or because you think Bush is an idiot.

If one of the candidates was obviously off his rocker, that might have an impact. But that's just not the case in this election, so why make it personal? It's not a character election. It's a tight race and both candidates are probably going to try to make it personal in the next few weeks. Don't let them get away with it.

posted at October 15, 2004 at 10:30 AM EST

»» permalink | comments (1)

# Google Desktop Search is Good for AudioMan

There's lots of buzz about Google Desktop Search, and why not? Windows should have had a better metadata-based file search 5 years ago. Google is filling a niche *right now*, but they'll be pushed out by Longhorn's integrated search if it's at least 75% as good, just as Internet Explorer pushed out Netscape.

Every time a searching tool makes big news, like this one from Google or Apple's announcement that searching will be a big part of the next OS X (10.4, Tiger), I think about how it affects the core ideas of the AudioMan project.

The bottom line is that searching depends on information. If the information cannot be read, then it cannot be searched. File formats have to be not only understood by the searching tool, but the information must also be filled out so that it can be searched.

This is why I wanted to shift AudioMan's focus from collection browser to collection organizer: a tool where incomplete metadata stands out and can be filled in by a user, or queried from an Internet database. Not all of this can or should be automated; the tool just needs to make it as easy as possible. File browsers don't show incomplete data because it gets in the way of regular, everyday use. This tool wouldn't be an everyday tool, so it's free of those constraints.

AudioMan also needs to bring incomplete data to the forefront, rather than hide it. Windows Explorer, WinAmp, iTunes and even Spotlight and Google Desktop search already don't care if you have incomplete metadata. Some people will need a tool that does.
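To make the idea concrete, here's a minimal sketch of what "bring incomplete metadata to the forefront" means. The field names and track data are made up for illustration, not taken from AudioMan:

```python
# Surface incomplete metadata instead of hiding it.
# REQUIRED and the sample library below are hypothetical examples.
REQUIRED = ("artist", "album", "title", "year")

def missing_fields(track):
    """Return the required metadata fields a track leaves empty."""
    return [f for f in REQUIRED if not track.get(f)]

library = [
    {"artist": "Interpol", "album": "Antics", "title": "Evil", "year": "2004"},
    {"artist": "Hail Social", "title": "Another Face"},  # no album or year
]

# An organizer would list these prominently so a user (or an Internet
# database lookup) can fill in the gaps:
incomplete = {t["title"]: missing_fields(t) for t in library if missing_fields(t)}
print(incomplete)  # {'Another Face': ['album', 'year']}
```

A file browser filters this view out; an organizer makes it the whole point.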

More: also interesting is that Desktop Search seems to support specific file formats, which to me suggests a plug-in-style architecture, though the search results pages seem to be customized for each supported file type.

Maybe Google will open up the API so that third parties can write support for the multitude of file formats out there. An obvious issue is quality; you don't want some third party app messing up file formats it doesn't understand. Maybe some companies can digitally sign their metadata plugins for their own formats.

Even more: Already people are complaining that Desktop Search doesn't support the file formats they use, even in the midst of their own enthusiasm. But that's not the point. The point is that Google shipped something, Microsoft and Apple have not, and it's only the beginning.

If Google manages this one right (i.e. supports a community effort, since support for all file formats is beyond their means), they could start a landslide of supported file formats with very little work required on Google's part. Yes, I realise that Dave may just be trying to spur this on by criticising. I admit that strategy sometimes works pretty well.

One thing that Google has to be careful of is standards. If they make a metadata description API to read data from files, it could become the de facto standard because they were first, and companies aren't going to want to specify their file format more than once. Good thing Google has a history of treading lightly. They just have to keep up the mantra "Don't be evil".

posted at October 14, 2004 at 02:03 PM EST

»» permalink | comments (3)

# Interpol at Le Metropolis, Montreal

The fact that Interpol played in Montreal on a "school night" wasn't going to stop me from seeing them, and I'm sure glad I went. The show impressed me on many fronts.

The first was the venue itself: Le Metropolis, an old theatre (among other things) turned concert hall. With a capacity of 2300, it puts every concert hall in Ottawa to shame, and the sound was great all night. There's a huge balcony on the second floor with about half a dozen tiers, each with a row of soft bar stools. The view was great up there too, but you'd probably have to arrive pretty early and stand in line outside to get one of those prime seats. The pit was the space under the balcony opening, and underneath the balcony the floor went up a step every ten feet or so, with a bar at the back. There didn't seem to be a bad spot in the place for view or sound.

The first band to open for Interpol was Hail Social from Philadelphia, PA. You can go to their web site and download a few songs if you want to check them out. The way I hear it, Hail Social is in the same bass- and drum-heavy alt.rock subgenre as bands like Interpol and The Stills -- right up my alley. The lead singer really impressed me belting out lyrics, and the drummer and bass player had some unique parts as well.

Hail Social has a habit of releasing self-made EPs at concerts that are often referenced by color: "pink" and "blue", but I can't find a site with a discography or tracklists for these EPs. I picked up a wicked little three-track EP of theirs at the concert that I guess I'll dub "green" (the insert is green, but the CD is black). I don't know the first two tracks, but the third is Another Face. Apparently it's at least the fifth self-made EP they've sold at their concerts. From the tracks I've downloaded from their website, it seems they have an eight-track full-length album on the way. I can't believe this band is unsigned!

The second opening band was The Secret Machines, who I definitely wanted to see live. They had a few up-tempo songs but were mostly lower key than the other two bands. Their setup was unique: they had the drums on one side of the stage facing the keyboards on the other side with the lead guitarist in the middle. Seeing a drummer from the side really gives you an appreciation for the work he's doing, and the drummer for TSM rocked.

The songs themselves were quite different from the album versions, which is cool for a live show ... but I found that many songs sounded a bit jumbled. I don't know if that's the effect they were going for, or just the acoustics of the place. The lead guitarist relished being the only standing member of the group and really wailed on his guitar whenever he didn't have to stand still to sing backup.

Interpol's performance was true to their CDs, which was definitely still good enough for me in this case. It was a completely different experience than listening to the "clean" CD versions -- the songs had more life, especially the drums, which were heavier. I was curious why most of the crowd didn't move through most of the performance. Personally, I can't help at least bobbing my head to Interpol.

All of the Interpol band members played in suits, with the exception of the drummer, who was in a red t-shirt. It was a pretty interesting contrast, though I'm not sure it was intentional ... practically speaking, drumming in a suit could just be too hot. :) They played a good number of songs from their debut album that the crowd was more familiar with, but most were from the recently released sophomore album "Antics". After two encores it was all over until the next one.

I look forward to going back to Le Metropolis! Maybe next time I won't get so impossibly lost in Montreal on the way home. I just need a better map. :)

some related songs

Hail Social
   Track #1
   Way Out
   Get In the Car**
   Another Face

** wasn't sure of the name of this one -- it was track 2 on the green EP I bought at the concert, and it didn't have a tracklist. Here's some Google cache evidence (while it lasts) that a song by this name exists.

The Secret Machines
   Sad and Lonely
   The Road Leads Where It's Led
   The Leaves are Gone
   Nowhere Again

   Length of Love
   Obstacle 1

bonus track to check out:

Hot Hot Heat - Touch You Touch You

posted at October 14, 2004 at 01:06 PM EST

»» permalink | comments (2)

# Where are the Software Project Management Case Studies?

Software engineering and software project management are fairly young fields compared with more traditional business project management. However, I find it interesting that there isn't as much of an emphasis on case studies as there is in business.

The software studies I'm familiar with tend to focus on abstract ideas, research projects, or cumulative statistics combined from many different projects. This is not the same as watching a software project progress and learning from the specific mistakes that the project managers made, learned from and responded to. These lessons are invaluable to people learning the business, and are often learned on the job. But why make these mistakes yourself when other people can make them for you?

Now don't tell me the reason that we don't have case studies in software project management is because of fierce competition or secrecy. There's just as much of that in the rest of the business world, and I don't buy it.

And I'm also not buying the excuse that technology is too hard to keep up with, which would mean that case studies in software engineering would go stale faster than business case studies. Traditional business changes just as quickly, and is accelerated by the same technology that software project managers take advantage of.

If anything, the disparity in business project management is larger than in software project management; there are more people who get away with not knowing technology, fear it, or resent technological progress. But in the long run these are the people who will get beaten by business people who keep up with the technology that can help them. For the same reasons, technology is likely to play a role in business case studies.

With a lack of good software project management case studies, what is a curious software engineer to do? One (slightly) obvious suggestion is to look at open source projects. Look at successful ones and definitely some failures, and try to figure out what happened. See what was hard work, what was dumb luck and what were just good calculated guesses. Look at projects of different sizes and see what has to be done differently depending on size, from one-man projects all the way up to huge ones like the Linux kernel and Mozilla.

This information is out there and public for anyone to get their hands on. It's a lot easier than banging your own head against a wall for the next 30 years.

posted at October 12, 2004 at 04:40 AM EST

»» permalink | comments (2)

# The Code is the BVA?

I may just have had my programming world turned upside down today. It's funny how that happens sometimes. It started out innocently enough: I was reading the darcs mailing list. From there I went on to investigating Haskell, which led me to the best Paul Graham article I've read (and I've disagreed with him before, but that's OK), titled Beating the Averages.

I was indeed assured by Paul's statement: "The purpose of this article is not to change anyone's mind, but to reassure people already interested in using Lisp-- people who know that Lisp is a powerful language, but worry because it isn't widely used." His anecdote about selling a successful Lisp-based company to Yahoo! also helped, naturally.

But the lightbulb went off while I was reading Why does Haskell matter?. Haskell and Lisp are both functional programming languages. I've been racking my brain over the general problem of testing for weeks, honestly ... and it seemed pretty hopeless and, even worse, manual.

It got to the point that I realised that if you want to do really good boundary value analysis (BVA), a common unit testing technique, you end up specifying the function in two equivalent ways: once in the implementing code, and again in the tests that verify it. The tests work because these two different ways of saying the same thing lean on each other like two playing cards.

If the code and the tests disagree about something, a unit test might fail [1]. The mistake could be in the code or in a test; it's hard to tell when all you know is that it failed. Either way, it means there's probably a bug somewhere.

It should be very rare to get a bug duplicated identically in both test and code, unless you've misunderstood the spec you're implementing. Remember that your tests can have bugs too, and the code might catch those errors, or it might not. The fact that the tests and the code are two different ways of looking at the same problem is essential to the leaning-against-each-other effect.

Back to Haskell now. When you do boundary value analysis in an imperative language like Java, you might picture an actual mathematical function. You figure out boundaries based on the math function, write tests for those conditions, and then write Java code that makes the tests pass. The tests are based on the math function; the code is based on if statements, loops and variable assignments. Much different, right?
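To make the two-specifications idea concrete, here's a tiny boundary value analysis sketch. The post talks about Java, but the idea is language-neutral, so this is in Python; the clamp function under test is a made-up example:

```python
# A function under test: imperative code with ifs...
def clamp(x, lo, hi):
    """Constrain x to the closed interval [lo, hi]."""
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

# ...and BVA tests derived from the mathematical picture of clamp:
# probe each boundary and its neighbours, where off-by-one mistakes
# in the implementation would show up first.
cases = [(-1, 0), (0, 0), (1, 1), (9, 9), (10, 10), (11, 10)]
for x, expected in cases:
    assert clamp(x, 0, 10) == expected
print("all boundary cases pass")
```

The tests and the code describe the same function in two different ways, which is exactly the card-leaning effect.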

In Haskell you quite literally write the math function out as your code. The lightbulb went off when I realised: holy crap, the code is also the BVA! Instead of code and test being two different ways of doing the same thing, unit testing a Haskell function would just be doing the same thing twice. That doesn't have the card-leaning effect; you're just duplicating.

This might be why Haskell has no unit testing framework -- it doesn't need one. In fact, there is no HaskellUnit! This is so counterintuitive to everything I've been learning from the XP bandwagon that it will probably take me weeks to sink in completely. (update: there is a Haskell unit testing framework after all: HUnit.)

Maybe it's all just a bunch of idealistic crap though. How can I know right now? There's no way. I won't know until I dive in head first. I think now's the time to learn a functional language.

Note what that post above says about acceptance testing though: it's still necessary. I wonder if Haskell suffers from the same level of integration problems (functions using functions using functions) as imperative languages.

[1] You could be missing a test, after all, and then everything would seem peachy. Just because all of your tests pass doesn't mean you have no bugs in your code. In other words, if you don't test for it, you won't know if it fails.

posted at October 08, 2004 at 06:08 PM EST

»» permalink | comments (0)

# History Repeating Repeating

On the id3v2test project I'm generating unit tests from XML. Since the generation is automatic and effortless, it opens the door for interesting things.

For example, in the id3v2 specifications there are many frame types. Each frame type consists of fields where data is stored. I made my own list of these fields from the spec and put them in a clearer format.

As you can see, a lot of the same field types are used over and over, so I can save a lot of effort by only making one set of tests for each field type.

This is an acceptance test suite, so I'll have to generate the field type tests every time I see that field type in a frame type. The reason is that I don't know how the id3v2 library being tested is implemented. I can't just assume that because a field type is handled correctly in one frame type it's handled correctly in another; I have to test every variation and permutation explicitly. But luckily for me, the XML tests and generation code let me do that quite easily.
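Here's a rough sketch of that generation idea in Python. The XML format, frame IDs and per-field-type test cases below are all hypothetical -- just enough to show one canonical set of field-type tests being re-emitted for every frame type that uses the field:

```python
# Sketch: generate per-frame field-type tests from an XML spec.
# The XML schema and the test-case names are made up, not the
# real id3v2test format.
import xml.etree.ElementTree as ET

SPEC = """
<frames>
  <frame id="TIT2"><field type="encoding"/><field type="textstring"/></frame>
  <frame id="TALB"><field type="encoding"/><field type="textstring"/></frame>
  <frame id="APIC"><field type="encoding"/><field type="binarydata"/></frame>
</frames>
"""

# One canonical set of test cases per field type, written once...
FIELD_TESTS = {
    "encoding":   ["valid_byte", "invalid_byte"],
    "textstring": ["empty", "ascii", "unicode"],
    "binarydata": ["empty", "nonempty"],
}

def generate_tests(spec_xml):
    """...but emitted again for every frame type that uses the field,
    because the library under test might handle each frame differently."""
    tests = []
    for frame in ET.fromstring(spec_xml).findall("frame"):
        for field in frame.findall("field"):
            for case in FIELD_TESTS[field.get("type")]:
                tests.append(f"{frame.get('id')}_{field.get('type')}_{case}")
    return tests

tests = generate_tests(SPEC)
print(len(tests))  # 14: 5 each for TIT2 and TALB, 4 for APIC
```

Every variation is covered explicitly, and the suite grows automatically as frame types are added to the XML.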

While this strategy might generate a very large test suite, it seems like the easiest and best way.

update: Here are the field types...


posted at October 04, 2004 at 05:07 PM EST

»» permalink | comments (0)

# audioman dot re-org

I have reorganized the audioman.org website to include the two new projects: id3v2test and jid3rL. I've also moved the old AudioMan project to its own area, now that it's on hold.

There's more information about these projects on the site and I'm keeping it pretty plain for now. As always, comments are welcome.

posted at October 04, 2004 at 08:23 AM EST

»» permalink | comments (0)

# The Stills at Capital Music Hall, Ottawa

I saw The Stills Thursday night at Capital Music Hall here in Ottawa. I'll get my bitching out of the way straight off just to be done with it: the audio was bad again at CMH (see my account of Thornley at CMH) ... there was very audible crackling over the main speakers, especially during the opening act. Maybe I'm expecting too much from a small bar, I dunno ... I haven't been to many live shows (fewer than a dozen). OK, I'm done. :) hehe

The Stills are a great live band -- they have lots of energy on stage and improvise a lot, especially the drummer and the bassist. It's great when you go to a concert and hear every song on a band's album done differently. It's a whole other window on the music, and it looked like they were having a lot of fun not taking themselves too seriously. If more bands were like this live I might go to more concerts!

During their last song ("Yesterday Never Tomorrows"), which the drummer sings, they brought up a guy from Broken Social Scene (who, if I'm not mistaken, was playing with the opening act -- they're label-mates) and sang some of their lyrics ("bleachin your teeth...").

The opening act was Jason Colette, who I can best describe as an amalgam of Wilco, Blue Rodeo, Pete Yorn and others. Nothing really stuck out and made me go "wow!", but it was good low-key rock with a subtle country influence.

The show was a well spent 20 bucks!

some related songs

Broken Social Scene
   Looks Just Like the Sun
   KC Accidental
   Stars and Sons
   Anthems For a Seventeen Year-Old Girl

The Stills
   Of Montreal
   Lola Stars and Stripes
   Changes are No Good
   Still in Love Song
   Killer Bees

posted at October 02, 2004 at 04:02 PM EST

»» permalink | comments (0)

# Bell Mobility Billing on the Fritz

NOTE 2: Here's the latest on this issue.

NOTE: I have written an update to this story.


If you're a Bell Mobility cellular customer you'll want to check this month's bill carefully. They are having technical difficulties with billing.

My cell phone bill is automatically paid by VISA card, so you can imagine my surprise when Bell billed $130 to my VISA this month. I waited a week for the phone bill to come in the mail to find out what happened. Scenarios ran through my head ... maybe I accidentally left my phone on during a call to Tangiers ... maybe my cell ID was stolen and people were making illegal calls with it! I was definitely curious...

Turns out their billing system thought my credit card had expired, so the "balance" from last month was carried forward. BUT I was charged for last month on my VISA card PLUS this month's balance, so I paid for last month TWICE. When I called the customer service line I got an automated message saying things would be sorted out by November. Probably with a credit to my account, but YIKES.

The cause of the problem? A botched software upgrade. Ouch. I was thinking of switching away from Bell anyway ... this might be the last straw.

update: Bell has credited me for the extra month I paid. I still need to change my phone plan though ... not easy when the Bell support lines are jammed dealing with this stuff.

posted at October 01, 2004 at 09:53 PM EST

»» permalink | comments (45)

# Refactoring XML ... Difficult?

I was really worried about thinking ahead on my XML format for id3v2test, and then I realised: it's XML, and it can be transformed with XSL into just about any other XML schema.

If I had 1000 tests in XML, I could update them all fairly easily. I wouldn't have to worry about being unable to refactor them, because XSL could be my refactoring mechanism.

How much of a pain would this be, though? It might be difficult if you don't know XSL that well, but at least your acceptance tests (and metadata) aren't locked into a vendor-specific binary format or, worse, a specific programming language. You have a way out.
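For example, an XSL refactoring can be as small as the standard identity transform plus one override. This sketch (with hypothetical element names, not the real id3v2test format) renames one element and copies everything else through unchanged:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- identity transform: copy every node and attribute as-is -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- the one refactoring: rename <expected> to <expectedValue>
       across all 1000 test files in a single pass -->
  <xsl:template match="expected">
    <expectedValue>
      <xsl:apply-templates select="@*|node()"/>
    </expectedValue>
  </xsl:template>
</xsl:stylesheet>
```

Run it over every test file and the whole suite is migrated to the new schema in one shot.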

posted at October 01, 2004 at 12:55 PM EST

»» permalink | comments (0)
