Saturday, 05 August 2006

Teaching computers what humans already know

I just watched a version of this talk by Luis von Ahn on Human Computation. I watched it on UWTV, but this one on Google Video is much easier to use online.

This is the sort of thing that will drive real gains in artificial intelligence. Get your Humans United Against Robots T-Shirt today ;-)

Seriously though, check out that talk. It is fascinating.

08/05/2006 21:38:55 (Central Standard Time, UTC-06:00)  #    Trackback

 Friday, 28 July 2006

Misapplying the idea "Commoditizing Your Complements"

Dare is correct. I misapplied the idea of "Commoditizing Your Complements". I'll blame it on my wishful thinking that there would be an advantage for Google to release GFS & BigTable into the wild.

I do think there is an opportunity for Yahoo! to marginalize the advantage that GFS & BigTable give Google. I don't think it is fair to compare Hadoop to Mozilla. Mozilla was a much more speculative project. There are plenty of companies that have an interest in seeing something like Hadoop succeed. Those same companies have a real interest in helping develop it. I think it would be more accurate to compare Hadoop to JBoss, Linux, and/or Apache.

It is a mistake to think of this business as just selling advertising real estate. Maybe that is true for Yahoo! and MSN. But it is not true for Google. Google is building an advertising platform. I think Yahoo! and MSN are trying, much less successfully, to do the same. Right now Google's software gives them huge advantages in the Ad Platform space. But if Yahoo! and MSN can catch up then we're in for another platform war. And frankly I think the coming Ad Platform war will make the browser war look tame.

The thing that scares me the most is this: what happens to my data if Google starts to lose that war? I recently started using Gmail for all my email. Right now Google makes it possible, but not easy, to get my mail out. But if they start to lose ad revenue to Yahoo!, MSN, or someone else, will they try to lock me into Gmail? The same fears would apply to MSN & Yahoo! if I used their services instead.

Google seems to be re-building Hailstorm. But the trust problem that Microsoft had with Hailstorm wasn't just because they are Microsoft. Google will have the same trust problem. It just might take people longer to catch on.

I won't trust anyone with my data. For something like Hailstorm to work we need a federated storage system that is separate from the services that use the data. WinFS was one attempt to solve part of this problem. Ideally this would be a P2P system similar to GFS & BigTable. But Amazon is showing that something like S3 could work here too.

I wish I had more time to think about and work on this space.

07/28/2006 08:36:08 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 12 July 2006

HOWTO: Installing SQLite 3 on Windows for use in Ruby on Rails

Installing SQLite 3 on Windows is pretty easy once you know the steps, but when I started using Ruby on Rails I struggled to figure them out. Here's a short screencast that demonstrates how to install SQLite 3 on Windows so you can use it in your Ruby on Rails applications.
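If you'd rather not wait for the video, the short version is roughly this: download sqlite3.dll from sqlite.org and drop it somewhere on your PATH (ruby\bin works fine), then install the sqlite3-ruby gem. Here's a quick sanity check from the console; the version number you get back will obviously depend on the DLL you downloaded:

C:\> gem install sqlite3-ruby
C:\> irb
>> require 'rubygems'
=> true
>> require 'sqlite3'
=> true
>> SQLite3::Database.new('test.db').execute('select sqlite_version()')
=> [["3.3.5"]]

After that it is just a matter of pointing config/database.yml at the sqlite3 adapter.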

A flash version will eventually be available here.

07/12/2006 18:15:20 (Central Standard Time, UTC-06:00)  #    Trackback

 Tuesday, 11 July 2006

Do you want to play with the latest REST stuff in Rails?

At RailsConf 2006 David Heinemeier Hansson talked about the new REST features coming to Rails. If you want to play with this stuff you'll need to know where it lives.

There are two pieces you need. ActiveResource is the piece that lets you consume REST services as if they were ActiveRecord database-backed models. ActiveResource is part of Rails 1.1.4. But ActiveResource has nothing to do with exposing your application's resources in a RESTful way.

To expose your application's resources you need the simply_restful plugin. David did mention simply_restful during his keynote, but if you're like me, you didn't pick up on that the first time through ;-)

I'm not sure what versions support what features. I'm using Edge Rails right now. You may have everything you need in Rails 1.1.4 + simply_restful. But you may need to bleed on the Edge to keep up with any changes that happen on the way to the Rails 1.2 release.
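To make that a little more concrete, here is roughly what the two pieces look like in use. This is just a sketch against the APIs as they exist on Edge today, so expect the details to shift before 1.2, and the Person/people names are only for illustration:

# Consuming someone else's RESTful service with ActiveResource
class Person < ActiveResource::Base
  self.site = "http://localhost:3000/"
end

person = Person.find(1)   # GET /people/1.xml
person.name = "Nathan"
person.save               # PUT /people/1.xml

# Exposing your own resources with simply_restful, in config/routes.rb
ActionController::Routing::Routes.draw do |map|
  map.resources :people   # index, show, create, update, and destroy routes
end

The nice part is the symmetry: map.resources exposes a model over HTTP, and ActiveResource consumes something exposed that way as if it were a local model.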

07/11/2006 18:27:10 (Central Standard Time, UTC-06:00)  #    Trackback

 Sunday, 09 July 2006

David Heinemeier Hansson RailsConf 2006 Keynote is now online

Well it took longer than a few minutes, but DHH's keynote is now online.

07/09/2006 20:49:16 (Central Standard Time, UTC-06:00)  #    Trackback

Get started with Ruby on Rails in less than 5 minutes

I first started looking at Ruby on Rails more than a year ago. I even bought the first edition of the Rails book as soon as it was released. But I didn't do much with Rails until recently. I didn't want to struggle with setting up Ruby, Rails, Apache, and MySQL. I've installed Apache and MySQL on Windows before and it wasn't much fun.

But now you don't have to install anything to try Rails. InstantRails includes everything you need for a fully working Ruby on Rails environment. You just unzip InstantRails to a folder and you are ready to go. If you decide you want to get rid of it, just delete the InstantRails folder and your machine is back to normal.

I think this is such a big deal for Windows developers that I created a screencast showing how you can get a new Ruby on Rails application running on Windows in under 5 minutes.
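If you just want the punch line, once you open a console that has InstantRails' copy of Ruby on its PATH, the whole thing boils down to three commands (the application name here is made up):

C:\rails_apps> rails cookbook
C:\rails_apps> cd cookbook
C:\rails_apps\cookbook> ruby script/server

Then browse to http://localhost:3000 and you should see the Rails welcome page.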

5 minutes to Ruby on Rails nirvana

A flash version will eventually be available here.

07/09/2006 10:39:40 (Central Standard Time, UTC-06:00)  #    Trackback

 Friday, 07 July 2006

Why would Yahoo support an open source version of the Google File System?

Yahoo! is supporting the development of Hadoop. Hadoop is an open source project that is working to create a Distributed File System (think Google File System) and an implementation of MapReduce.

I find this effort by Yahoo! to be rather interesting given that platform pieces like GFS, BigTable, MapReduce and Sawzall give Google quite the edge in building mega-scale services and in Greg Linden's words are 'major force multipliers' that enable them to pump out new online services at a rapid pace. I'd expect Google's competitors to build similar systems and keep them close to their chest not give them away. I suspect that the reason Yahoo! is going this route is that they don't have enough folks to build this in-house and have thus collaborated with Hadoop project to get some help. This could potentially backfire since there is nothing stopping small or large competitors from reusing their efforts especially if it uses a traditional Open Source license. [Dare Obasanjo]

Dare is surprised that Yahoo! is working on open source versions of tools that could give them a competitive advantage. I can't tell if he thinks Yahoo! is making a mistake by doing this though. I suspect he does.

That is exactly why I don't trust Microsoft as a platform vendor. If one of Dare's ideas gives MSN a competitive advantage, what do you think the odds are that we'll see that idea rolled into the .NET Framework? I think the odds are close to 0.

There is nothing wrong with that. That doesn't make Dare a bad guy and it doesn't make Microsoft evil. But it does make Microsoft a poor choice as a platform vendor. I'm tired of waiting years to get access to Microsoft's second hand, second best ideas.

So why would Yahoo! do this? Why would they create open source versions of tools that could give them a short-term competitive advantage? I think Joel Spolsky said it best:

Smart companies try to commoditize their products' complements. [Joel Spolsky]

Yahoo! is not in the business of selling software. They sell advertising. Software is one of their biggest complements. The best way to commoditize software is to open source it.

Google could probably gain some competitive advantage by developing their own operating system. But they don't. I wonder why.

07/07/2006 13:37:22 (Central Standard Time, UTC-06:00)  #    Trackback

 Thursday, 06 July 2006

Rails doesn't need "Enterprise" features

Dave Thomas asked the Rails community to add "Enterprise" features during his RailsConf 2006 keynote.

I had been disheartened earlier in the day by Dave Thomas's talk (no offense, Dave!) Dave's a great guy and has been great to the Ruby community. And although I agreed with some of what Dave said, I couldn't have disagreed more with his view on changing Rails to play better in legacy environments. I sure was relieved to hear DHH's talk the following night. But I still recommend that you view his keynote, too: see what you think and let us know your thoughts. [Softies On Rails]

I agree with Jeff from Softies On Rails. I respect Dave Thomas a lot, but I did not care for the message he brought to RailsConf. At times his keynote was quite condescending. I don't think Dave meant it that way, but all his talk of "in the real world" probably wasn't received well by the RailsConf audience.

I work in the "real world" that Dave was talking about. It sucks! I can't wait to get out. Rails is a breath of fresh air precisely because it doesn't target the "enterprise". It was built by an agile team to create new agile web applications. If you need to create a new agile web application then Rails is a perfect match. But if you need to create yet another big upfront designed enterprise monstrosity, Rails is probably not going to work for you.

Dave also talked about improving the deployment of Rails applications. This is more applicable to the general Rails community. But I disagree with his idea that developers shouldn't be worried about how the application is going to run. That is a mistake. It sounds good in theory, but it ends up creating a situation where the developers make decisions that make the system almost impossible to maintain in production. For small teams you are much better off requiring the developers to own the entire system. If possible they should be responsible for testing, customer support, operations, design and development. As soon as you relieve them of responsibility in any of these areas you can guarantee they are going to make decisions that make it more difficult to support the application in that area.

I'm glad that the core team is focused on solving their problems not some enterprise's problems. Because their problems are my problems. It is strange to work with a platform that is built by people who actually use the platform to build real applications. I've spent so many years depending on platforms that Microsoft creates but doesn't actually use that I didn't realize how much I was missing.

Like Jeff said, watch Dave's keynote and let us know what you think. I am looking forward to seeing DHH's keynote.

07/06/2006 20:51:51 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 05 July 2006

David Heinemeier Hansson's RailsConf 2006 Keynote should be available any minute now

I almost missed this. The Keynotes from RailsConf 2006 are being published on ScribeMedia's site. DHH's Keynote was scheduled for release today so it should be available any time now. In the meantime you can see Dave Thomas' and Martin Fowler's keynotes.

07/05/2006 21:42:11 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 21 June 2006

Dates loaded from the database are not valid when using validates_date_time with us_date_format

I spent quite a while trying to track down a problem in the Rails validates_date_time plugin. In the end the fix was simple, but since I'm new to Rails and Ruby I assumed I was doing something wrong. I finally decided to dig into the code and tests for validates_date_time to see if I could find a bug.

The Problem: I have a model called Person. The schema looks like so:


ActiveRecord::Schema.define(:version => 1) do
  create_table "people", :force => true do |t|
    t.column "name", :string
    t.column "date_of_birth", :datetime
  end
end

I set ActiveRecord::Validations::DateTime.us_date_format = true.

I can create a new Person just fine like so:


ruby script/console
>> require 'pp'
=> true
>> p = Person.create(:name => "Test", :date_of_birth => Date.new(1972, 12, 31))
>> p.valid?
=> true

Great. But when I try to load the person from the database there is a problem.


>> p = Person.find_by_name("Test")
=> #"Test", "id"=>"2", "date_of_birth"=>"
1972-12-31"}>
>> p.valid?
=> false
>> p.errors.on(:date_of_birth)
=> "is an invalid date"

What happens if I set ActiveRecord::Validations::DateTime.us_date_format = false? Let's see.


>> ActiveRecord::Validations::DateTime.us_date_format = false
=> false
>> p = Person.find_by_name("Test")
=> #"Test", "id"=>"2", "date_of_birth"=>"
1972-12-31"}>
>> p.valid?
=> true

The problem seems to be related to the use of us_date_format. So what is going on?

The validates_date_time plugin takes each attribute that is passed to validates_date and examines the *_before_type_cast version of that attribute. In our case it is looking at p.date_of_birth_before_type_cast. Here's what it sees:


>> ActiveRecord::Validations::DateTime.us_date_format = true
=> true
>> p = Person.find_by_name("Test")
=> #"Test", "id"=>"2", "date_of_birth"=>"
1972-12-31"}>
>> p.date_of_birth_before_type_cast
=> "1972-12-31"

So the raw value of date_of_birth is "1972-12-31". That looks perfectly reasonable: it is an ISO date, and ISO dates are the best way to represent dates because they are easily parsed. So why is that date considered invalid?

The validates_date_time plugin uses a parse_date method to parse date values. Here's what it looks like:


def parse_date(value)
        raise if value.blank?
        return value if value.is_a?(Date)
        return value.to_date if value.is_a?(Time)
        raise unless value.is_a?(String)
        
        year, month, day = case value.strip
                # 22/1/06 or 22\1\06
                when /^(\d{1,2})[\\\/\.:-](\d{1,2})[\\\/\.:-](\d{2}|\d{4})$/ then [$3, $2, $1]
                # 22 Feb 06 or 1 jun 2001
                when /^(\d{1,2}) (\w{3,9}) (\d{2}|\d{4})$/ then [$3, $2, $1]
                # July 1 2005
                when /^(\w{3,9}) (\d{1,2}) (\d{2}|\d{4})$/ then [$3, $1, $2]
                # 2006-01-01
                when /^(\d{4})-(\d{2})-(\d{2})$/ then [$1, $2, $3]
                # Not a valid date string
                else raise
        end
        
        month, day = day, month if ActiveRecord::Validations::DateTime.us_date_format
        
        Date.new(unambiguous_year(year), month_index(month), day.to_i)
rescue
        raise DateParseError
end

The last when statement is where our ISO date of "1972-12-31" matches. At that point year = "1972", month = "12", and day = "31". But then, if us_date_format is true, the values of month and day get swapped. Now year = "1972", month = "31", and day = "12".

But that can't be right. The unit tests for validates_date_time all run clean and the date_test.rb has test cases for us_date_format. So what gives?

Here are the tests to prove it:


def test_us_date_format
        with_us_date_format do
                {'1/31/06'  => '2006-01-31', '2\28\01'  => '2001-02-28',
                '10/10/80' => '1980-10-10', '7\4\1960' => '1960-07-04'}.each do |value, result|
                        assert_update_and_equal result, :date_of_birth => value
                end
        end
end

Running the Plugin tests (after configuring a database for the validates_date_time plugin to use) results in this:


c:> rake test:plugins
Started
.....................
Finished in 0.703 seconds.

21 tests, 121 assertions, 0 failures, 0 errors

The problem is that the test_us_date_format is not testing a date like ours. Let's change it to look like this:


def test_us_date_format
        with_us_date_format do
                {'1/31/06'  => '2006-01-31', '2\28\01'  => '2001-02-28',
                '10/10/80' => '1980-10-10', '7\4\1960' => '1960-07-04',
                '1972-12-31' => '1972-12-31'}.each do |value, result|
                        assert_update_and_equal result, :date_of_birth => value
                end
        end
end

That says: when us_date_format is true, I expect to get the same value back when I update a date_of_birth field with an ISO-formatted date of '1972-12-31'. What happens when we run the tests now?


c:> rake test:plugins
Started
........F..F.........
Finished in 0.75 seconds.

1) Failure:
test_us_date_format(DateTest)
        [./vendor/plugins/validates_date_time/test/abstract_unit.rb:44:in `assert_update_and_equal'
        ./vendor/plugins/validates_date_time/test/date_test.rb:70:in `test_us_date_format'
        ./vendor/plugins/validates_date_time/test/date_test.rb:69:in `test_us_date_format'
        ./vendor/plugins/validates_date_time/test/abstract_unit.rb:65:in `with_us_date_format'
        ./vendor/plugins/validates_date_time/test/date_test.rb:66:in `test_us_date_format']:
{:date_of_birth=>"1972-12-31"} should be valid.
<false> is not true.

2) Failure:
test_various_formats(DateTimeTest)
        [./vendor/plugins/validates_date_time/test/abstract_unit.rb:50:in `assert_update_and_match'
        ./vendor/plugins/validates_date_time/test/date_time_test.rb:12:in `test_various_formats'
        ./vendor/plugins/validates_date_time/test/date_time_test.rb:11:in `test_various_formats']:
<"Tue Jan 03 19:00:00 Central Standard Time 2006"> expected to be =~
.

21 tests, 114 assertions, 2 failures, 0 errors
rake aborted!
Command failed with status (1): [c:/ruby/bin/ruby -Ilib;test "c:/ruby/lib/r...]

(See full trace by running task with --trace)

The first failure is the one we're interested in. It demonstrates the problem we're seeing. Let's see if we can fix it.

Replace the parse_date method in validates_date_time.rb with this:


def parse_date(value)
        raise if value.blank?
        return value if value.is_a?(Date)
        return value.to_date if value.is_a?(Time)
        raise unless value.is_a?(String)
        
        year, month, day, is_iso = case value.strip
                # 22/1/06 or 22\1\06
                when /^(\d{1,2})[\\\/\.:-](\d{1,2})[\\\/\.:-](\d{2}|\d{4})$/ then [$3, $2, $1]
                # 22 Feb 06 or 1 jun 2001
                when /^(\d{1,2}) (\w{3,9}) (\d{2}|\d{4})$/ then [$3, $2, $1]
                # July 1 2005
                when /^(\w{3,9}) (\d{1,2}) (\d{2}|\d{4})$/ then [$3, $1, $2]
                # 2006-01-01
                when /^(\d{4})-(\d{2})-(\d{2})$/ then [$1, $2, $3, true]
                # Not a valid date string
                else raise
        end
        
        month, day = day, month if !is_iso && ActiveRecord::Validations::DateTime.us_date_format
        
        Date.new(unambiguous_year(year), month_index(month), day.to_i)
rescue
        raise DateParseError
end

We set a local variable is_iso to true when the date matches the ISO-format when clause, and when is_iso is true we skip swapping the month and day values. Let's run the tests and see if this works:


c:> rake test:plugins
Started
.....................
Finished in 0.688 seconds.

21 tests, 123 assertions, 0 failures, 0 errors

Perfect. Now everything should be working in the console too. Let's check:


>> ActiveRecord::Validations::DateTime.us_date_format = true
=> true
>> p = Person.find_by_name("Test")
=> #"Test", "id"=>"2", "date_of_birth"=>"
1972-12-31"}>
>> p.valid?
=> true

Looks good. I think that should do it. I wonder if this explains why the validates_date_time plugin is only rated 3 out of 5 stars here. It is a great plugin; it just looks like the us_date_format option might be a little rough around the edges.

06/21/2006 09:28:16 (Central Standard Time, UTC-06:00)  #    Trackback

 Tuesday, 20 June 2006

Using In Memory Database for RubyCLR Tests

Even better than just using a SQLite database like I talked about yesterday, I'd love to use a SQLite 3 in-memory database for the RubyCLR tests.
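Something like this ought to do it. It's only a sketch (I haven't wired it into the RubyCLR tests yet, and the Rails 1.1-era sqlite3 adapter may want :dbfile instead of :database), but it shows the idea:

require 'rubygems'
require 'active_record'

# Point ActiveRecord at an in-memory SQLite 3 database.
# Note: older sqlite3 adapters call this option :dbfile rather than :database.
ActiveRecord::Base.establish_connection(
  :adapter  => 'sqlite3',
  :database => ':memory:'
)

# Build the test schema on the fly; nothing ever touches the disk.
ActiveRecord::Schema.define do
  create_table :people, :force => true do |t|
    t.column :name, :string
  end
end

Every test run would get a pristine database for free, and there would be no SQL Server Express dependency to install.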

06/20/2006 08:53:46 (Central Standard Time, UTC-06:00)  #    Trackback

 Monday, 19 June 2006

RubyCLR Drop 4 is Ready

John has released RubyCLR Drop 4. It is fully hosted on RubyForge now. That is great because the other day I had to use Subversion to get the latest version when I wasn't able to connect to John's site for some reason.

I haven't spent a lot of time with RubyCLR yet, but I have a couple of changes I'd like to make if I have the time. I'm not complaining mind you, John is doing a great job making progress on RubyCLR. Consider this my TODO list of things I'd like to contribute if I have time. And if the Lazy Web beats me to these, I guess that'd be ok too ;-)

  1. I wish the tests didn't depend on SQL Server Express. I'd love to have the option to connect to a SQLite file instead. And since neither SQL Server nor SQLite connectivity ships with the One Click Installer it isn't any easier to get one working over the other.
  2. Unless I missed it, there is no schema for the test database included in the source repository. Personally I'd like to see a Schema.rb using ActiveRecord::Schema.define (something like the sketch after this list). That way even if the default was to use SQL Server Express I could quickly get rolling with my preferred database engine.
  3. Speaking of making it easier to switch database engines, it would be nice if the database connection settings were pulled out into a single YAML file.
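To give a sense of how little item 2 is actually asking for, a Schema.rb for the test database could be as small as this (the table and columns are invented for illustration; they are not RubyCLR's actual test schema):

# test/schema.rb - define the test database in one database-agnostic place
ActiveRecord::Schema.define do
  create_table :customers, :force => true do |t|
    t.column :name,  :string
    t.column :email, :string
  end
end

Combine that with item 3's single YAML file for connection settings and switching the tests from SQL Server Express to SQLite becomes a one-line change.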

Keep up the great work John!

06/19/2006 19:17:45 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 24 May 2006

Can Ruby on Rails migrations be used to help version public Xml APIs?

When I first watched the Ruby on Rails migration screencast the first thing I thought was, wow, that is exactly what I've dreamed of having to solve the problem of versioning an Xml API.

Ok I lie, the first thing I thought was, holy crap, you mean I wouldn't have to maintain a bunch of one-way SQL scripts for migrating a database?

I have no clue what it would take to get Migrations working over an Xml document format(s). Maybe ActiveRecord could be made to work over Xml, or maybe the ideas behind Migrations could be applied some other way. Eventually I want to dig into that. But maybe the lazyweb will beat me to it.
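For anyone who hasn't watched that screencast, this is the shape that makes Migrations so appealing: every schema change is a little class that knows how to apply itself and how to undo itself. (A stock example; there is nothing Xml-specific about it.)

class AddMiddleNameToPeople < ActiveRecord::Migration
  # Applied when migrating the schema forward to this version.
  def self.up
    add_column :people, :middle_name, :string
  end

  # Applied when rolling the schema back past this version.
  def self.down
    remove_column :people, :middle_name
  end
end

Now imagine the same kind of up/down pairs describing changes to an Xml document format, with the current version number carried along with the documents. That is the part I have no idea how to build yet.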

05/24/2006 08:18:30 (Central Standard Time, UTC-06:00)  #    Trackback

 Tuesday, 23 May 2006

Top 13 reasons NOT to consider the Microsoft platform for Web 2.0 development

Some background: My frustration with Microsoft has grown over the last several years. Now that I have my own web project my perspective has changed drastically. I am still an ASP.NET developer in my day job, and that probably won't change in the immediate future, but I am now looking for alternatives to Microsoft's platform for my personal projects. From what I've seen so far Ruby on Rails looks very good.

I saw this and it really helped focus some of my recent thinking about Microsoft & their web development platform. I thought it'd be interesting to compare the point of view of a LAMP developer to my point of view, someone who has considered himself a Microsoft developer for 13 years or more.

1. Free Developer Tools

Visual Studio 2005 is great. I'm glad they decided to make it free forever (at least for the lifecycle of VS 2005; who knows what stupid-ass decision the Microsoft marketing boobs will make for Orcas).

2. Free Database

You get what you pay for with SQL Express. Just installing it is a nightmare. The upgrade instructions here should be warning enough. I'm still waiting for a clean XCopyable embedded database engine for SQL Server MDF files. I suspect hell will freeze over before I get one.

Meanwhile the idea that you'll be able to move from SQL Express -> SQL Server when you need to scale sounds good. But there are very real limits to how high you can scale a database system vertically. The problem with believing that you can scale from SQL Express to SQL Server and that SQL Server will meet all your future needs is that it lulls you into baking SQL Server specific features into your system. You'd be better off starting with SQLite knowing that your database is going to change when you need to scale.

3. Microsoft Atlas makes AJAX easier

Atlas looks promising. It is what they should've shipped with ASP.NET 1.0. The fact that they've made so much progress so quickly with Atlas just reinforces my feeling that they deliberately short-changed the client-side features of ASP.NET 1.0.

I can't help but feel that Microsoft is just waiting for WPF to take off so they can let ASP.NET die. I was shocked to see how much work they did in ASP.NET 2.0. It looks like ASP.NET still has a few years left in it. But don't believe for a minute that Microsoft won't abort ASP.NET the minute WPF starts to gain traction. In the end it still comes down to selling Windows & Office licenses. Unless Microsoft can replace that revenue from its online businesses they shouldn't be trusted as a long term vendor of Web Platforms.

4. Microsoft doesn't HAVE to get ALL of your business.

No but they still want it. Don't ever forget that.

Maybe Atlas is Open Source; I haven't paid enough attention to the license. But I kind of doubt they'll be taking patches from the community.

While Atlas doesn't force you to use any other Microsoft technology, that doesn't mean they won't move it in that direction in the future. I have this sneaking suspicion that Atlas is a stop-gap until they can get WPF/E widely deployed. As soon as that happens I think you'll start to see the "switch" half of this particular bait-and-switch strategy. Don't get me wrong. I hope something like WPF/E gains traction in the years to come. I just don't trust Microsoft to deliver it in a way that I can build a business on.

5. Microsoft solutions can scale.

Duh! I laugh every time I see this SUN/Oracle propaganda regurgitated. But guess what, LAMP scales too. Just ask Google & Amazon.

I recently watched the MySpace session from Mix06. For me it just reinforced the idea that you won't be able to scale the database vertically. Again, I already knew this from personal experience. But my experience is limited to the challenges that MySpace faced when they went from a single database server to multiple database servers. It was great to hear how they had to evolve their backend as they grew.

I imagine most of MySpace's licensing costs are in SQL Server licenses. I wonder how much their code was coupled to SQL Server and how much that influenced their decision to stay with SQL Server. Their existing teams already knew SQL Server and making a switch to some other system would've caused a lot of pain all the way around. But if you don't have to face rewriting code that is working and tested then the pain of retraining employees may not be as bad. Just one more reason why you should make your code database agnostic from the beginning. I used to laugh when I heard people say that. Now I take it very seriously. And I wonder how much this played into Microsoft's decision to go from the abstract database agnostic world view in ADO to the provider specific model in ADO.NET. Maybe it was just a coincidence, but maybe not. The fact that I wonder about such things reinforces how little I really trust Microsoft with my platform decisions.

6. Microsoft pricing is flexible.

Software Assurance. Burn me once...'Nuff said.

7. Ray Ozzie

Can't argue with this. I love Ray. I hope Ray lasts for a very long time at Microsoft. But if he leaves in the next year or so that will be very telling.

8. Robert Scoble

The recent openness of Microsoft is a double-edged sword. It used to be easy to believe that Microsoft was living in a reality distortion field where they all believed the same crap that the corporate marketing machine spews. Now I see that these guys and gals are smart, and they understand a lot of the issues that independent developers face. Yet Microsoft as a company still makes decisions that only serve their self interest. That is a reasonable thing to do for a corporation, but it treats the technology industry as a zero sum game. Google on the other hand seems to understand that everyone can win. They can earn money while helping partners earn money while helping advertisers earn money while helping users get value. I'm more likely to trust platforms that Google builds because our goals are aligned. Microsoft's goals aren't aligned with mine. The bigger my business gets, the more money Microsoft gets to extract from my business in licensing fees without providing any additional value to my business. I don't see this as evil vs. non-evil. I see it as evolution. And right now Google has a tremendous Darwinian advantage. It isn't too late for Microsoft but that time is quickly approaching.

9. Being based on the Microsoft platform doesn’t limit your acquisition options.

Stop worrying about exit strategies and build a good business. That is the sort of thinking that caused the first bubble.

10. Microsoft wants to be a part of the community

They definitely want the advantages of community. I just don't know if they get that real open source communities are ecosystems. I used to think that companies like Google & Amazon are just exploiting the labor of poor naive open source developers. I now see that most open source contributors do so because they get benefits by having the software they depend on maintained by a community. You'd think I'd get that since I've created open source software myself. I guess I didn't make the connection because my reasons weren’t driven by economics. But many open source contributors are driven by economics. It makes a lot of sense for Google to promote Firefox. Their business depends on a solid cross-platform browser that real people can use. I wish they'd do more direct development of Firefox. I still secretly hope that Google is working on an open cross-platform answer to Avalon.

11. Microsoft employees aren't evil.

I find the whole evil/non-evil discussion pointless and immature. In fact I used to be turned off by Google's whole do no evil mantra. I guess that is my own personal hang-up and it is probably because of the way I define good and evil. I prefer to think in terms of how well a company's incentives are aligned with mine. As I said, Microsoft still doesn't compare favorably to companies like Google. It isn't because either company is inherently evil or non-evil. It is just because Google's natural corporate greed is better aligned with the natural greed of many of the other participants in its ecosystem. It's called synergy. Until Microsoft figures out a way to align its goals with mine - as a user, developer, and web based business owner - I can't trust them to do the right things for me.

12. Microsoft has good development resources

Sorry, I've had to suffer under the tyranny of MSDN for 13 years now. There is a lot of STUFF, but not much of it is useful to experienced developers. Just try to get started with Atlas. You will quickly find yourself digging through the code. And every experienced .NET developer keeps Reflector close at hand. There is a reason for that, the documentation sucks.

13. Microsoft speeds web application development

Nope. The demos look good, but ASP.NET sucks as a platform for web development. They made some very bad choices in ASP.NET 1.0. They tried to make web development like VB6 windows development. They wanted to keep developers in their comfort zone so they could easily switch back to doing Windows only development when Microsoft was able to deliver their next generation platform. It might have worked if it didn't take Microsoft 5+ years to deliver that platform. Hell I was lulled into waiting a very long time before really embracing the web as a platform. I really wanted WinForms to work. But again Microsoft killed WinForms before it was even born.

Conclusion

Many people predicted that Microsoft would completely abandon IE once they killed off Netscape. I didn't believe it at the time, but I do now. ASP.NET was a defensive move. Atlas is a defensive move. WPF/E is a defensive move. The long term strategy still depends largely on selling licenses of Windows and Office. They say they now get the web and that they are going to focus on Services. I say bullshit. I'll believe it when I see it. And I am done trusting Microsoft to do the right thing for me until I see them deriving most of their revenue from win-win-win sources.

Take the MapPoint web services as an example. Microsoft built it and the first thing they seemed to think was, how do we charge for this? How can we get our customers to let us hook them up to the MapPoint IV so we can charge them for each drip? Google looked at maps and they seemed to think, how can we help users find stuff easier while giving small advertisers a better way to advertise to their targeted customers, and how can we make money doing that? It is not a question of good and evil. It is a question of focus. Do you see the web as a zero sum game? If not then be wary of Microsoft because they apparently still do.

05/23/2006 10:18:47 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 02 November 2005

Why doesn't my custom DataGridViewColumn show up in the ColumnType combo box?

I added a new class to a Windows Forms project that derives from DataGridViewColumn. I expected that type to show up in the DataGridView editor's ColumnType list. But it didn't. After spelunking in Reflector I saw that the designer uses ITypeDiscoveryService.GetTypes() to load the list of DataGridView column types. ITypeDiscoveryService.GetTypes is supposed to find all public types that derive from DataGridViewColumn in all referenced assemblies. And that is when I realized that my custom DataGridViewColumn class wasn't public because VS.NET 2005 doesn't declare classes public by default the way previous versions of VS.NET did.

That is two hours of my life that I will never get back. ;-)

11/02/2005 14:19:26 (Central Standard Time, UTC-06:00)  #    Trackback

 Sunday, 20 March 2005

Make Meaning

There really is only one question you should ask yourself before starting any new venture:

Do I want to make meaning?

Meaning is not about money, power, or prestige. It's not even about creating a fun place to work. Among the meanings of "meaning" are to

  • Make the world a better place.
  • Increase the quality of life.
  • Right a terrible wrong.
  • Prevent the end of something good.

Goals such as these are a tremendous advantage as you travel down the difficult path ahead. If you answer this question in the negative, you may still be successful, but it will be harder to become so because making meaning is the most powerful motivator there is.

It's taken me twenty years to come to this understanding. [Guy Kawasaki]

For six months I've tried to answer that question. I've thought about the times I was happiest in my life. I realize that I am happiest when I am teaching and inspiring people to do something they consider impossible. That is my purpose in life.

What should I teach? I've thought about teaching technology, but that bores me to tears. One of the few good teachers I've had was my high school chemistry/physics teacher Mr. Seela. He won Iowa Teacher of the Year for 2004 and he deserved it. Mr. Seela was one of the few people who never gave up on me in high school. No matter what I did, he kept trying to reach me. I've thought about following in his footsteps and teaching high school science. But the pay sucks. And the system sucks. Mr. Seela deserves to make 10 times what he makes but most of my other teachers should not be allowed within 100 yards of the nearest classroom.

When I was younger I dreamed of being a software developer, a Navy SEAL, or a Navy fighter pilot. At one point I planned to become all three - a real life MacGyver if you will. Eventually I realized one of my dreams. I became a software developer. I should be happy right?

Last year I realized that writing code to make someone else more profitable (in my case, making banks more profitable) isn't that fulfilling. That's when I decided to go into business for myself. I thought that if I chose the products I worked on, I could make meaning with the products I created. But the idea of writing code in my spare time doesn't appeal to me anymore. I want new challenges. More importantly, I want to spend more time working with people than I do working with machines.

Then I remembered one of the other passions from my youth - flying. I've thought about becoming a flight instructor in the past, but like many teaching professions, the pay sucks. But I see ways to change that. And I want to change it. I want to change the way the flight training industry works. I want to find a way to make flight instructor the premium position in the aviation industry. I also want to find a way to reach kids that are like I was in school. I know that Mr. Seela could've reached me by teaching me to fly. That would have been the motivation I needed to live up to my potential in school. I know that there are kids out there like me, who need something to focus their attention. I know that through flight instruction, I can change the world. And who doesn't want to change the world?

So thanks David Seela. Thanks Guy Kawasaki. Thanks Robert Scoble. Thanks Steve Pavlina. Thanks Dave Winer. Thanks Adam Curry. Thanks Eric Sink. Thanks Jon Udell. Thanks Seth Godin. Thanks Jack Canfield. Thanks Harvey Mckay. Thanks Duane "Dog" Chapman. You have all inspired me to change the world.

03/20/2005 11:16:06 (Central Standard Time, UTC-06:00)  #    Trackback

 Sunday, 30 January 2005

Web Development Tricks: Dynamic Properties & Conditional Comments

This weekend I learned two new web development tricks.

Do you need to set the width of one element equal to 2.78 times the width of another element? Use Dynamic Properties.

Do you need to include different code, or CSS styles, for Internet Explorer? Use Conditional Comments.

01/30/2005 12:45:56 (Central Standard Time, UTC-06:00)  #    Trackback

 Thursday, 20 January 2005

Markdown.NET - A .NET implementation of the Markdown text-to-HTML syntax

Milan Negovan ported the Markdown conversion tool to C#. It is BSD licensed. I can't wait to try it out! See Milan's Announcing Markdown.NET post for more information.

01/20/2005 17:58:33 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 19 January 2005

Free .NET Component Inspector from oakland software

Sometimes the best way to learn a new object model is to explore it at runtime. That is what makes OutlookSpy and OfficeSpy so valuable. The .NET Component Inspector, from oakland software, will let you do the same thing with any .NET assembly. Very cool!

01/19/2005 11:31:36 (Central Standard Time, UTC-06:00)  #    Trackback

 Tuesday, 18 January 2005

Surviving the Visual Studio .NET Designer "Woe"

In this article mwadams describes some of the problems in the current VS.NET designers. Some of the same problems exist in the Web designers too. So far all of the Web Designer problems I've tested are fixed in VS.NET Whidbey. I hope the same is true for the Windows Form designers. But until Whidbey arrives, mwadams provides some excellent advice.

This post brought to you by the letter W and Mike Gunderloy.

01/18/2005 07:14:46 (Central Standard Time, UTC-06:00)  #    Trackback

Improve the design time experience of your .NET controls

In this article Jason Bock documents the trip he took to get a better design time experience for one of his Windows Form controls. Thanks Jason!

01/18/2005 05:50:41 (Central Standard Time, UTC-06:00)  #    Trackback

 Monday, 17 January 2005

What pieces of Windows Orchestration (WinOE) will be available in Whidbey?

WinOE Workflow Prepped For Whidbey, Longhorn, Office 12 In 2006 - Just in case you're not tired of learning new APIs yet. [Mike Gunderloy]

It isn't clear what is going to be available in the Whidbey timeframe, but I definitely need to look into this further for my day job.

P.S. I didn't get around to saying this when Mike posted his 500th post. But I don't know how Mike does it. I love The Daily Grind. I would be lost without it. If something interesting happens in the Microsoft/.NET world Mike points to it. The Daily Grind is a must read for any Windows developer.

01/17/2005 17:02:36 (Central Standard Time, UTC-06:00)  #    Trackback

Using Magic Numbers in place of null in Xml interfaces

It seems that while he was sending out Double.MaxValue the system was unable to use Double.Parse() on it. In other words, he wasn't able to "roundtrip" his value, like Double.Parse(Double.MaxValue.ToString()).

...

Using Edge Cases as Magic Numbers is more evil than using Magic Numbers [Scott Hanselman]

There is a similar issue with DateTime data when you try to use simple serialization - or serialization from a class -> XSD - to build an interface. That reminds me, I need to look at how Xml Serialization works for nullable types in Whidbey.

...after a quick Google search...

It looks like Christian Weyer already answered my question in this article.

01/17/2005 16:34:53 (Central Standard Time, UTC-06:00)  #    Trackback

 Saturday, 08 January 2005

OutlookSpy: The most important Outlook development tool in my toolbox

I'd give up Visual Studio before I'd give up OutlookSpy. It is that good. Are you doing any Outlook development? Then you need OutlookSpy. It is worth 10 times more than Dmitry charges for it.

01/08/2005 10:02:26 (Central Standard Time, UTC-06:00)  #    Trackback

Does your managed Outlook Add-in stop working after a few minutes?

I'm working on a managed Outlook Add-in for my day job. I'm in the early prototyping stages and I haven't implemented a shim yet. I plan to soon, but for now I'm just trying to figure out the basics. I'm using DebugView to monitor my add-in and I've noticed that periodically another add-in is tracing from my thread. This didn't concern me until I noticed that my add-in quit working immediately after these traces. Here are the traces that I see:

[3852] APPMANAGER: Checking for updates.
[3852] APPMANAGER: New update NOT detected.

I have Lookout installed. Since Lookout doesn't use a shim it is running in the same AppDomain as my add-in. After poking around in the Lookout options dialog I found the "Automatically check for new software updates" setting on the Advanced tab. I disabled that setting and now I don't see the APPMANAGER traces. Better still, my add-in doesn't break anymore. I suspect that Lookout is calling ReleaseCOMObject as part of the process that is logging the APPMANAGER traces. That perfectly illustrates why you need a shim. But if you are just trying to take baby steps, I hope this saves you a little frustration ;-)

01/08/2005 09:50:38 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 10 November 2004

Can't change the Vault repository or edit Vault user profile

I created a new Vault repository to store a new set of projects but I could not figure out how to change the default repository that I was connecting to from Visual Studio. In fact Vault was automatically logging me in so I couldn't even edit my profiles. There was one point where Vault presented me with the login screen with the user profile portion disabled. I looked everywhere for another way to edit my user profiles but I couldn't find anything other than the disabled controls on the login screen. After staring at the disabled controls for about 48 minutes I finally stumbled across the solution. I hope this helps someone else.

Here is how you can edit your Vault user profiles when they are disabled on the login screen. First start the Vault Client. Choose File -> Disconnect from Server. Next choose File -> Connect to Server... You should then see the login screen and the Edit Profiles button should be enabled.

Note: This post refers to Vault 2.0

11/10/2004 22:11:34 (Central Standard Time, UTC-06:00)  #    Trackback

 Thursday, 04 November 2004

Embedded databases for .NET

I need an embedded database for a new .NET project. I've used SQLite for several small Python projects and I've been pleased with the results. There are several ways to use SQLite from .NET but I think I'm going to give VistaDB a try first. I like supporting small ISVs and for this project I would like to use a supported product. I'll post my thoughts once I've had a chance to play with VistaDB 2.0.

11/04/2004 09:53:06 (Central Standard Time, UTC-06:00)  #    Trackback

 Thursday, 02 September 2004

How will Microsoft deploy Avalon to Windows XP machines? Part II

WinFX redist size

I'm amazed that Ryan knows the size of our redist - I don't yet. We are still working out the plans here, but at the moment I can't comment on how big I think the redist will be. Having fear over the "possible" redist size seems like a bad way to make decisions. [Chris Anderson]

Fair enough. But we do know that it will be at least the size of the .NET Framework since it will require the .NET Framework, right? And we know that the current version of the .NET Framework is 23698 KB. Let us assume that the WinFX redist will be at least the same size.

Now here's the question again. How will Microsoft deploy Avalon to Windows XP machines? So far Microsoft has failed to deploy the .NET Framework to Windows XP machines. How will the WinFX be different?

09/02/2004 09:09:30 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 01 September 2004

How will Microsoft deploy Avalon to Windows XP machines?

I love Eric Sink's new Winnable Solitaire experiment. Eric is an amazing writer and he strikes me as an all around good guy. He is another one of my role models.

Did you notice that Winnable Solitaire doesn't require the .NET Framework? Eric has discussed this dilemma before.

I know Robert Scoble is paying attention to this issue but is he missing the point? Hey Robert, take your camera over to the Avalon team and get an answer to this question. How will Microsoft deploy Avalon to Windows XP machines? And the answer had better be good because if I created a product like Winnable Solitaire today I'd use Delphi or C++. I'd like to create a product in the Avalon time frame. Will I be able to choose Avalon or will Microsoft drop the ball again and force me to use Delphi or C++ instead?

09/01/2004 21:37:28 (Central Standard Time, UTC-06:00)  #    Trackback

Is Avalon too complex?

I'm thrilled that Avalon will be available on Windows XP. But at the same time I'm worried about Avalon's complexity. It feels a bit too much like a boil the ocean strategy. I've tried approaching Xaml the same way I approached Html years ago. I haven't gotten very far. Xaml feels more like Xml Schema than Html. To be fair I can't really compare Xaml to Html without including CSS. CSS is very difficult to learn too. But look how much you can accomplish with just Html. Most people ignore CSS until they feel they need it. Will it be possible to ignore Xaml's more complex pieces the same way?

Some say the tools will save us. Except the same thing was supposed to happen for Xml Schema and WSDL. In fact we're still waiting for the tools to save us there. We all know that contract first design is the right way to do web services but how many people are actually doing that? And what does that get you anyway? You still have to test on every platform you think you might want to interoperate with. Isn't it easier to just whip up a simple interface class in your favorite language and then begin the test/tweak cycle from the WSDL it generates?

I can't help but wonder where we'd be if Microsoft had taken a more incremental approach. What would it be like to write client-side code in a <script language="C#" runat="client"/> element? What could we do with a client-side platform that supported <WinForm:TextBox runat="client"/> the way ASP.NET supports <asp:TextBox runat="server"/>? Wouldn't a platform like that be good enough? Couldn't they add support for <Xaml:TextBox runat="client"/> later?

Luxor XUL and BlackConnect look interesting. But will they be able to produce something that I would want to use all day every day? Will Microsoft be able to?

Jon Udell wants a cross-platform solution. I'd love that too. I just can't imagine what that would look like. It sure doesn't seem like we'll get one any time soon.

09/01/2004 20:51:44 (Central Standard Time, UTC-06:00)  #    Trackback

 Friday, 28 May 2004

How to install the Microsoft Visual Studio 2005 Community Technology Preview May 2004 on the WinHEC build of Longhorn (build 4074)

Miguel Jimenez provides a workaround to get the March Visual Studio 2005 Community Technology Preview to work on the WinHEC build of Longhorn. I'm linking to it because I had a hard time finding it on Google.

Be sure to read the readme because you need to provide a config file for Explorer.exe after you install VS.NET. This is not encouraged by anyone at Microsoft of course. See the Technorati link cosmos for all the gory details. I created an Undo disk on my Longhorn Virtual PC before installing to be safe.

Update: I guess I should've tried it before I posted. Miguel's patch is for the March Visual Studio 2005 Community Technology Preview. I'm guessing the same thing should work for the May preview though. I'm working on it and I'll post a patch if I can get it to work.

Update: I was able to install the May VS.NET Preview on Longhorn without any changes. But the Longhorn SDK installs its templates and schemas to the wrong location. I moved the Longhorn templates and schemas but then I had to update the project templates before they'd work with VS.NET. Once I did that I still didn't have XAML intellisense because the XAML schema appears incompatible with this build of VS.NET.

05/28/2004 18:44:03 (Central Standard Time, UTC-06:00)  #    Trackback

 Monday, 24 May 2004

WSE 2.0 is finally available

Why is it such a great day to create interoperable services, you ask? Well, (thanks to Dave Bettin for the heads up!) because WSE 2.0 has been released.

Let's just say that I'm extremely excited. Take-a-day-off-and-play-with-this excited. I'm a bit bummed that there's no WS-RM, but at this point, you won't hear me complain. I will say this -- I really hope that the next release cycle is a wee bit shorter. WSE is the "speedboat" for WS-I early adopters -- here's to hoping it stays off the reef. [Philip Rieck]

This is excellent news! I'm glad this didn't happen on Friday when I was looking/hoping/praying for new WSE 2.0 bits because I wouldn't have gotten anything done this weekend.

05/24/2004 13:10:03 (Central Standard Time, UTC-06:00)  #    Trackback

 Sunday, 08 February 2004

How will tagging in WinFS help?

Why will tagging 100 photos with 'Wedding' make things magically better than having the photo's in a 'Wedding' directory ? I'm not seeing how WinFS is going to make my life (as a user not a developer) any better, what am I missing ? [Simon Fell]

But are you sure you want it in a 'Wedding' folder? Maybe someday you will want to see all the photos that contain your wife's family. Or maybe you'll want to see all the photos that were taken in Missouri. Or maybe you'll want to see all the photos that include your children.

I use Adobe Photoshop Album 2.0. It lets you tag your photos in the same way WinFS will. A couple of years ago we went to my sister-in-law's wedding in Missouri. Our kids were in the wedding and we took lots of photos. When I added those photos to Album I tagged them with the following tags:

  • Events->Weddings
  • Places->Missouri
  • People->Family->Our Kids
  • People->Family->Chris' Family

Note: Album supports hierarchical tags hence the People->Family->Our Kids etc.

Now when I want to see photos of events that include our kids I check two check boxes and the photos from that wedding that include our kids show up along with all of the Christmas photos that include our kids.

Tagging the pictures is fairly simple. You can drag tags to your photo(s) or photo(s) to your tags. It is nothing like the tedious process of adding properties to Word documents.

Unfortunately, as useful as all this metadata is, it is locked away in Adobe Album. If I want to find a picture to email to my family I have to start Album and copy the file from there. I can't use that metadata from Outlook or any other program.

02/08/2004 23:56:32 (Central Standard Time, UTC-06:00)  #    Trackback

 Saturday, 07 February 2004

BackgroundWorker in Whidbey

As I have seen today in Whidbey there are a lot of new components and controls for WinForms programming. A cool new component is the System.ComponentModel.BackgroundWorker component. With this component you can initialize a new worker thread and execute them async. [Klaus Aschenbrenner]

The BackgroundWorker has several other cool features:

  • BackgroundWorker.CancelAsync() and BackgroundWorker.CancellationPending - allow you to notify the worker thread that you want to cancel the job.
  • BackgroundWorker.ReportProgress() and BackgroundWorker.ProgressChanged - the worker thread can call ReportProgress() which fires the ProgressChanged event. This is handy for updating progress bars etc. The ProgressChanged event handler runs on the original thread not the worker thread.
  • BackgroundWorker.RunWorkerCompleted - this event is fired when the worker thread is finished. It reports the final status of the worker thread, any exceptions that were thrown, and any resulting data from the worker thread. The RunWorkerCompleted event handler also runs on the original thread.

Since the only code that runs on the worker thread is the DoWork event handler you only have to worry about thread safety there. Since the BackgroundWorker provides for progress reporting, returning results, canceling, catching exceptions, etc. you can focus on the business logic in the DoWork event handler.

The pattern that the BackgroundWorker defines is used throughout Whidbey. The new Web Service Proxy classes will use it so Web Service calls will be asynchronous by default. The new Asynchronous ADO.NET functionality uses the BackgroundWorker pattern too.

02/07/2004 22:35:24 (Central Standard Time, UTC-06:00)  #    Trackback

Give us a merge module!

I swear if I have to deal with one more flaky-doesn't play well with others-piece of shit redistributable installer, I will find the dolts responsible and beat them about the head and shoulders with a fucking cheese grater!!!

I'm currently being tortured by Sun's JRE installer but Microsoft is just as guilty.

02/07/2004 06:09:29 (Central Standard Time, UTC-06:00)  #    Trackback

 Wednesday, 04 February 2004

Compiling the VNC Client with J#

Today a coworker of mine discovered that you can compile the 1.2.9 Java VNC client with J#. Here's the J# project to prove it.

02/04/2004 22:15:08 (Central Standard Time, UTC-06:00)  #    Trackback

Registering .NET Assemblies during installation

Sometimes setting the Register property to vsdrpCOM or vsdraCOM in a Visual Studio .NET setup & deployment project doesn't work. For some reason your assembly isn't registered. When this happens use a Custom Action to register your assembly manually. Ian Turner describes how in a post to the Microsoft newsgroups.

02/04/2004 21:20:04 (Central Standard Time, UTC-06:00)  #    Trackback

 Tuesday, 03 February 2004

ClickOnce Activation Errors on Windows 2003

ClickOnce errors are less than helpful in the PDC build of Whidbey. Shawn Farkas says that will improve in future builds. I'm looking forward to seeing what they come up with.

In the meantime, if you are trying to publish your ClickOnce application to a Windows 2003 server and you are having problems, check the following:

  • Make sure anonymous users can access the site and the files contained therein. By default Windows 2003 doesn't enable anonymous access to new web sites. The anonymous user - IUSR_* - doesn't have access to the files on an NTFS volume by default either.
  • IIS on Windows 2003 doesn't serve file types it doesn't know about. Make sure you have the .deploy and .manifest extensions mapped to the "application/deployment" mime type.
  • If you are using Virtual PC make sure your virtual network adapter is mapped to the correct real network adaptor. I had my virtual adapter mapped to my wired Ethernet adapter but I was connected to my wireless network. For reasons I don't fully understand, ClickOnce failed until I mapped my virtual adapter to my wireless adapter.

For more information see Duncan Mackenzie's excellent Introducing Client Application Deployment with "ClickOnce" article.

Update: Apparently it is a well known issue that ClickOnce in the PDC build of Whidbey does not work when you are offline. You'll know this is your problem because you'll see "Download failed because the machine is offline." in your deployment log (see Shawn's post for the location of the log). If you have Whidbey installed on a Virtual PC machine you can easily work around this problem by setting your virtual network adapter to "Local only" or "Shared networking (NAT)". Then ClickOnce will work even when your host machine is disconnected.

02/03/2004 21:45:06 (Central Standard Time, UTC-06:00)  #    Trackback

 Monday, 02 February 2004

"Best Practices"?

Is there something I'm missing here? Why call Close() and Dispose()?

If there isn't any good reason, then I'm wondering just how much the authors of the book could really know about "best practices." Is it a best practice to include unnecessary code just because you've never bothered to look at how Dispose() works? [Rory Blyth]

I recently saw someone add something like the following in a demo:

SqlConnection connection = new SqlConnection(...);
connection.Open();
// code that uses the connection...
connection.Close();

He then said that in production code he should really protect that in a try...catch but it was easier and cleaner to skip the try catch for the demo. I always wonder why people don't just do this:

using (SqlConnection connection = new SqlConnection(...))
{
    // code that uses the connection...
}
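
For the record, the compiler expands that using block into roughly the following try/finally, which is why it is every bit as safe as calling Close() yourself (SqlConnection.Dispose closes the connection for you):

SqlConnection connection = new SqlConnection(...);
try
{
    // code that uses the connection...
}
finally
{
    // Runs even if the code above throws.
    if (connection != null)
        ((IDisposable)connection).Dispose();
}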

Of course true best practices dictate your code should only touch connections in a single location*. But I digress.

* I didn't say I follow this best practice all the time...but I digress even further. ;-)

02/02/2004 14:41:28 (Central Standard Time, UTC-06:00)  #    Trackback

How will Longhorn help?

Chris Sells talks about the sorry state of software design.

Hopefully Longhorn will fix this problem, but until then, I recommended that she return her computer to the manufacturer and get herself a Nintendo. After working through this with her for over an hour, I was only half kidding.

I'd love to see Chris, Robert, & anyone else on the Longhorn team answer the following question for each of the problems Chris described: How will Longhorn help make this easier for normal human beings?

02/02/2004 12:36:27 (Central Standard Time, UTC-06:00)  #    Trackback

 Friday, 30 January 2004

.NET Rocks! Live! continued...

Rory did a good job. He sounded nervous at first and didn't say much during the first hour, but he loosened up during the second half of the show and really hit his stride.

I hope they take questions via IM or email in the future. I had several comments and questions but I wasn't in a position to use Skype.

Daniel Appleman's book sounds like a must-read for parents and teens. The entire second half of the show reminded me of a book I heard about a few years ago on TechTV; I think the two would complement each other perfectly.

Overall this was one of my favorite .NET Rocks! shows. Keep up the good work, guys.

01/30/2004 22:48:50 (Central Standard Time, UTC-06:00)  #    Trackback

.NET Rocks! Live!

I'm listening to the first .NET Rocks! Live! show. So far I really like the format. The back-and-forth works well, and it was great to hear Reporting Services compared and contrasted with ActiveReports.

I wish Bill Vaughn and Peter Blackburn had spent more time talking about the limitations of this first version of Reporting Services though. There are a lot of limitations in V1 and that didn't come across.

01/30/2004 14:22:40 (Central Standard Time, UTC-06:00)  #    Trackback

 Thursday, 29 January 2004

Storing user settings in Whidbey

As a user I expect applications to remember my preferences. If I move a dialog 30 pixels from the top of the screen, then by God, I expect that dialog to be 30 pixels from the top of the screen the next time I open it. I'm offended every time an application ignores one of my many foibles. I mean come on, even Notepad remembers where I last left it.

As a developer I know it isn't as easy as the heroic Notepad developers made it look. But Whidbey promises to change that with the new SettingsBase, SettingsProvider, and ApplicationSettingsBase classes.

Raghavendra Prabhu described the new application settings functionality coming in Whidbey this way:

At the core of the new settings infrastructure are two abstract classes - SettingsBase and SettingsProvider. SettingsBase hides the storage details away from the application developer and exposes an easy mechanism to read, write and save settings. The SettingsProvider is actually responsible for persisting these settings. How it persists them is its own choice - SettingsBase doesn't really care. It could store them in a local file, a database or retrieve them through a web service, for example. In each case, the way the application consumes the settings remains the same. The ApplicationSettingsBase class Joe talked about adds value to SettingsBase and makes it even easier to create your own settings. Essentially, in this model, settings are nothing but properties on classes that are decorated with a bunch of attributes that describe them.
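
In practice that should look something like the sketch below. The class and property names here are mine, and the exact attribute names could still change before Whidbey ships, but it gives you the flavor of the attribute-decorated-properties model Raghavendra describes.

using System.Configuration;

// Remembers where the user last left a dialog - one setting per property.
public class FormPositionSettings : ApplicationSettingsBase
{
    [UserScopedSetting]
    [DefaultSettingValue("30")]
    public int Top
    {
        get { return (int)this["Top"]; }
        set { this["Top"] = value; }
    }

    [UserScopedSetting]
    [DefaultSettingValue("0")]
    public int Left
    {
        get { return (int)this["Left"]; }
        set { this["Left"] = value; }
    }
}

// On load:  this.Top = settings.Top; this.Left = settings.Left;
// On close: settings.Top = this.Top; settings.Left = this.Left; settings.Save();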

In a comment Bryan said he wants the platform to do more:

Better than the precedent but over-engineered. Dynamic properties should have one of three options associated with them: 1 - don't persist (same as 1.1); 2 - persist for user; and 3 - persist for machine. Yes, it can be that simple and should have been all along. You add a dynamic property, specify the key (default generated for you), and select a persistence option. Is it possible that VS could do this for you using the model above? I don't want to add even that little bit of code for every form that I want to save the user's last entries on.

In a response that brought tears of joy to my eyes Raghavendra said:

Bryan: Yes, we are working on design time support in VS for this feature. In typical cases, you won't need to write a single line of code to create settings and bind them to properties. Creating settings will be as simple as specifying a few things about each setting (like name, type and scope) through a UI.

P.S. Joe Stegman demoed the manual use of the ApplicationSettingsBase class in his Windows Forms: New Features in the .NET Framework "Whidbey" Release talk. You should check his talk out if you're having a difficult time imagining how this stuff will work.

01/29/2004 15:21:22 (Central Standard Time, UTC-06:00)  #    Trackback

 Tuesday, 04 November 2003

Replace and defend

At this point in the project, I can't imagine a worse approach. XML Schema has already eclipsed C++ in terms of complexity. Adding yet another layer on top to model WinFS-isms not directly expressible would only make matters worse.

For now, I'm happy that WinFS schemas are tailored to their data model - having to weed through the complexities of element substitution groups and schema redefines would only obscure what's really going on. [Don Box]

I agree. I assume that a lot of the WinFS work is based on the earlier My Services SDK work. The My Services SDK schemas were awful. They used appinfo annotations extensively, so they couldn't be used to validate instance documents - much of the validation logic lived in data stored in those annotations. They also used a lot of XSD features that current tools don't understand. For the same reasons, the My Services schemas couldn't be used to map from the XML to a different type system.

If Microsoft had used XSD they would just be teasing me. They would also be accused of "embrace and extend". I'd rather see Microsoft "replace and defend" if it makes my life easier. I wish they hadn't used XSD for the .NET DataSet and I'm glad they aren't using it for WinFS.

11/04/2003 11:04:26 (Central Standard Time, UTC-06:00)  #    Trackback

 Tuesday, 28 October 2003

Microsoft has its act together this time

I didn't go to the 2000 PDC. I stayed on the sidelines and watched the news trickle back. That sucked. Microsoft was not prepared to involve the rest of the developer community. Details were sketchy. Code names changed at the last minute so it was impossible to search for information. MSDN online was a joke at the time. There was no coherent message coming from Microsoft.

This PDC is different. Microsoft is doing an amazing job. I've got access to more information than I ever expected. I did not expect the SDK to go live the day of the keynote - hell, I didn't expect it to go live for another six months. The MSDN team should be proud. The Microsoft marketing team should be proud. You guys are doing a great job of including us poor bastards who can't attend.

Now for my one gripe. You guys need to get the bits up on MSDN downloads sooner rather than later. That is supposed to be the point of MSDN subscriptions - early access to Microsoft technologies and information. I'm glad you are making the bits available to MSDN subscribers this time - that was the most frustrating thing about the 2000 PDC. But you need to speed up the process. I want the bits! Give me the bits! ;-)

10/28/2003 10:33:22 (Central Standard Time, UTC-06:00)  #    Trackback

WeblogAddress

10/28/2003 02:23:01 (Central Standard Time, UTC-06:00)  #    Trackback

Where oh where did Hailstorm go?

It appears many of the good ideas in Hailstorm found their way into WinFS.

10/28/2003 01:27:28 (Central Standard Time, UTC-06:00)  #    Trackback

 Monday, 27 October 2003

InfoPath Designer

After that I swung into "Building InfoPath Solution Using Managed Code: Drill Down" to see how I could use VS.NET with InfoPath. Until now you had to do event handlers using JScript instead of managed .NET code. They demoed the new InfoPath Toolkit for Visual Studio. Very nice.

It is similar to the Office Toolkit for Visual Studio in that it provides new project templates for InfoPath projects. When you fire one of these projects up, it adds a new InfoPath file and a class library project and hooks it all up. When you double click on the InfoPath file it opens the InfoPath designer.

But here's the cool part... when you add an event handler in the InfoPath designer, it pops back to VS.NET and adds a method to your class. Compile and run and the form opens up and the event handler is all hooked up! This is what I was expecting when I first saw InfoPath. Now, many months later I get it.

Or, at least I think I get it. I haven't checked the CDs yet. [Peter Provost]

This sounds OK, but I doubt it will help InfoPath adoption. After all, it doesn't give you any new capabilities; it just hides some of the ugliness behind the IDE. I doubt that I'll even look at it because it depends on COM Interop. That sucks. One of the coolest things about InfoPath and .NET, taken separately, is the no-touch deployment model each supports. Requiring COM Interop defeats the purpose.

Until Microsoft ships a version of InfoPath that hosts the CLR I'm not interested.

10/27/2003 22:10:44 (Central Standard Time, UTC-06:00)  #    Trackback

Is XAML InfoPath++?

After looking at XAML for less than 2 hours I've decided that it is exactly what I wanted InfoPath to be. I wonder how much the InfoPath project influenced XAML. Does InfoPath have a future?
10/27/2003 18:56:10 (Central Standard Time, UTC-06:00)  #    Trackback

Am I First?

Robert, am I the first person to beg you for something using XAML?
10/27/2003 18:41:55 (Central Standard Time, UTC-06:00)  #    Trackback

Longhorn SDK Comments

I just noticed that each page in the SDK has a place for comments. The comments are apparently powered by the Annotations Service, which uses RSS 2.0.

The comments should be useful for learning the new APIs.

10/27/2003 18:13:24 (Central Standard Time, UTC-06:00)  #    Trackback

Longhorn SDK

The Longhorn SDK is up. Yeah baby!
10/27/2003 17:58:26 (Central Standard Time, UTC-06:00)  #    Trackback

More Modular Release of .NET Framework 2.0

I hope the right people at Microsoft hear this message loud and clear:

Last night, I was sitting with a few PMs for various pieces of the Framework and I asked the question: Why do I have to wait for framework improvements, like the new XsltProcessor, XPathDocument or the improvements in System.Net, until the IDE support for ASP.NET or even Yukon is done? Why isn't there a more modular release strategy, like the one employed by WSE, that allows releasing pieces of the framework on their own? [ChristophDotNet]

10/27/2003 16:04:45 (Central Standard Time, UTC-06:00)  #    Trackback

 Saturday, 25 October 2003

UpdateVersion 1.2 Available

There is a new version of UpdateVersion available. It includes several new features courtesy of Mike Gunderloy and Scott Hanselman. About a year ago Mike added a "Pin" feature that causes UpdateVersion to use a version that you specify on the command line. This week Scott made several improvements that finally forced me to get off my butt and release a new version. Check out the documentation for a complete list of all the new goodness.
10/25/2003 23:07:06 (Central Standard Time, UTC-06:00)  #    Trackback

 Monday, 09 June 2003

Why should I use ASMX?

That's what I asked myself at 1 am. For a project I'm working on I want complete control of the WSDL, XSD, and XML that my service produces and consumes. Until 1 am this morning I thought using XmlElement with ASMX would give me that.

Today I reread Tim Ewald's article. Tim asks "Why Access Raw SOAP Messages?" For me the question is: Why should I use ASMX? Wouldn't it be easier to create my own HttpHandler?

Tim answers that question at the end of the article:

You may wonder why I didn't just go all the way and implement my own Web Service endpoint using a low-level HTTP handler (an approach I've taken in the past). The main advantage of working inside (for the most part) of the ASP.NET Web Services infrastructure is that you can also take advantage of the object abstraction when it makes sense, a feature that a plain vanilla HTTP solution wouldn't provide.
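
For anyone who hasn't tried it, the XmlElement approach looks roughly like this. The service and payload below are made up; the point is that declaring the parameter and return type as XmlElement makes ASMX hand your method the raw XML instead of deserializing it into objects.

using System.Web.Services;
using System.Web.Services.Protocols;
using System.Xml;

[WebService(Namespace = "http://example.org/raw")]
public class RawMessageService : WebService
{
    // Bare parameter style plus XmlElement in/out means the body of the SOAP
    // message arrives and leaves as raw XML under your control.
    [WebMethod]
    [SoapDocumentMethod(ParameterStyle = SoapParameterStyle.Bare)]
    public XmlElement Process(XmlElement request)
    {
        // Inspect or validate the incoming element however you like,
        // then hand back whatever XML you want on the wire.
        XmlDocument reply = new XmlDocument();
        reply.LoadXml("<ack xmlns='http://example.org/raw'/>");
        return reply.DocumentElement;
    }
}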

06/09/2003 17:56:03 (Central Standard Time, UTC-06:00)  #    Trackback

 Thursday, 05 June 2003

Unifying Tables, Objects and Documents

Button b = <Button>
        <Text>Click Me</Text>
    </Button>;

creates an instance of the standard Button class and sets its Text field to the string "Click Me".

That stopped me dead in my tracks. You should read this now. Thanks for the link Jeff.

06/05/2003 23:42:57 (Central Standard Time, UTC-06:00)  #    Trackback

 Saturday, 31 May 2003

Not Getting It

For heavy authoring and graphics and so on, you need a native application. But a huge majority of business data processing is you interacting with a database off on a server somewhere, and as far as I can see, a Web Browser is still the best way to do that. WinForms? Pshaw! [Tim Bray]

Since Tim spends 8 hours a day doing business data processing, he should know. Sorry, there are many things a browser does best, but that is not one of them.

If the browser is the best environment for data processing, and if users prefer the browser, why aren't Quicken, Money, TurboTax, and other consumer data processing applications browser-based? Sure, they borrow heavily from the browser - they have links and back buttons and flow layout - but when it comes time to do the actual data processing they all have features that are impossible to achieve in the browser. Heavy data processing is no different than heavy authoring and graphics.

05/31/2003 09:47:26 (Central Standard Time, UTC-06:00)  #    Trackback

 Friday, 16 May 2003

The Portability Myth

One surely can't expect a very complex C# Winforms app with all the fancy GDI+ goodness and user controls to just work on Mono (unless I'm missing something) - not until the underlying implementations of ALL the dependencies are written and implemented using whatever Linux OS primitives are required to provide these features. [Scott Hanselman]

This is TouchGraph version 1.22-jre1.1 compiled with J# 1.1 and running as a .NET 1.1 Windows application:

[Screenshot: TouchGraph version 1.22-jre1.1 running as a .NET Windows application]

The fact that it compiled without modification is surprising. The fact that it runs error free is amazing. The fact that it is slower than molasses in January is understandable.

I imagine I will feel the same way IF the Mono team is ever able to get something like Vault running.

05/16/2003 21:49:09 (Central Standard Time, UTC-06:00)  #    Trackback