Saturday, November 26, 2011

Google Maps slows down on Saturday mornings

I use Google Maps a lot to investigate a place I'm going to or to get directions there. I've noticed that on Saturday mornings Google Maps often fails, and when it doesn't fail it runs really slowly and I get this message:

Still loading... Slow? Use the troubleshooting guide or basic HTML.

My theory is that everyone is using Google Maps to find directions before or while going out on a Saturday, and this is overloading the maps servers. When are you most likely to need directions? When you go somewhere you haven't been to in a while, or ever. That's not going to happen during the week when you're going to and from work; it's going to happen on the weekend when you have the time to venture off the beaten path. So my guess is that requests to the mapping service peak over the weekend.

Monday, October 17, 2011

AdWords AdSense Arbitrage

I've heard about people doing AdWords/AdSense arbitrage, but I question whether it's even possible.

First of all, what is arbitrage? The classic definition of arbitrage has someone, usually a trader in the stock markets, buy and sell a financial instrument at exactly the same time such that there is zero risk and an instant profit. Essentially you are the middle-man who has a buyer and seller lined up and you pass the item being sold from one to the other and pull in the difference.

With AdWords/AdSense arbitrage it's a little different because it is by no means risk free and it does not take place at the same time. The idea is that you buy traffic using AdWords and then sell it on to another site using AdSense, and the rate you buy it at is lower than the rate you sell it at.

Let's look at the math involved.

Google keeps 32% of the revenue earned from a click on an advert on your site (this assumes AdSense-for-Content). So if you pay $1 through AdWords to bring a visitor to your site you need to earn $1.47 from AdSense from that visitor in order to break even (you receive 68% of the advertiser's spend, and $1 / 0.68 ≈ $1.47). So far this is not impossible, but there's a lot of competition out there and you have to assume that your visitors are looking for the same type of item, so advertising rates should be similar. It also assumes that 100% of visitors arriving through your AdWords campaigns click on an AdSense advert.

Now let's imagine that only 10% of the visitors who arrive from your AdWords campaigns click on one of your AdSense adverts. At our example rate of $1/visitor you have spent $10 to get someone to click on your AdSense advert, and you need to earn $14.71 from that click to break even. That's a huge jump from paying $1 per AdWords click to needing $14.71 per AdSense click.

This table shows how much you have to earn per AdSense click in order to break even, for a range of click-through rates, assuming $1 per AdWords click:

CTR (%)   Break-even AdSense earnings per click ($)
  1       147.06
  5        29.41
 10        14.71
 20         7.35
 50         2.94
 75         1.96
100         1.47
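
As a sanity check, here is a minimal C# sketch that reproduces the break-even figures above from the 68% publisher share and the $1 AdWords cost per visitor:

double costPerVisitor = 1.00;   // what you pay AdWords per visitor
double publisherShare = 0.68;   // your share of AdSense revenue (Google keeps 32%)
int[] ctrs = { 1, 5, 10, 20, 50, 75, 100 };
foreach (int ctr in ctrs)
{
    // Cost to obtain one AdSense click = cost per visitor / click-through rate.
    double costPerClick = costPerVisitor / (ctr / 100.0);
    // That click must earn back the cost out of your 68% share.
    double breakEven = costPerClick / publisherShare;
    Console.WriteLine("{0,3}%  ${1,7:0.00}", ctr, breakEven);
}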

This is why I think that AdWords/AdSense Arbitrage is almost impossible. 

Wednesday, September 14, 2011

Gigabit networks with low quality cables

I have 2 computers attached to a gigabit switch and both have NICs that support 1 Gbps. However, I noticed that transferring files between them was slow, and when I inspected the speeds I saw that one of them was operating at 100 Mbps and the other at 1 Gbps.

At first I went into the driver config setup and fiddled around with half and full duplex to see if that would make a difference. After a bit more research I found a comment that a network cable might be the culprit and that some network cables (e.g. Cat 5) do not support speeds over 100 Mbps. I didn't believe that could possibly be the problem, but it was easy to test, so I switched the cables and the slow speed moved with the cable. So replacing the cable did the trick. Apparently Cat 5e and Cat 6 are what you need to get the higher speeds.

I'm now transferring data at 10 times the original speed.

Mouse without Borders

Do you have 2 or more PCs on your desk that each have their own keyboard and mouse? Do you want to control them all from one keyboard and mouse without using a KVM, have the mouse float from one monitor to the next, and be able to copy and paste between them? Then Mouse without Borders is for you:

http://aka.ms/MouseWithOutBorders

I've been using it for a few days now to link my two desktops together and it's working very well. The only hiccough is that if I want them to work together I need to sign in to my primary computer first and then the secondary computer (with its own keyboard) second. After that it's all controlled from the primary keyboard and mouse. If I sign in in the other order then I have to use each computer's own keyboard and mouse to control it.

Wednesday, August 24, 2011

Collection was modified; enumeration operation may not execute.

This .NET error typically occurs when the underlying collection is modified during enumeration, for example if you remove items from the collection while enumerating it.

It can also happen when one thread is reading a collection while another is still adding to it. See if you can spot the bug in the following code, which is in the Menus accessor of a singleton object.

if (_Menus == null)
{
    lock (lockObj)
    {
        if (_Menus == null)
        {
            _Menus = new List<MenuItem>();
            _Menus.Add(new MenuItem
            {
                Link = "/mylink",
                AnchorText = "MyText",
                TitleText = "My Title"
            });
            // ... add more items
        }
    }
}
return _Menus;

Note that we are correctly doing the double null check above. Also, although it's not visible in the code, we have marked the _Menus field as volatile to prevent the compiler from optimizing away the seemingly redundant second null check. The bug is that _Menus is assigned before the items are added to it, so a thread that passes the first (unlocked) null check can start enumerating the list while this thread is still adding to it.

Here is the fixed code:

if (_Menus == null)
{
    lock (lockObj)
    {
        if (_Menus == null)
        {
            List<MenuItem> tempMenus = new List<MenuItem>();
            tempMenus.Add(new MenuItem
            {
                Link = "/mylink",
                AnchorText = "MyText",
                TitleText = "My Title"
            });
            // ... add more items
            _Menus = tempMenus;
        }
    }
}
return _Menus;

The solution, as demonstrated, is to create a locally scoped temporary collection and build it fully before assigning it to the _Menus field. That way any other threads requesting this object will wait on the lock until the collection has finished building, and then won't attempt to build it again because the second null check will find that it already exists.
 

 

Sunday, July 31, 2011

Creating a composite key in your container

If you need to create a composite key (for a dictionary, for example) where the natural key could come from multiple sources, a standard technique is to prepend the source to the natural key to make it unique. However, simple concatenation can still produce duplicate keys, so you should always add a delimiter between the parts that you concatenate.

The problem is easily illustrated by looking at how files are managed in folders. Typically the composite key is the full directory path and file name, which identifies the file you are looking for. The delimiter is the backslash.

Say you had 2 files: bat.txt and mybat.txt

Let's say that they were in 2 directories:
c:\temp\somy and c:\temp\so
i.e.
c:\temp\somy\bat.txt
and
c:\temp\so\mybat.txt

Without those backslashes we would have:
c:tempsomybat.txt
and
c:tempsomybat.txt
giving us the same key.

So in code you may have:

public void AddItem(string source, string itemName,
  object data, Dictionary<string, object> myDictionary)
{
    // The "+" delimiter keeps keys from different sources distinct.
    string compositeKey = source + "+" + itemName;
    myDictionary.Add(compositeKey, data);
}
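
Using the file example from above (data1 and data2 are just placeholder objects), the delimiter keeps the two keys distinct:

var myDictionary = new Dictionary<string, object>();
AddItem("somy", "bat.txt", data1, myDictionary);   // key: "somy+bat.txt"
AddItem("so", "mybat.txt", data2, myDictionary);   // key: "so+mybat.txt"
// Without the delimiter both keys would be "somybat.txt" and the
// second Add would throw an ArgumentException for a duplicate key.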
 

Tuesday, July 26, 2011

Disable thumbs.db in Win7

Notes to self on how to do this.

  1. Start Local Group Policy Editor.
    1. From start menu type gpedit.msc and hit enter.
  2. Navigate the left pane:
    1. User Configuration
    2. Administrative Templates
    3. Windows Components
    4. Windows Explorer
  3. On right hand side find:
    1. Turn off the caching of thumbnails in hidden thumbs.db files
    2. Double click this item.
    3. Change value to Enabled

Other notes:

Might need to log off and log on again.

Will not delete existing thumbs.db file.

 

Sunday, June 5, 2011

Time taken for Google to de-index 301 redirect pages

I changed a bunch of the URLs on a site that's in the Alexa top 100,000 - i.e. a moderately busy site that the Google bot visits every day. The old URLs now do a 301 redirect to the new structure. At the time I did the switch over, and every day for the next 60 days, I took a snapshot of how many URLs were indexed by Google in each section. I used the following search command in Google:
inurl:/old/folder/pattern site:mysite.com
I entered the number of URLs into a spreadsheet for each of 3 folder patterns. The patterns started off with 13, 18, and 87 URLs in Google's index respectively. The objective of the exercise was to see how long it would take Google to de-index these pages. Here is a chart of the results:




The folder pattern with 87 URLs is plotted against the right axis and the other two against the left.
Expectations:
My expectation was that as soon as Google found the new URLs (it found almost all of them within 5 days) it would rapidly de-index the old URLs. Remember that I'm telling Google that this is a permanent (301) redirect, not a temporary (302) one.
Actual results:
  1. It took around 55 days to naturally de-index all the pages. Much longer than I was expecting.
  2. The de-indexing for the 2 smaller collections of pages was linear.
  3. The de-indexing for the larger collection of pages was sudden and happened after 48 days.
There are other techniques for de-indexing pages from Google. For example, Google's Webmaster Tools has a place where you can enter the URLs you want to remove, and you can also add the pattern to your robots.txt file, either of which might have de-indexed them faster. The objective of this exercise was not to rapidly de-index those pages but to see how Google naturally de-indexed them over time when given a 301 redirect directive.
What surprised me is how long it took.
I'm not going to show a chart of the indexing of the new URLs because it's exactly as you would expect, with the line rising rapidly up to the previous values. As I mentioned, 93% of the new links had been indexed within 5 days of them appearing on the site and 100% had been indexed by day 13.

Tuesday, May 31, 2011

Loading a web page in a browser

Tony Gentilcore has just written a good post on How a web page loads and why blocking needs to take place when scripts and CSS load. I'm hoping that he's going to follow this up with a post on the SPDY protocol which looks pretty interesting for a faster web.

Google say that SPDY is an experiment with protocols for the web. Its goal is to reduce the latency of web pages.

At the time of writing this the third draft of the specification for SPDY could be found here.

One of the advantages that SPDY provides is that the resources needed by the page (JavaScript, images, CSS etc.) can be sent to the client in a compressed header and requested by the client before the client parses the HTML.

There's no indication yet whether Google is going to push this specification because it's still experimental. One of the great advantages of owning an increasingly popular browser (Chrome) and a couple of popular web sites is that you can define the transport layer when your users are using your software end to end.

Tuesday, April 19, 2011

Optimizing CSS in ASP.NET MVC

The CSS files in a web application don't change very often. For this reason they are usually cached by the browser to improve performance. These are the steps I take to optimize the CSS files in my web apps. Although I'm using ASP.NET MVC, this can be applied to any web app written in any language on any platform.

  1. Combine all CSS files into a single file.
  2. Replace tokens.
  3. Minify the combined CSS file.
  4. Cache the CSS file on the server.
  5. Name the CSS file with a timestamp.
  6. Compress the output to the browser.
  7. Set the browser cache date to a far future date.

1. Combine all CSS files

To make a project more manageable you might want to maintain multiple CSS files. I set up a pattern so that I can easily combine the files without changing the code. For example, precede each CSS file with a 2 digit number that dictates the order in which they should be combined:

01_main.css
02_forums.css
03_footers.css

The first time your CSS file is requested your code will read all the CSS files from the CSS folder and combine them in alphabetical order.

Complications: Sometimes you might want another CSS file added in there depending on the browser. For example, you might have a CSS file that you only want combined with the "numbered" CSS files if the browser is IE6 or IE7. In cases like that you would create several different combined files on the server and serve up the appropriate one at runtime.

At the end of this step you have an in-memory string representing the combined CSS files.
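
Here is a minimal sketch of this step, assuming the numbered CSS files all live in one folder (the folder path and method name are just placeholders; it needs System.IO, System.Linq and System.Text):

public static string CombineCssFiles(string cssFolder)
{
    StringBuilder combinedCss = new StringBuilder();
    // Alphabetical order, so the 2 digit prefix controls the order of concatenation.
    foreach (string file in Directory.GetFiles(cssFolder, "*.css").OrderBy(f => f))
    {
        combinedCss.AppendLine(File.ReadAllText(file));
    }
    return combinedCss.ToString();
}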

2. Replace tokens

This is an optional step but you might have image URLs in your CSS that reside on different subdomains depending on your environment. For example, in my dev and test environments the images are in a path off the dev and test domains. However, in production, the images are on one or more sub-domains in order to parallelize the image downloads in the browser.

If your configuration follows this pattern then you might want to put tokens in the CSS files which are replaced at runtime depending on the environment.
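
As a sketch, assuming a made-up token such as {{IMAGE_HOST}} in the CSS (e.g. background: url({{IMAGE_HOST}}/img/bg.png);) and an appSetting that holds the per-environment image host:

public static string ReplaceTokens(string combinedCss)
{
    // Empty in dev/test, something like "http://images1.example.com" in production.
    string imageHost = ConfigurationManager.AppSettings["ImageHost"] ?? string.Empty;
    return combinedCss.Replace("{{IMAGE_HOST}}", imageHost);
}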

3. Minify the combined CSS file

Run the combined CSS file through a minifier to make it as small as possible. This file is only going to be read and used by a browser so pretty formatting and extra whitespace is no longer needed.

If you're developing on the .NET platform then you can use the YUI Compressor for .NET to minify your CSS. Once you've included the library in your project it will require a single line of code:

string minifiedCSS = CssCompressor.Compress(combinedCSS.ToString());

4. Cache the CSS file on the server

Once you've built and minified the CSS file (steps 1 to 3) you will probably want to cache it on the server so you don't have to do this work again. How you cache it is up to you: ASP.NET has a cache object you can put it in, or you can use a static variable, a singleton, etc.
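
For example, here is a minimal sketch using the ASP.NET cache (BuildCombinedMinifiedCss() stands in for steps 1 to 3; a static field or singleton would work just as well):

string css = HttpRuntime.Cache["SiteCss"] as string;
if (css == null)
{
    // Combine, token-replace and minify only once, then keep the result in memory.
    css = BuildCombinedMinifiedCss();
    HttpRuntime.Cache.Insert("SiteCss", css);
}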

5. Name the CSS file with a timestamp

In the <head> section of your HTML you need to reference your CSS file. I do this by referencing a controller and action with a timestamped filename as the single parameter. For example, the served up HTML will look something like this:

<head>
...
<link href="/site/css/20110418095157.css" type="text/css" rel="stylesheet" />
...
</head>

In my .master file for the site or _Layout file for my razor views I have the following code:

<link href="<%:MySingleton.Instance.GetCssFileName() %>" type="text/css"
rel="stylesheet" />

The first time that the static MySingleton.Instance.GetCssFileName() function is called it pulls all of the relevant CSS files and finds the one with the most recent date. Using that date it constructs the filename using the following .NET format pattern: yyyyMMddHHmmss. This computed filename is also cached so that it is only ever created once.
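
A minimal sketch of what GetCssFileName() might look like (the ~/Content/css folder and the /site/css/ route are assumptions chosen to match the example above):

private static string _cssFileName;

public string GetCssFileName()
{
    if (_cssFileName == null)
    {
        string cssFolder = HttpContext.Current.Server.MapPath("~/Content/css");
        // Most recent modification date across all the numbered CSS files.
        DateTime newest = Directory.GetFiles(cssFolder, "*.css")
            .Select(File.GetLastWriteTime)
            .Max();
        _cssFileName = "/site/css/" + newest.ToString("yyyyMMddHHmmss") + ".css";
    }
    return _cssFileName;
}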
 
6. Compress the output to the browser
 
I do this by telling IIS to compress all output to the browser (see how to enable compression on IIS6 and IIS7). If you don't want to enable compression on IIS then you can add a filter (attribute) to your action.
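
If you go the filter route, a minimal sketch of a gzip/deflate action filter looks like this (apply it as [Compress] on the action that serves the CSS):

public class CompressAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        HttpRequestBase request = filterContext.HttpContext.Request;
        HttpResponseBase response = filterContext.HttpContext.Response;
        string acceptEncoding = request.Headers["Accept-Encoding"] ?? string.Empty;

        if (acceptEncoding.Contains("gzip"))
        {
            // Wrap the response stream so everything written to it is gzipped.
            response.AppendHeader("Content-Encoding", "gzip");
            response.Filter = new GZipStream(response.Filter, CompressionMode.Compress);
        }
        else if (acceptEncoding.Contains("deflate"))
        {
            response.AppendHeader("Content-Encoding", "deflate");
            response.Filter = new DeflateStream(response.Filter, CompressionMode.Compress);
        }
    }
}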
 
7. Set the browser cache date to a far future date
 
In the MVC action that returns the CSS file, set the HTTP caching headers to expire far in the future. You never have to expire this CSS file because the filename changes whenever one of the source files is modified, which forces a fresh request. Here is the type of code you would add to the start of your action method to set the HTTP headers appropriately.
 
TimeSpan duration = TimeSpan.FromSeconds(15552000); // 180 days
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetExpires(DateTime.Now.Add(duration));
Response.Cache.SetMaxAge(duration);
 
 
 

Wednesday, February 9, 2011

ASP.NET MVC3 Razor Notes - Partial Views

What's the difference between Html.RenderPartial() and Html.Partial()?

Html.RenderPartial() is an HTML helper method that was introduced in MVC1 and writes directly into the response object.

Html.Partial() is an HTML helper method that was introduced in MVC2 and like the other HTML helper methods returns a string and does not write into the response object.
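
A minimal illustration of the difference in a Razor view (assuming a partial view named _LoginBox.cshtml exists):

@* Html.Partial returns the markup as a string, so it can sit inline in the page: *@
@Html.Partial("_LoginBox")

@* Html.RenderPartial returns void and writes straight to the response,
   so it has to be called from within a code block: *@
@{ Html.RenderPartial("_LoginBox"); }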

Both helpers are used with partial views (sometimes still called controls) to implement the DRY principle of shared View code across multiple pages.

In the MVC WebForms View Engine views have the extension .aspx and partial views .ascx. This makes it easy to distinguish between views and partial views. In the MVC Razor View Engine both views and partial views have the extension .cshtml. In order to easily distinguish between a view and a partial view in your project I suggest that you precede the partial view's name with an underscore.

ASP.NET MVC3 Razor Notes - view data

WebViewPage<TModel> is a generic abstract class that derives from the abstract WebViewPage class.

Two properties of the WebViewPage<TModel> generic abstract class are:

  1. Model (dynamic)
  2. ViewData (key/value collection)

Among other properties, the WebViewPage base class provides:

  1. ViewBag (dynamic)

The ViewBag property is another way to get to the ViewData data.

For example:

If in the controller you did the following:

ViewData["website"] = "guyellisrocks.com";
ViewData["somenumbers"] = Enumerable.Range(0, 10);

Then in the view you could access that data through the ViewBag:

<div>
    @ViewBag.website
</div>
@foreach (int i in ViewBag.somenumbers)
{
<div>
    @i
</div>
}

and this would emit to the web page:

guyellisrocks.com
0
1
2
3
4
5
6
7
8
9
 

ASP.NET MVC3 Razor Notes - switching between text and code

Switching between text and code in Razor:

If you're in a code block and want to emit text you can use the <text> tag:

@{
   if(condition == true) {
      <text>this is true text</text>
   } else {
      <text>this is false text</text>
   }
}

The <text> tag will not be included in the generated HTML.

If you're in a text section and want to explicitly invoke code that could be confused with text, you can wrap the code in parentheses:

<img src="@(filename).jpg" />

If you had not put the parentheses in above, Razor would have tried to evaluate .jpg as a property of the filename object.

 

SEO Top Negative Ranking Factors

A great search engine ranking factors article was recently published by SEOmoz.

Point #2 in the Top 5 Negative Ranking Factors is "Link Acquisition from Known Link Brokers/Sellers." I still find this item hard to believe although I've heard this type of statement before.

I remember attending an SEO talk where an "expert" said that links from "bad" sites to your site could harm your SEO. I asked the presenter if she was able to provide any evidence of this and she said that she could not. To date, I have not seen any evidence of this, and as I said I find it hard to believe and this is why:

A competitor could publish links to your site from "bad" sites and/or buy links to your site on a link broker's site to degrade your SEO performance and allow him/her to rise above you in search results. Are you telling me that the engineers that work on the search engine algorithms have not thought of that? These are clever guys, believe me, they've thought of this.

In my opinion, links from "bad" sites or links bought from link brokers will not have a negative impact on your SEO; they will have a neutral impact on your SEO and a negative impact only on your time. In other words, you're wasting your time by doing it because it will have no effect.

Monday, February 7, 2011

ASP.NET MVC3 does not appear as an option in VS2010 after installing it

I couldn't work out why the option to create an "ASP.NET MVC 3 Web Application" was not appearing in the list of templates when I clicked on File > New > Project... in Visual Studio 2010. Only "ASP.NET MVC 2 Web Application" was listed there.

Turns out there's a drop down to allow you to select your framework at the top of the file/new/project dialog and this was set to the .NET Framework 3.5. Because MVC3 needs .NET 4 it wasn't being listed. Changing the framework to ".NET Framework 4" solved the problem.

After doing that VS2010 will remember your last choice and default to ".NET Framework 4"...

Named parameters in a URL

Bill Brown was showing me his NLP links in Pinboard when I noticed that he was typing named parameters into the Pinboard URL. I'm intimately familiar with named parameters, which have been around for a long time in programming languages and were recently introduced into C#, and I'm a big fan of them. However, I had never seen them used (or thought of using them) in a URL before. Here is an example:

http://pinboard.in/u:bbrown/t:python/t:nlp

In this URL we're saying: find the user named bbrown (key=u, value=bbrown) and all items that he's tagged (key=t) with both python and nlp. If we swap the tags around the query produces the same results:

http://pinboard.in/u:bbrown/t:nlp/t:python

All is not perfect in Utopia, however. If you move the user param after the tag params it won't work. This link takes you to a broken page:

http://pinboard.in/t:nlp/t:python/u:bbrown

This is fairly easy to fix, and I'm guessing that the guys at pinboard.in don't think that many people will be hacking their URLs and so didn't put the time into making them work in any order like true named parameters would.
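
As a minimal sketch (not Pinboard's actual code), here is how such segments could be parsed into a key/value lookup so that order no longer matters (uses System.Linq):

string path = "t:nlp/t:python/u:bbrown";
var lookup = path.Split('/')
    .Select(segment => segment.Split(new[] { ':' }, 2))
    .ToLookup(parts => parts[0], parts => parts[1]);

Console.WriteLine(lookup["u"].Single());           // bbrown
Console.WriteLine(string.Join(",", lookup["t"]));  // nlp,python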

It's still great to see such innovation in the use of the URL without resorting to query string parameters, which would have achieved the same result. Out of curiosity I tried:

http://pinboard.in/?u=bbrown&t=nlp

but that doesn't translate for them.

Saturday, January 15, 2011

Blue moons this century

I've been doing some work on calendars and I needed to generate a list of full moons in a given year. This got me wondering when the next blue moon (the second full moon in a calendar month) will be and how many blue moons there will be this century. The next blue moon will be on 31 August 2012 at 1:57:45pm UTC and there will be 41 blue moons this century. Here is the code that I used to generate the list of dates:

DateGenerator dateGenerator = new DateGenerator();
int[] years = Enumerable.Range(2000, 100).ToArray();
HashSet<int> hashSet = new HashSet<int>();
foreach (int year in years)
{
    List<DateTime> fullMoons = dateGenerator.FullMoons(year, 0);
    foreach (DateTime date in fullMoons)
    {
        // Encode year and month into a single key, e.g. August 2012 -> 201208.
        int yearMonth = date.Year * 100 + date.Month;
        // A second full moon in the same calendar month is a blue moon.
        if (hashSet.Contains(yearMonth))
        {
            Console.WriteLine(date.ToString("dd-MMM-yyyy HH:mm:ss"));
        }
        hashSet.Add(yearMonth);
    }
}
Console.ReadLine();

Here is the full list of blue moons for this century (times are UTC):

30-Nov-2001 20:51:10
31-Jul-2004 18:06:05
30-Jun-2007 13:50:25
31-Dec-2009 19:14:20
31-Aug-2012 13:57:45
31-Jul-2015 10:45:46
31-Jan-2018 13:27:46
31-Mar-2018 12:37:46
31-Oct-2020 14:51:29
31-Aug-2023 01:37:05
31-May-2026 08:46:46
31-Dec-2028 16:49:45
30-Sep-2031 18:58:02
31-Jul-2034 05:56:35
31-Jan-2037 14:05:42
31-Mar-2037 09:54:43
31-Oct-2039 22:36:54
31-Aug-2042 02:05:14
30-May-2045 17:53:13
31-Jan-2048 00:16:32
30-Sep-2050 17:32:47
30-Jul-2053 17:08:32
31-Mar-2056 10:26:26
31-Oct-2058 12:54:58
30-Aug-2061 22:20:53
30-May-2064 10:37:03
31-Dec-2066 14:42:26
30-Mar-2067 20:11:05
30-Sep-2069 18:12:11
31-May-2072 22:19:43
30-Apr-2075 18:40:43
31-Oct-2077 10:39:00
31-Jul-2080 19:16:42
31-May-2083 09:43:39
31-Dec-2085 00:00:58
30-Sep-2088 15:28:47
30-Jul-2091 12:03:20
31-Jan-2094 12:40:11
30-Apr-2094 13:58:47
31-Oct-2096 11:20:22
30-Aug-2099 17:59:15