Monday, November 23, 2015

MomentJS Notes


npm i moment --save
npm i moment-timezone --save


var moment = require('moment-timezone');

Create moments in and out of DST:

var cdt = moment.tz("2015-07-23 08:30", "America/Chicago");
var cst = moment.tz("2015-11-23 08:30", "America/Chicago");

Check that the offset from UTC is what you'd expect:

cdt.tz('America/New_York').format()
> '2015-07-23T09:30:00-04:00'
cst.tz('America/New_York').format()
> '2015-11-23T09:30:00-05:00'

Check that zone-aware format output is what you'd expect:

cdt.tz('America/New_York').format('HH:mm:ss')
> '09:30:00'
cst.tz('America/New_York').format('HH:mm:ss')
> '09:30:00'
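The same sanity checks can be made without moment using the built-in Intl API (assuming Node.js with full ICU support); this is a sketch under that assumption, and wallClock is my own helper name:

```javascript
// Format an absolute instant as wall-clock time in a named zone
function wallClock(isoUtc, timeZone) {
  return new Intl.DateTimeFormat('en-US', {
    timeZone: timeZone,
    hour: '2-digit',
    minute: '2-digit',
    hour12: false
  }).format(new Date(isoUtc));
}

// 2015-07-23 08:30 America/Chicago (CDT, UTC-5) is 13:30 UTC
console.log(wallClock('2015-07-23T13:30:00Z', 'America/New_York')); // 09:30
// 2015-11-23 08:30 America/Chicago (CST, UTC-6) is 14:30 UTC
console.log(wallClock('2015-11-23T14:30:00Z', 'America/New_York')); // 09:30
```

Note that, unlike moment, Intl only formats an instant; it doesn't give you a mutable zone-aware object.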

Thursday, November 19, 2015

Node Foundation Membership Election

Mikeal Rogers recently posted Nominations for the 2016 Election for individual representation on the Node Foundation board.

I'm putting myself forward and answering the question "Why would you like to represent the individual membership of the foundation?"


I think that Node.js and the ecosystem around it play a critical role in our current technology stack. This is only going to become more important in the future.

I would like to help shape the success of this platform. One way to do that is to understand what the membership wants and needs from what we have today and, more importantly, what they expect from the future. As part of the bridge between the membership and the board I will work to ensure that members' opinions are represented at the leadership level.


Leadership: In the past I've served as a director of engineering for a large IT company managing a team of 35+ managers and developers.

Technical: I have a few open source projects that I manage. I also contribute pull requests to open source projects that I can improve. My day-to-day work is mostly writing JavaScript on the Node.js platform, along with system design and architecture.


I've intentionally kept this brief. I'll answer any questions. If you ask them as comments on this post then we can keep them in one place, and anyone else reading this will be informed.

Wednesday, November 18, 2015

Babel Cannot read property error of undefined

This is more a reminder to myself about how to solve this error that I've stumbled across a couple of times with Babel 5 & 6.

If you downgrade Babel from 6 to 5 and leave a .babelrc file in your project with a Babel 6 specific setting like:

{
  "presets": ["es2015"]
}

then you might get an error like this:

if (!option) this.log.error("Unknown option: " + alias + "." + key, ReferenceError);

TypeError: Cannot read property 'error' of undefined

The solution is to remove the .babelrc file or remove the Babel 6 specific settings.

Also, if you're using Babel 5 and you require a module from another project that has a .babelrc file which is using Babel 6 and has a Babel 6 setting in it then this will cause the same problem in your Babel 5 project.

Wednesday, November 11, 2015

Docker and GLIBC_2.14 not found

Hit a problem today when I was using Docker and docker-compose to run a Node.js app in a CentOS container. I had done an "npm install" locally and then the docker-compose command had set up the container and copied everything over from my Ubuntu workstation.

The problem was that I was using a couple of Node.js modules that used native code. The Ubuntu build of those modules was copied to the CentOS container, and when CentOS tried to run them it gave an error about "GLIBC_2.14 not found" because the binaries had been compiled against Ubuntu's glibc rather than built on CentOS.

I feel that a good practice for almost any Docker setup is to have a .dockerignore file which excludes the "node_modules" folder.

This is what my .dockerignore file looks like:

node_modules


Saturday, October 31, 2015

My Ideal Full Stack JavaScript Team Environment

Here are some of the tools and practices that I'm currently in favor of when working in a team environment on a JavaScript Full Stack project. In no particular order.

Pull Requests

All code should have a second set of eyes on it at some point. This serves the following purposes:

1. Catches bugs or potential bugs.
2. Teaches both the reviewer and the reviewee.
3. Keeps team members up-to-date with what other team members are working on.
4. Introduces best practices, new features/techniques, and new libraries.

PRs should almost always be assigned to someone. Others can be @mentioned. Almost anyone (senior enough) should be able to sign off with a Looks Good To Me (LGTM) in the comments and merge.

The developer submitting the PR should let the reviewer know the urgency of the PR.

1. Urgent: The submitter immediately accepts the PR. The review is done post-facto as a learning and review process. It can still catch bugs, which are then fixed in the next PR.
2. Normal: Someone is expected to review this within an hour.
3. Low: Review within the next day.


Linting

An automated linting tool should be part of the commit process. Capping lint errors at a maximum value allows the gradual introduction of linting to existing projects.

My default (and currently the most popular) linting tool for JavaScript projects is ESLint.

Developers should be encouraged to have a lint helper plugin in their editors so they rarely/never see lint errors on commit.

The team should have an evolving .eslintrc file that's shared across all projects. Some projects might have specific override .eslintrc rules if they are catching up with the standard or have exceptions.

Keep the master .eslintrc in a dot-files folder that's a sibling of all the other projects and reference it from there.
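As a sketch of that setup (the path and rule choices here are illustrative, not a recommendation), a project-level .eslintrc can extend the shared master and override individual rules while the project catches up:

```json
{
  "extends": "../dot-files/.eslintrc",
  "rules": {
    "no-unused-vars": 1
  }
}
```

Dropping a rule's severity to 1 (warning) in the override keeps it visible without blocking commits.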


ES6

ES6 is here. There are very few reasons not to start using it. Babel is an excellent transpiler that allows developers to start using those features today.

Use code reviews to help teach other team members ES6.

If you're not using ES6 you'll start leaking developers to other teams and companies. More on this below...

Hot Loading

Most projects should have tooling setup so that they can quickly iterate on small changes to code that they are working on and see the results.

This rapidly accelerates both learning and development.

If, for example, you're working on a React/Node project then, in my opinion, you'd have the Webpack Hot Module Replacement transpiling your changes in memory and making them almost instantly available in the browser.

As developers we're always on a learning curve. We can never be expected to know everything. Using a fast responsive hot loader we can quickly try different techniques and see the results. This will speed our learning of how that corner of the world works.

Front End Framework

Most projects should be using a modern front-end framework for the UI component. The most popular over the last several years has been AngularJS. The current rising star is ReactJS.

I'm in favor of ReactJS. It looks like AngularJS and EmberJS are converging on what ReactJS has introduced.

Latest Versions

We should strive, as much as possible, to be using the latest versions of everything. It's perfectly acceptable to hold off on the brightest and shiniest until it's stable. That should be a conscious decision and not the default.

npm has a great command, npm outdated --depth 0, that gives us a summary of how our project looks compared to what's available. Use that command with the -g option to check your globally installed tooling.

The 3 main reasons usually given for keeping up with the latest version (1. Features 2. Performance 3. Bug Fixes) are completely irrelevant. The reason you want to stay current is to keep your developers happy and engaged. Nobody wants to work on yesterday's technology.

Keeping a buzz in the team about moving to the next version will keep everyone up-to-date on what's happening and engaged with the team. The grass-is-greener syndrome with other teams or companies won't exist on the technology front because you'll be using the latest. In fact I've successfully used this as a recruiting tool for developers who are stuck in other companies or teams on old technology.


Dependency Updates

Your tooling should automatically keep your project's dependencies in sync with what someone initially installing the project would get. I use the npm-update-outdated module as part of my pre-commit hook to update my dependencies before the tests run, so I know that my tests have run against what the CI server will use.
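One possible wiring for that hook, sketched in package.json assuming the pre-commit package (the script names are mine, and a plain npm update stands in here for the dependency-update step):

```json
{
  "scripts": {
    "update-deps": "npm update",
    "test": "mocha"
  },
  "pre-commit": [
    "update-deps",
    "test"
  ]
}
```

With this, every commit first refreshes dependencies and then runs the tests against them.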

Project Onboarding

If someone asks you to work on a project then you should be able to do the following pretty quickly.

1. git clone
2. npm install
3. npm test (should run clean)
4. npm start

And have a server up and running that the browser can hit and interact with.

There are lots of complexities that might stop something that simple from working.

The project's readme should have bulletproof (as good as possible) instructions on how to address that.

For example, most developers have Docker. If you have a dependency on MongoDB then npm start or npm install should run Docker, pull down a MongoDB container, and get it up and running.

A new-to-the-project developer who experiences problems with an out-of-date readme, and who then updates the readme, should be given the rest of the day off as a reward. (That might be a bit extreme. This almost never happens so it might also be okay.)


Tests

All projects should have tests. These should run during a pre-commit hook. Ideally they are run implicitly through the code coverage tool.

When bugs are fixed a test should usually be written demonstrating the bug before it's fixed. This test will prevent a regression on this bug later.

Tests are difficult to write. Do lots of cross-education with other team members during PRs with tests. Get to know and love all the tooling around tests. When you're using a tool for the first time ask for help and opinion in the PR.

Some of the tools I use for testing:

Code Coverage

Code coverage tooling such as Istanbul can be automated as part of a pre-commit hook. It helps make sure that critical sections, or sections that you're about to change, have sanity tests. It also has awesome "graphics".

Code Coverage Minimum Levels

It's easy for a developer to commit new code that doesn't have test coverage. In my enthusiasm to get my new code out there I often forget to write tests for it. Using a tool like Istanbul we can prevent code coverage from dropping from the previous level in a pre-commit hook. This will remind us that we need to add tests.
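A sketch of enforcing that floor with Istanbul's check-coverage command in a pre-commit hook (the thresholds and script names here are illustrative, not a standard):

```json
{
  "scripts": {
    "test": "istanbul cover _mocha",
    "check-coverage": "istanbul check-coverage --statements 90 --branches 80 --functions 90 --lines 90"
  },
  "pre-commit": [
    "test",
    "check-coverage"
  ]
}
```

check-coverage exits non-zero when coverage falls below any threshold, which is what makes the commit fail.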

Update 11/2/15


Isomorphic JavaScript

I used to think that isomorphic JavaScript wasn't that important. I now believe it is becoming important.

At the very least you should never have code that's duplicated for the sake of using it on both server and client. There is almost always a way to provide the same code block to both client and server, keeping the code-base DRY and isomorphic.

Tuesday, September 22, 2015

ReactJS JSX gives us HTML in JavaScript

What I consider the main difference between ReactJS with its JSX syntax and the other frameworks like AngularJS, EmberJS, and BackboneJS is that JSX puts HTML in the JavaScript. The others put JavaScript in the HTML.

HTML-in-Code instead of Code-in-HTML.

Here's a simple example:

React's JSX: (item) {
  return (
    <div>{}</div>
  );
});

AngularJS:

<div ng-repeat="item in myCollection">
</div>

One of the things that we continuously struggle with is the ability to program markup. To do this we've been adding language to the markup.

JSX stands that paradigm on its head and adds the markup to the language.

As a developer I enjoy writing code more than writing markup. JSX means that I don't have to jump-out-of the markup to make things happen to it. The markup is encapsulated inside the logic that controls the flow of the application.

With AngularJS I'm consciously wiring language and markup together.

AngularJS appeals to the designer in me while JSX appeals to the developer in me.

I'm not against AngularJS or the other frameworks. I enjoy writing code and solving hard problems. I also hated the sight of JSX when I first saw it and before I wrote my first project in it. It didn't seem right to pollute JavaScript with markup.

For me the AHA moment was the productivity I felt that React gave me by keeping me in my area of expertise: JavaScript. I'm good at HTML and CSS. I'm an expert at JavaScript. More productivity happens in your area of expertise.

Long live ReactJS.
Long live AngularJS.

Monday, September 21, 2015

Thousands of Lint Warnings and Errors

You add linting to your project and suddenly you're faced with thousands of lint warnings and errors. What's the point? You're never going to spend those hours upon hours fixing all of this. Or are you?

I use ESLint for linting on JavaScript projects.

If I start a project, I add a set of linting rules (in the .eslintrc file) and an automated way of running them with tests or check-ins, and the project remains lint free. That's easy.

Adding lint rules to an established project is tough, especially if you have a set of predefined rules you want to use and they are generating more warnings and errors than you can address.

The best way, I've found, for adding a linter in this case is to:
  1. Add all your lint rules using the .eslintrc file.
  2. Run the linter and then disable each rule that generates a warning or error, until there are no more warnings or errors.
Now go back to coding on your project and, when you hit the wall, turn to enabling a lint rule as your stress relief.
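The result of step 2 might look something like this in .eslintrc (the rule names are illustrative; in ESLint's numeric severities 0 = off, 1 = warning, 2 = error):

```json
{
  "rules": {
    "semi": 2,
    "eqeqeq": 0,
    "no-unused-vars": 0
  }
}
```

Re-enabling a rule later is then just flipping a 0 to a 1 or 2 and cleaning up what it reports.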

We all have times when things are not going well in the code base and we need to take a break to refresh, regroup and take another approach at what we're doing.

These breaks are perfect times to see if we can switch one of those disabled lint rules to a warning or error level and improve the quality of the code. It's also a great opportunity to scan over the entire code-base and re-familiarize yourself with parts that you'd forgotten about or parts that might be dead and can be removed.

Saturday, September 19, 2015

Consuming a JSON REST result in C# .NET

Yesterday I wrote about a Simple API server for IP Address Information Lookup that I had created in Node.js. This server produces a JSON result that looks (abbreviated) like this:

{
  "city": { "names": { "en": "Mountain View", ... } },
  "continent": { "names": { "en": "North America", ... } },
  "country": { "names": { "en": "United States", ... } },
  "registered_country": { "names": { "en": "United States", ... } },
  ...
}
I wanted to consume this end-point in a C# application. A quick search for solutions on how to consume a JSON result from a REST service showed some very complicated ways of doing this. This is a simple solution that I came up with:

using System.Net;
using Newtonsoft.Json;

// A simple C# class to get the details of an IP address
public class IpApi
{
    public IPInfo GetIPInfo(string ip)
    {
        // Create the URL for the REST endpoint
        string url = "http://localhost:3000/ip/" + ip;
        // Create a .NET WebClient object
        WebClient wc = new WebClient();
        // Get the string result from the REST endpoint
        string json = wc.DownloadString(url);
        // Use Newtonsoft's JsonConvert class and its static
        // DeserializeObject method to parse the string into
        // a .NET object
        return JsonConvert.DeserializeObject<IPInfo>(json);
    }
}

// The following classes create the JSON object graph hierarchy
// in static classes. It's a bit tedious looking at the JSON
// object and typing these out. I'd imagine that if you're
// doing this a lot you could write a tool to generate these
// classes using the JSON as input. Or perhaps there's already
// a tool that does this?
public class Names
{
    // I didn't need the pt-BR and zh-CN key/values from the
    // names so I didn't bother looking up how JsonConvert maps
    // those.
    public string de, en, fr, ja, ru;
}

public class City
{
    public Names names;
    public int geoname_id;
}

public class Continent
{
    public string code;
    public Names names;
    public int geoname_id;
}

public class Country
{
    public string iso_code;
    public Names names;
    public int geoname_id;
}

public class Location
{
    public decimal latitude;
    public decimal longitude;
    public int metro_code;
    public string time_zone;
}

public class Postal
{
    public string code;
}

public class IPInfo
{
    public City city;
    public Continent continent;
    public Country country;
    public Location location;
    public Postal postal;
    public Country registered_country;
    public Country[] subdivisions;
}

Friday, September 18, 2015

Simple API server for IP Address Information Lookup

I find that I often have the need, in a web application, to lookup information about an IP address that a request comes from.

To serve this purpose I created IPAPI (GitHub), a simple server providing a single API endpoint that returns information about an IP address.

(You should try the server just because the name is a palindrome. Imagine the awe and respect you'll get when you say "I ran a Palindrome API server today.")

The server runs on Node.js and uses the GeoLite2 database from MaxMind.

You can install and run the server with:

git clone [email protected]:guyellis/ipapi.git
npm install
node index.js


Or install and run it from npm with:

npm install ipapi
node node_modules/ipapi/index.js

The server will download and unzip the IP database if it's not found locally.

Using the server is as simple as having your client call http://<your-domain>/ip/<x.x.x.x> and getting back JSON with information about the IP.

There are plenty of improvements that can be made to this simple server. Pull requests gladly accepted.

Thursday, September 17, 2015

Improving Security in Node.js

A collection of tips and ideas to improve the security of a Node.js application.

Don't let your app identify itself as a Node.js application

In the response header you might have:
X-Powered-By: Express
Set-Cookie: connect.sid=sJbhAJcKt1JVuCRZ5HwpYMhFBAKaXm0

The first item identifies that your application is using Express.js, which implies Node. The second, in a similar manner, identifies the Connect middleware for session management, again a Node module.

The fixes for these are very simple. To suppress the X-Powered-By header key/value you should do this:

var app = express();
app.disable('x-powered-by');

For the connect.sid identifier you can change the default name by using the "key" key in the session middleware's initialization object:

app.use(session({
  key: '<customize me>',
  // other session options...
}));

Wednesday, September 16, 2015

No Website Needed

I have a friend who's an architect. He does bespoke designs for shops and houses. He's never had a website and he says that's one of his unique-selling-points (USP) when talking to a customer.

Apparently the conversation goes something like this:

Customer: "I searched the web and I couldn't find your website or anything else about you."
Architect: "The only reason that we're talking now is because someone, probably an existing client, recommended me to you. I only do work through referrals and I find that I have more work than I can cope with."

In his response he's implying that he's that good that he doesn't have to advertise. According to him that's a very powerful sales pitch during the initial consultation with a future client. A website, he feels, would not allow him to use that line in closing a sale.

Tuesday, September 15, 2015

Moving your Team and Project to ES6

How do you safely move your team to using ES6?

1. Convince the team/boss/stakeholders that it's time to move to ES6

The points in this article might help.

2. Decide what you're going to use

Are you going to use a transpiler like Babel or a shim? If using Node.js are you going to use the --harmony flag? There are plenty of articles that discuss this.

3. Start using ES6

Most teams will have at least one person who wants to use the new syntax. Here's how those developers can introduce ES6 to the rest of the team.

Training? That's one way. Another is to cross-train each other on a code base that you're already working on.

Let's say that you wrote some ES5 code like this:

handleChange(name, e) {
  var change = {};
  change[name] =;
}

and you decided that you'd rather use ES6 syntax like this:

handleChange(name, e) {
  var change = {
    [name]:
  };
}

The better way to use the ES6 syntax is to tag it with the name of the new feature like this:

handleChange(name, e) {
  var change = {
    // ES6: Destructuring, computed property keys
    [name]:
  };
}

A developer looking at the ES6 comment should be able to copy/paste the text into a search engine and find a reasonable list of articles that deal with the feature.

4. Teach in the Pull Request Review

Now when you do a Pull Request on your changes message the rest of the team that there are labeled ES6 features.

I've found that this has the extra bonus of having more developers want to review your code because they know that they're going to be learning something new about ES6 during the review process.

Your team is already contributing to and editing a code-base that it's familiar with, so the ES6 usage they're learning is not contrived. It's real world.

5. Remove the ES6 comments

The comments labeled ES6: are there for training purposes. After a period of time you may want to remove them. This is easy to do with a search for, and deletion of, this type of comment in the code base.

If you have a lot of non-ES6 developers continually joining the project then you might not want to do this. You can also tag the repository before removing the comments so others can checkout that tag and read the comments.

You might also elect to start dropping comments for the easy-to-understand and common ES6 features and keep them in for the more complex scenarios.

Wednesday, March 25, 2015

Should I use Extend or Assign in Lodash?



Use _.assign()

Here's why:
  1. If you look at the lodash documentation for _.assign() you'll see that _.extend() is an alias for _.assign(), not the other way around. This would imply that _.assign() is the default and _.extend() is an alternative.
  2. ECMAScript 6 uses Object.assign() and not extend().
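For comparison, a minimal sketch of the ES6 counterpart, Object.assign(), which copies own enumerable properties onto the target from left to right, the same behavior _.assign()/_.extend() provide:

```javascript
var defaults = { retries: 3, timeout: 500 };

// Pass {} as the target so defaults itself isn't mutated
var options = Object.assign({}, defaults, { timeout: 1000 });

console.log(options.retries, options.timeout); // 3 1000
console.log(defaults.timeout); // 500 (untouched)
```

Passing a fresh object as the target is the usual way to merge without side effects; passing defaults directly would overwrite it.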