debuggable

 
 

Releasing node-mysql 2.0.0-alpha

Posted on 15/5/12 by Felix Geisendörfer

Today I am releasing an alpha version of node-mysql v2.0. If you are using v0.9.x at this point, I highly encourage you to try it out, as now is your best chance to influence the API and features of the final release.

To install the new version, do:

npm install mysql@2.0.0-alpha

Then check out the Upgrading Guide and adjust your code as needed.

After that make sure to join the new mailing list or #node-mysql IRC channel to provide any feedback you may have.

This new version comes with a few exciting improvements:

  • ~5x faster than v0.9.x for parsing query results
  • Support for pause() / resume() (for streaming rows)
  • Support for multiple statement queries
  • Support for stored procedures
  • Support for transactions
  • Support for binary columns (as blobs)
  • Consistent & well documented error handling
  • A new Connection class that has well defined semantics (unlike the old Client class).
  • Convenient escaping of objects / arrays that allows for simpler query construction
  • A significantly simpler code base
  • Many bug fixes & other small improvements (Closed 62 out of 66 GitHub issues)

I have been working on this new version for quite some time, but was only now able to complete the final work, thanks to my amazing new sponsors.

For those of you interested in the future of v0.9.x:

  • There will be no v1.0 release; the way 0.9.x handles its queue and reconnect mechanism is broken and not easily fixable without breaking BC.
  • I will merge critical bug fixes that come with tests and don't break BC.
  • There will be no major changes, especially features.

Going forward, I also hope to find some time to write about:

  • The new parser and what makes v2 so fast
  • The choice of rewriting vs. refactoring (I tried both)
  • My failed first attempt for a new parser design

--fg

 

How to write jQuery plugins

Posted on 29/3/12 by Tim Koschützki

jQuery, the most popular JavaScript library out there, is great for DOM abstraction. It allows you to encapsulate functionality into your own plugins, which is a great way to write reusable code. However, jQuery's rules for writing plugins are very loose, which has led to a range of different plugin development practices, some of which are pretty poor.

With this article I want to provide a simple plugin development pattern that will work in many situations. If the functionality you would like to encapsulate is large and really complex, jQuery plugins are probably not what you should use in the first place; in that case you are better off with something like BackboneJS or jQuery.Controller.

If you can't or don't want to use Backbone, you might still get away with my solution ...

Starting off

;(function($, doc, win) {
  "use strict";

  // plugin code will come here

})(jQuery, document, window);

The semicolon before the function invocation keeps the plugin from breaking if it is concatenated with other scripts that are not closed properly.

"use strict"; puts our code into strict mode, which catches some common coding problems by throwing exceptions, prevents or throws errors when relatively "unsafe" actions are taken, and disables JavaScript features that are confusing or poorly thought out. To read about this in detail, please check ECMAScript 5 Strict Mode, JSON, and More by John Resig.
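A quick illustration of that first point (standalone sketch, not part of the plugin): in strict mode, assigning to an undeclared variable throws instead of silently creating a global.

```javascript
// Illustrative only: strict mode turns a silent mistake (an implicit
// global) into a thrown ReferenceError.
function assignUndeclared() {
  "use strict";
  try {
    undeclaredVariable = 1; // would silently create a global in sloppy mode
    return false;
  } catch (e) {
    return e instanceof ReferenceError;
  }
}

var caught = assignUndeclared();
// caught is true: strict mode threw instead of creating a global
```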

Aliasing the jQuery object to the dollar sign via a closure avoids conflicts with other libraries that also use the dollar sign as an abbreviation. window and document are passed in as local variables rather than accessed as globals, because this speeds up identifier resolution and allows them to be minified more aggressively.

Invoking our plugin

;(function($, doc, win) {
  "use strict";

  function Widget(el, opts) {
    this.$el  = $(el);
    this.opts = opts;

    this.init();
  }

  Widget.prototype.init = function() {

  };

  $.fn.widget = function(opts) {
    return this.each(function() {
      new Widget(this, opts);
    });
  };

})(jQuery, document, window);

$('#mywidget').widget({optionA: 'a', optionB: 'b'});

We invoke our plugin on a jQuery object or jQuery set by simply calling our widget() method on it and passing it some options. Never forget the "return this.each(function() { ... })", which preserves the chainability of jQuery objects.
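Why that return matters can be modeled without jQuery at all (Set below is a hypothetical stand-in for a jQuery object):

```javascript
// each() returns the set it was called on, so widget() hands the same
// set back to the caller. That is all "chainability" means.
function Set(items) {
  this.items = items;
}

Set.prototype.each = function(fn) {
  this.items.forEach(fn);
  return this; // returning the set is what keeps the chain alive
};

Set.prototype.widget = function(opts) {
  return this.each(function(item) {
    // a real plugin would instantiate a Widget per item here
  });
};

var set = new Set(['a', 'b']);
var result = set.widget({optionA: 'a'});
// result === set, so further calls like result.each(...) keep working
```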

The main functionality of the plugin is encapsulated into a separate Widget class, which we instantiate for each member in our jQuery set. Now all functionality is encapsulated in these wrapper objects. The constructor is designed to just keep track of the passed options and the DOM element that the widget was initialized on.

You could also keep track of more sub-elements here to avoid having to always .find() them (think of performance) as you need them:

;(function($, doc, win) {
  "use strict";

  function Widget(el, opts) {
    this.$el     = $(el);
    this.opts    = opts;

    this.$header = this.$el.find('.header');
    this.$body   = this.$el.find('.body');

    this.init();
  }

  // ...
})(jQuery, document, window);

Parsing options

When we invoked the plugin we passed it some options. Often you need default options that you want to extend. This is how we bring the two together in our object's init() method:

;(function($, doc, win) {
  "use strict";

  function Widget(el, opts) {
    this.$el  = $(el);

    this.defaults = {
      optionA: 'someOption',
      optionB: 'someOtherOption'
    };

    var meta  = this.$el.data('widget-plugin-opts');
    this.opts = $.extend(this.defaults, opts, meta);

    // ...
  }

  // ...
})(jQuery, document, window);

$('#mywidget').widget({optionA: 'a', optionB: 'b'});

I like keeping the default options within the constructor of the wrapper class and not outside of it. This provides the flexibility to just take the whole wrapping class and copy it to somewhere else where you might not even have jQuery available.
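The merge order used above can be illustrated with a hand-rolled stand-in for $.extend (shown only to make the semantics explicit; use the real $.extend in plugin code):

```javascript
// Minimal stand-in for $.extend(target, ...sources): later sources win,
// so per-call opts override the defaults, and data-attribute meta
// options override both.
function extend(target) {
  for (var i = 1; i < arguments.length; i++) {
    var source = arguments[i];
    if (!source) continue;
    for (var key in source) {
      if (source.hasOwnProperty(key)) {
        target[key] = source[key];
      }
    }
  }
  return target;
}

var defaults = {optionA: 'someOption', optionB: 'someOtherOption'};
var opts     = {optionA: 'a'};
var meta     = {optionB: 'fromDataAttribute'};

var merged = extend({}, defaults, opts, meta);
// merged: {optionA: 'a', optionB: 'fromDataAttribute'}
```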

If the element has options saved in its data attributes, we also want to take those into account. This is handy for plugins that auto-initialize themselves (which we will do later), since then there is no way to pass options at plugin invocation.

Here is an example:

<div class="widget js-widget" data-widget-plugin-opts='{"optionA":"someCoolOptionString"}'>

Optional: Keeping a reference to our wrapper object in the element

It's a good idea to keep a reference to our plugin object on the DOM element, because that helps a lot with debugging in your browser's JavaScript console later. I don't always do this, because it may just be overkill for the situation at hand. But I want to show you how to do it regardless:

;(function($, doc, win) {
  "use strict";

  var name = 'js-widget';

  function Widget(el, opts) {
    this.$el  = $(el);

    this.defaults = {
      optionA: 'someOption',
      optionB: 'someOtherOption'
    };

    // let's use our name variable here as well for our meta options
    var meta  = this.$el.data(name + '-opts');
    this.opts = $.extend(this.defaults, opts, meta);

    this.$el.data(name, this);

    // ...
  }

  // ...
})(jQuery, document, window);

$('#mywidget').widget({optionA: 'a', optionB: 'b'});

console.log($('#mywidget').data('js-widget'));

As you can see, we just expose our wrapper object using jQuery's $.data function. Easy.

Binding some events

Let's use our wrapper object's init() function to bind some events and write some real plugin code:

;(function($, doc, win) {
  "use strict";

  var name = 'js-widget';

  function Widget(el, opts) {
    this.$el      = $(el);

    this.defaults = {
      optionA: 'someOption',
      optionB: 'someOtherOption'
    };

    var meta     = this.$el.data(name + '-opts');
    this.opts    = $.extend(this.defaults, opts, meta);

    this.$el.data(name, this);

    this.$header = this.$el.find('.header');
    this.$body   = this.$el.find('.body');
  }

  Widget.prototype.init = function() {
    var self = this;

    this.$header.on('click.' + name, '.title', function(e) {
      e.preventDefault();

      self.editTitle();
    });

    this.$header.on('change.' + name, 'select', function(e) {
      e.preventDefault();

      self.saveTitle();
    });
  };

  Widget.prototype.editTitle = function() {
    this.$header.addClass('editing');
  };

  Widget.prototype.saveTitle = function() {
    var val = this.$header.find('.title').val();
    // save val to database

    this.$header.removeClass('editing');
  };

  // ...
})(jQuery, document, window);

Notice that we have bound the events via .on() in delegation mode, which means that our .title element doesn't even have to be in the DOM yet when the events are bound. It's generally good practice to use event delegation, as you do not have to constantly bind and unbind events as elements are added to and removed from the DOM.

We use our name variable as an event namespace here, which allows easy unbinding later without removing event listeners on the widget elements that were bound by code other than our plugin.
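A simplified model of what delegated binding does internally (not jQuery's actual implementation): a single listener on a stable parent checks every bubbled event's target against the selector, which is why matching elements can be added later and still work.

```javascript
// delegate() wraps a handler so it only fires when the bubbled event's
// target matches; the parent owns the one real listener.
function delegate(matchesSelector, handler) {
  return function(event) {
    if (matchesSelector(event.target)) {
      handler.call(event.target, event);
    }
  };
}

var clicked = [];
var listener = delegate(
  function(target) { return target.className === 'title'; },
  function(event) { clicked.push(event.target.className); }
);

// Simulate two events bubbling up to the parent; only the match fires:
listener({target: {className: 'title'}});
listener({target: {className: 'body'}});
// clicked is ['title']
```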

How to prevent designers from breaking your plugins

Something else that I like doing is attaching different classes to elements depending on if they are meant for css styling or for javascript functionality.

The markup that we could use for the plugin above could be:

<div class="widget">
  <div class="header"></div>
  <div class="body"></div>
</div>

And then we do:

$(function() {
  $('.widget').widget();
});

This is all fine and dandy, but now our designer comes along and changes the classes around, because he is not aware of our new plugin (or worse, he doesn't even care). This would break our client-side JavaScript functionality. So what I often do instead is this:

<div class="widget js-widget">
  <div class="header js-header"></div>
  <div class="body js-body"></div>
</div>

Then in our plugin change how we find the body and header:

function Widget(el, opts) {
  this.$el     = $(el);

  this.defaults = {
    optionA: 'someOption',
    optionB: 'someOtherOption'
  };

  var meta     = this.$el.data(name + '-opts');
  this.opts    = $.extend(this.defaults, opts, meta);

  this.$header = this.$el.find('.js-header');
  this.$body   = this.$el.find('.js-body');
}

And change our invocation:

$(function() {
  $('.js-widget').widget();
});

I agree it's a little more markup to write, and you might not need it. Also, since we manipulate the DOM with jQuery plugins, we might depend on specific tag types anyway. So if the designer changed all divs to be lis instead, it might still break our plugin. But if you are in a situation where you have regressions due to frontend engineers and designers not communicating properly, using js- prefixed classes on all important elements might be a step in the right direction.

Notice how this.$header and this.$body are also agnostic to the HTML tag of the element that they cover.

How to remove our plugin without removing the DOM element

For large applications it's important to allow multiple plugins to operate on the same elements. For this to work, you need to be able to add and remove plugins on the same element without affecting the other plugins.

Most jQuery plugins expect you to remove the element entirely to teardown the plugin. But what if you want to remove the plugin without removing the element? We can do this using a destroy function:

Widget.prototype.destroy = function() {
  this.$el.off('.' + name);
  this.$el.find('*').off('.' + name);

  this.$el.removeData(name);
  this.$el = null;
};

It takes our local name variable again and removes all events in that namespace. It also removes the reference to the wrapper object from the element. Now we can easily remove the plugin from the outside:

$('.js-widget').data('js-widget').destroy();

By the way, if you remove the DOM element, jQuery will take care of removing all associated data and events by itself, so there is no need to worry about that case.

How to write self-initializing plugins

If you deal with a lot of Ajax requests in your app and then need to bind plugins on the DOM elements that were just loaded, this tip might be pretty useful for you.

What I like doing is using a PubSub implementation to automatically invoke plugins:

$(function() {
  var $document = $(document);
  $document.trigger('dom_loaded', $document);

  $('.some-selector').load('/my/url', function(nodes) {
    $document.trigger('ajax_loaded', nodes);
  });
});

Now we can allow the plugin to bind itself to all elements that, by their class, declare that they need it:

;(function($, doc, win) {
  "use strict";

  var name = 'js-widget';

  // wrapper object implementation, etc.

  $(doc).on('dom_loaded ajax_loaded', function(e, nodes) {
    var $nodes = $(nodes);

    var $elements = $nodes.find('.' + name);
    $elements = $elements.add($nodes.filter('.' + name));
    $elements.widget();
  });
})(jQuery, document, window);

You can also come up with your own very custom events and even namespaces to allow your plugins to talk to each other without having to know about each other.
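The PubSub idea itself is tiny; here is a jQuery-free sketch of the role that $(doc).trigger()/.on() plays above (illustrative only, the event name mirrors the article):

```javascript
// A minimal publish/subscribe channel: subscribers register once and
// every future trigger of that event notifies them.
function PubSub() {
  this.handlers = {};
}

PubSub.prototype.on = function(event, fn) {
  (this.handlers[event] = this.handlers[event] || []).push(fn);
};

PubSub.prototype.trigger = function(event, data) {
  (this.handlers[event] || []).forEach(function(fn) {
    fn(data);
  });
};

var channel = new PubSub();
var seen = [];

// The widget plugin subscribes once...
channel.on('ajax_loaded', function(nodes) {
  seen.push(nodes);
});

// ...and every Ajax load notifies it, with no extra wiring per request:
channel.trigger('ajax_loaded', '<div class="js-widget"></div>');
// seen now holds the freshly loaded markup for the plugin to scan
```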

Advantages of this:

  1. This removes a lot of boilerplate code from our app! Re-initializing plugins after an Ajax request without any extra lines of code? No problem!

  2. We can simply remove functionality from our application by not loading a specific plugin's javascript file.

I can see two obvious disadvantages here, though:

  1. We cannot provide options to the plugin invocation this way. We'd have to rely on options bound via the HTML5 data attribute "data-js-widget-opts" (see above). In my experience this is needed less often than one would think, though.

  2. If you have a very complex app with a lot of plugins and code flying around, this PubSub mechanism might not be the most performant way of doing things. Think of 20 plugins all running .find() and .filter() operations on a large piece of markup that was just loaded via Ajax. Ugh. :)

Conclusion

Wrapping reusable code in a jQuery plugin is not always easy, but following a few guidelines makes it a much better experience. The ultimate takeaway here is to always wrap the plugin functionality in a wrapper class, so that the elements in the jQuery set that you bind the plugin to do not interfere with each other.

If your plugin is more complex, you could even use multiple wrapper classes and let their objects talk to each other. Or, even cooler, try to move some of the functionality your plugin requires out into another, smaller plugin, and let the two of them talk to each other via PubSub.

The rest is a couple of nice extras that have made my life easier.

Here is the full skeleton again that I use when I write new plugins (rename accordingly):

;(function($, doc, win) {
  "use strict";

  var name = 'js-widget';

  function Widget(el, opts) {
    this.$el      = $(el);
    this.$el.data(name, this);

    this.defaults = {};

    var meta      = this.$el.data(name + '-opts');
    this.opts     = $.extend(this.defaults, opts, meta);

    this.init();
  }

  Widget.prototype.init = function() {
  };

  Widget.prototype.destroy = function() {
    this.$el.off('.' + name);
    this.$el.find('*').off('.' + name);
    this.$el.removeData(name);
    this.$el = null;
  };

  $.fn.widget = function(opts) {
    return this.each(function() {
      new Widget(this, opts);
    });
  };

  $(doc).on('dom_loaded ajax_loaded', function(e, nodes) {
    var $nodes = $(nodes);
    var $elements = $nodes.find('.' + name);
    $elements = $elements.add($nodes.filter('.' + name));

    $elements.widget();
  });
})(jQuery, document, window);

Kind regards, Tim

@tim_kos

 

Vim Workshop in Berlin (April 20)

Posted on 26/3/12 by Felix Geisendörfer

My friend Drew of Vimcast fame is organizing two half-day vim workshops in Berlin on April 20.

As a former Textmate user, I cannot overstate the productivity gains that come from mastering vim. With the early bird discount, tickets sell for 75 GBP (~90 EUR), and there are only a few left, so you should act quickly.

The workshops are aimed at intermediate users, so if your vim skills are non-existent or very rusty, you should probably play with vimtutor before showing up.

I'll be attending the afternoon workshop along with Tim, so hope to see you there!

--fg

Full Disclosure: I gain nothing by promoting this event other than the joy of seeing people boost their productivity.

 

NPM - An intervention

Posted on 22/2/12 by Felix Geisendörfer

Update: Isaac commented and explained why fuzzy version specifiers are here to stay. I'll be ok with it and will adapt my workflow accordingly.

Update 2: I did not give up on the bug that is part of the story below, a test case and fix has been submitted and merged!

Update 3: NPM Shrinkwrap is now a real thing.

NPM is the official node.js package manager. Unlike many package managers that came before, it is actually incredibly awesome, and has helped to create one of the most vibrant communities in the history of open source.

However, today I want to talk about a few aspects of npm that concern me. In particular, I want to talk about areas where I feel that NPM is making bad things easy and good things hard.

NPM module versions are broken

Today, I tried to contribute to the forever module. The company I am helping had to patch their version of it because of a hard-to-reproduce bug in production, and asked me to help submit their fix upstream. Being the scientific type, I set out to write a test case against the forever version their patch is based on:

$ npm install forever@0.7.2

Fantastic, NPM lets me specify which version of forever I want to install. Now let's verify the installed version works:

$ ./node_modules/forever/bin/forever

node.js:134
        throw e; // process.nextTick error, or 'error' event on first tick
        ^
TypeError: undefined is not a function
    at CALL_NON_FUNCTION_AS_CONSTRUCTOR (native)
    at Object. (/Users/Felix/Desktop/foo/node_modules/forever/lib/forever.js:43:23)
    ...

Oh no, what happened? Mind you, except for an unrelated patch, this version of forever is running perfectly fine in production.

Well, as it turns out, you have been lied to. There is no such thing as forever v0.7.2. At least not a single one: it depends on an implicit and unchangeable second parameter, time.

Why is that? Well, it is because forever v0.7.2 depends on this:

"nconf": "0.x.x",

And as it turns out, nconf has released newer versions matching this selector, featuring a different API.
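For contrast, an exact specifier leaves no room for time to change the result. Pinning the dependency in forever's package.json would have looked like this (the version number below is purely illustrative, not the one forever actually resolved to):

```json
{
  "dependencies": {
    "nconf": "0.5.1"
  }
}
```

With a pin like this, `npm install forever@0.7.2` would fetch the same nconf today as it did on release day.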

You are doing it wrong

"Hah!", you might say. "That's why you should check your node_modules into git!".

I am sorry, but that is not helpful. While this will allow me to pin down the node modules used by my app exactly, it does not help me here. What I want to do is reproduce this bug in a standalone copy of forever v0.7.2, then check if it exists in the latest version, and if so submit the test case and fix upstream.

However, I can't. Not without manually resolving all forever dependencies the way NPM resolved them when v0.7.2 was released. (The fact that forever is a bit of a spaceship when it comes to dependencies does not help either).

Discouraging Open Source

Speaking of Mikeal's article: I felt that something was wrong about checking your node_modules into git when reading it, but only now can I point out what:

In the article, Mikeal argues that module authors should not try to exactly reference their dependency versions, so this way users would get more frequent updates of those dependencies and help test them.

However, he says doing so for your app is a good thing.

I disagree. To me, this approach discourages open source for two reasons:

a) Bug reports:

I currently maintain 44 NPM modules. It is very hard to keep up with that.

If you are asking me to support multiple versions of all my dependencies, I will have to stop helping people with bug reports for my modules.

When somebody reports a bug for a given version of my module, I want to know exactly what version he used. Figuring out when he installed my module in order to rule out dependency issues for every bug report is not an option for me.

b) Contributions

Ask yourself what is easier: adding a quick patch to a node module you already include in the git repo of your app, or creating a fork of it, fixing the problem in the fork, pushing that fork to GitHub, changing your package.json to point at your fork, and submitting a pull request.

I know people cannot be forced to contribute back, nor should they be. But as things stand right now, checking in all node_modules of an app into git is the only sane option, as the version numbers in your package.json are essentially meaningless.

This means that contributing back to open source is made difficult by default, while keeping your patches to yourself is made easy. I would like this to be the other way around.

Conclusion

I propose to gradually drop all support for fuzzy version specifiers from NPM.

To me, fuzzy version specifiers are entirely evil. They make things more complex. They force me to manually snapshot the packages I depend on for my apps. They prevent me from supporting and contributing to open source.

So rather than throwing more complexity at this problem, let's just remove this feature altogether.

If you agree, please re-tweet this article or leave a comment.

--fg

 

Testing node.js modules with Travis CI

Posted on 18/11/11 by Felix Geisendörfer

You have written a node.js module lately? It has a test suite? Awesome! Time to get yourself a nerd badge of honor:

build passing

But hang on, nerdy warrior: this precious award has to be earned. So go ahead and check out the sweetness that is Travis CI. Travis is an open source, free-to-use continuous integration service. Initially it only built ruby projects, but these days it supports a ton of other languages, including node.js.

And luckily, getting Travis to run your tests on every GitHub push is really easy as well:

Step 1: Go to Travis and login/connect with your GitHub account.

Step 2: Hover over your name on the top right, and select "Profile" from the dropdown.

Step 3: You should see all your GitHub projects. Flip the "Off" switch to "On" for a node.js project you want to use with travis.

Step 4: Add a .travis.yml file to your project with the following:

language: node_js
node_js:
  - 0.4
  - 0.6

Step 5: Make sure your package.json has something like this:

"scripts": {
    "test": "make test"
  },
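That "make test" entry assumes your project has a Makefile with a test target. A minimal one might look like this (the actual test command is whatever your project uses; `test/run.js` is just a placeholder):

```make
test:
	node test/run.js

.PHONY: test
```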

Step 6: Git push, and watch Travis build your project on the home screen!

Step 7: Assuming your tests are passing, it is time to get your badge of honor. Adding it to your GitHub Readme.md is as simple as:

[![Build Status](https://secure.travis-ci.org/<GITHUB_USER>/<REPO_NAME>.png)](http://travis-ci.org/<GITHUB_USER>/<REPO_NAME>)

If you want to see an example of what this looks like, and you also happen to be in the market for some no-bullshit testing tools, check out my new libs:

  • utest: The minimal unit testing library.
  • urun: The minimal test runner.

That's it. And in case you are not excited enough yet, go and check out the Travis Docs to discover additional goodies like how to work with databases, etc.

--fg

 