I just upgraded my Node from 4.2.1 to 4.2.4, and there are two points in this process worth sharing: first, why I upgraded to 4.2.4 and not v5.0; and second, which sounds funny to me, that on my local machine I use `nvm` but on my production servers I still use `n`, because it seems more reliable to me. Please correct me if you think I am mistaken about that.
##Why 4.2.4 and not 5.x?
I listen to a Node.js podcast called NodeUp, which is highly recommended. In its latest episode (96 - A Node v5.0 Show), one of the panelists asks Rod Vagg and Rebecca Turner, both from the core npm and Node teams:
> Should I be running Node v5.0 or should I be running LTS 4.2.x? How do I figure that out?
and both had a similar response, in a nutshell:
> Day to day I am using Node v5.0, but if I were deploying a server I would use the LTS (4.2.4), just because of having that stability. - Rebecca
This is exactly why I upgraded to 4.2.4 instead of 5.0, but to make more sense of it you have to listen to the whole answer in the podcast itself: NodeUp 96. It is awesome.
##Using NVM
Using `nvm` makes the whole process much easier, but it also has its own issues. For instance, on Linux systems one of the well-known issues with `nvm` shows up when using `sudo` to install global packages; it can be fixed easily, but it is still an issue.
To install `nvm` you can follow the instructions in its GitHub repository, which has an install script that you can easily download using `curl` or `wget` and run with `bash`.
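For example, with `curl` it looks something like this (the version tag here is only an example; check the repository's README for the current one):
```
# Download and run the nvm install script
# (replace v0.29.0 with the latest release tag from the nvm repository)
curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.29.0/install.sh | bash

# Or the wget equivalent:
# wget -qO- https://raw.githubusercontent.com/creationix/nvm/v0.29.0/install.sh | bash

# Then reload your shell so the nvm function becomes available
source ~/.nvm/nvm.sh
```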
Now, to upgrade or install Node using `nvm`, all you need to do is:
```
nvm install 4.2.4
nvm use 4.2.4
```
That is how I upgraded Node on my local machine.
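Worth noting: `nvm use` only switches the version for the current shell session, so if you also want new shells to pick up 4.2.4, `nvm` lets you set it as the default alias (an extra step, not required for the upgrade itself):
```
# Make 4.2.4 the default version for any new shell
nvm alias default 4.2.4

# Verify which version is active
node -v   # should print v4.2.4
```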
##Using N
I really like `nvm`, BUT I use `n` on my production servers, which to me seems more reliable; it has a much easier installation process, and so far I haven't seen any of the issues I have had with `nvm`. That's how I upgraded my servers:
```
sudo npm cache clean -f
sudo npm install -g n
sudo n 4.2.4
sudo ln -sf /usr/local/n/versions/node/4.2.4/bin/node /usr/bin/node
```
and that was all I had to do. Of course, next time I won't need to install `n` itself again.
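A quick sanity check after the symlink step never hurts (just a verification sketch, not part of the original commands):
```
# Confirm that the binary the system resolves is the upgraded one
which node   # e.g. /usr/bin/node
node -v      # should print v4.2.4
```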
##What about npm 3?
With npm 3, your dependencies are installed flat by default, and IMHO there is no doubt that this is a huge improvement. The great thing I really like about npm is how easily I can upgrade it using its own install command:
```
sudo npm install -g npm
```
and done.
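If you want to see the flat layout for yourself, a quick experiment in an empty directory makes it obvious (the package here is just an example; any package with a deep dependency tree works):
```
npm -v            # make sure you are on npm 3.x
npm install tap   # a package with plenty of transitive dependencies
ls node_modules   # with npm 3, most transitive deps now sit at the top level
```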
After a while, DefineJS finally comes with a bunch of new features that make it easy to use ES6 generators along with promises. Being able to pass a generator function to the promise chain is one of the cool features that I really like in DefineJS now. Even IIFEs could disappear now, using this new feature, and I myself started with something like this (not directly relevant, but worth noting that AsyncDB is an async local data storage library):
```
define.Promise.resolve(jQuery)
    .then(function* ($) {
        var db = {},
            AsyncDB = yield require('asyncdb'),
            pkg = yield $.getJSON('package.json');
        AsyncDB.new('packages', db);
        var pageContent = yield $.get(pkg.repository.url);
        $(pageContent).appendTo('.container');
        var packageId = yield db.packages.insert({
            name: pkg.name,
            version: pkg.version
        });
        return db;
    })
    .then(function* (db) {
        // a totally private and dedicated scope
        // which has access only to the db object
        var packages = yield db.packages.find();
        // This way we could make different scopes
        // with different access levels
    });
```
A working example of the new feature is now ready to check out in the examples folder: define-promise-dev. The code blocks below show how easy it is now to set up an application lifecycle.
```
config.go()
    .then(firstPhase)
    .then(secondPhase)
    .then(finalPhase)
    .catch(lifecycleInterruption)
    .done(theEnd);
```
```
function* sameLifecycle() {
    var message;
    try {
        var packageInfo = yield config.go();
        var app = yield firstPhase.go(packageInfo);
        var shimModule2 = yield secondPhase.go(app);
        message = yield finalPhase.go(shimModule2);
    } catch (err) {
        message = yield lifecycleInterruption.go(err);
    }
    theEnd(message);
}
```
Take a thorough look at the two code blocks above. They both do the exact same thing, without us needing to create IIFEs or use callbacks.
Of course it still has new features coming, very soon.
##DefineJS v0.2.4 Released
This hybrid CommonJS/AMD format allows you to write modules with a new syntax similar to CommonJS. This feature is now possible thanks to ES6 generators.
Let’s imagine a CommonJS module like:
```
//app.js
var utils = require('utils'),
    $ = require('../vendor/jquery');

var app = {
    //...
};

module.exports = app;
```
The DefineJS alternative is:
```
//app.js
define(function* (exports, module) {
    var utils = yield require('utils'),
        $ = yield require('../vendor/jquery');

    var app = {
        //...
    };

    module.exports = app;
});
```
As mentioned, the new syntax is similar to the CommonJS coding style, with two specific differences: first the `yield` keyword, and second the `define` wrapper with an ES6 generator function.
##DefineJS v0.2.3 Released
DefineJS v0.2.3 now offers an asynchronous, but totally synchronous-looking, way of requiring dependencies using the ES6 generators syntax. It still has a lot of possible implications, but for now it could be useful if you are either using one of the modern browsers that support ES6 generators, or an ES6 generators transpiler.
This is how it looks to define a new module:
```
//app.js
define(function* () {
    var _,
        app;

    // conditional dependency: lodash or underscore
    if (lodashIsNeeded) {
        _ = yield require('../vendor/lodash');
    } else {
        _ = yield require('../vendor/underscore');
    }

    app = {
        //...
    };

    return app;
});
```
and to require it you can do it the old AMD way:
```
//main.js
require(['app'],
    function (app) {
        app.launch();
    });
```
Or the new way using ES6 generators syntax:
```
//main.js
require(function* () {
    var app = yield require('app');
    app.launch();
});
```
Which, to me, is much cleaner code compared with passing dependencies as parameters. If you take a short look at the code you will see that, because of the synchronous style of coding, you can even have conditional dependencies.
It is worth noting that being able to have conditional dependencies makes the whole system more dynamic, but on the other hand it goes through a totally different path of loading a module. For instance, one of the possible debates around this approach would be how we could concatenate this type of module when we are not sure about its dependencies. Although I believe this type of issue and debate happens whenever you cross the border and come up with a new way of thinking.
The new version, with a whole bunch of examples and a couple of new features, is ready to use. DefineJS now offers Promised Modules and the `use()` syntax.
##Promised Modules
Using the same AMD module style, you can have privileged promise-based modules.
All you need to do is return a promise in your modules to make them promised modules.
To see how it works, just check out the simple-promised-module example in the examples folder.
In this example we have a promised module named promisedModule.js, which is responsible for waiting for a specific global variable and serving it as the module's promised value.
```
define([ /*'dependency'*/ ], function ( /*dependency*/ ) {
    return new Promise(function (fulfill, reject) {
        // Here you expect to have a global variable named
        // myApp after 2 seconds;
        // otherwise your module definition gets rejected
        setTimeout(function () {
            if (window.myApp !== undefined) {
                // fulfill when succeeded and pass the fulfillment value
                fulfill({
                    app: window.myApp,
                    log: 'This is just a sample promised object!'
                });
            } else {
                // reject in case of error or unsuccessful operations
                reject(new Error('No global myApp object found!!'));
            }
        }, 2000);
    });
});
```
Now you can easily require it, or add it as a dependency. What happens is that the loader waits for your promise to get resolved, and then you have the promised module object.
```
// main.js
require(['promisedModule'],
    function (promisedModule) {
        console.log(promisedModule.log);
        // => This is just a sample promised object!
        console.log(promisedModule.app);
    });
```
###Note:
We are still discussing the proper way of handling the rejected state of a promised module. Any feedback or proposals are really appreciated.
##use() vs require()
You can also have the same module flow using a new syntax offered by DefineJS:
```
use(['dependency1', 'dependency2'])
    .then(function (dependency1, dependency2) {
        // ...
        return dependency1.util;
    })
    .then(function (util) {
        // ...
        // use the util object if it has any useful functionality
        return util.map([ /*...*/ ]);
    })
    .catch(function (e) {
        // in case of having a rejected promised module or any async error
        console.error(e);
    });
```
##AMD
The primary building block for referencing and defining modular JavaScript code.
> The Asynchronous Module Definition (AMD) API specifies a mechanism for defining modules such that the module and its dependencies can be asynchronously loaded.
No need to discuss the definition further; it is accurate enough to see the starting point clearly.
It is all about writing clean, testable, understandable and maintainable code. There might be more descriptive adjectives here, but what we actually mean when we discuss these points could, to a large extent, be summarized in one single principle: writing modular code.
As a longtime JavaScript developer I can remember debates, and sometimes actual wars, around this very topic, when I would hear the other team members, mostly from a more structured programming background, whispering:
> WTF? What the hell is he talking about! He asks us to do the impossible: writing clean code in JavaScript. Has anyone seen it, for real?
To be honest, the last time I heard someone say that was just two weeks ago, admittedly from someone with a Shell/Ruby/Python background and not a JavaScript developer, but it was still a lot to me.
These days, as JavaScript developers we can implement almost anything, a robot or an end-to-end enterprise solution with a friendly and fun JavaScript full stack. It means that JavaScript scales up as it finds more applications in different areas. And as it goes on, we necessarily need a general mechanism with a shared understanding around it, a general mechanism for defining independent and interchangeable pieces which can work together perfectly. This is what modular programming gives us:
> Modular programming is a software design technique that emphasizes separating the functionality of a program into independent, interchangeable modules, such that each contains everything necessary to execute only one aspect of the desired functionality. Conceptually, modules represent a separation of concerns, and improve maintainability by enforcing logical boundaries between components.
##Writing Modular JavaScript
This mechanism has already been thought out, and we now have a couple of great modular coding formats; to me, these three are the most exciting ones:
- AMD: The Asynchronous Module Definition (AMD) API specifies a mechanism for defining modules such that the module and its dependencies can be asynchronously loaded. This is particularly well suited for the browser environment where synchronous loading of modules incurs performance, usability, debugging, and cross-domain access problems.
- CommonJS: Unfortunately, it was defined without giving browsers equal footing to other JavaScript environments ….
- ES6 Modules: What we are going to have in the next version of JavaScript, Harmony.
These three formats each seem to have a bunch of pros and cons, but more importantly each has its own syntax, which makes it difficult to use them interchangeably.
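Just to make the syntax differences concrete, here is one trivial module written in each of the three formats (the dependency name and the `sum` function are made up for illustration):
```
// AMD
define(['dep'], function (dep) {
    return {
        sum: function (a, b) { return a + b; }
    };
});

// CommonJS
var dep = require('dep');
module.exports = {
    sum: function (a, b) { return a + b; }
};

// ES6 modules (Harmony)
import dep from 'dep';
export function sum(a, b) { return a + b; }
```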
##DefineJS
DefineJS is a lightweight implementation of the AMD module format. Besides the regular AMD module pattern, DefineJS also offers a couple of nonstandard but useful modular coding patterns.
###Notes
There are a couple of important points which I have faced during the implementation of this module loader. All of them bring up one simple question:
As library authors, are we better off implementing everything needed, the best and the worst practices all mixed together?
OR
Some might say a great library is one which prevents its developers from drowning in bad code.
Since a module loader needs to be compatible even with possible uses of what is already known as a bad practice, my answer when implementing DefineJS was YES to the first question.
For instance, when working with an AMD module loader, you can explicitly name modules yourself, but doing so makes the modules less portable: if you move the file to another directory you will need to change the name. BUT the option is still there, and you can use it to define named modules.
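For reference, a named AMD module simply takes the module ID as an explicit first argument to `define` (the name and dependency below are hypothetical):
```
// Explicitly named module: the ID 'utils/logger' is hard-coded,
// so moving this file means updating the name everywhere it is used
define('utils/logger', ['dep'], function (dep) {
    return {
        log: function (msg) { console.log(msg); }
    };
});
```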
Dave Herman says:
> For the Web to compete with native platforms, I believe we have to think big. This means building on our competitive strengths like URLs and dynamic loading, as well as taking a hard look at our platform’s weaknesses — lack of access to modern hardware, failures of the offline experience, or limitations of cross-origin communication, to name a few.
As JavaScript enthusiasts and front-end developers, this is what we are mostly concerned with, and just by taking a quick look at the latest JavaScript trends and the technology stacks around them, we can see the unleashed power of JavaScript as a programming language.
This is my first post here, and I thought, why not start off with one of the latest exciting talks, which caught my eye when I first saw the title:
Jaswanth Sreeram: Parallel JavaScript
Let’s imagine the possible implications of this talk.
Not being able to create parallel threads in JavaScript might be considered a missing feature, but the JavaScript concurrency model and event loop have always helped us meet our projects' requirements.
Amazingly, Web Workers (The Basics of Web Workers) also provide a simple means for web content to run scripts in background threads. They still have their own rules and restrictions, but in terms of optimization and performance, Web Workers are really helpful.
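As a tiny illustration of that background-thread idea, a minimal Web Worker setup looks roughly like this (the file names are made up; the Worker API itself is standard):
```
// main.js: spawn a background thread and talk to it via messages
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('result from worker:', e.data);
};
worker.postMessage({ numbers: [1, 2, 3, 4] });
```
```
// worker.js: runs off the main thread
onmessage = function (e) {
    var sum = e.data.numbers.reduce(function (a, b) {
        return a + b;
    }, 0);
    postMessage(sum);
};
```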
Using JavaScript's asynchronous APIs also gives us an interesting way of achieving concurrency in a single thread, which is possible entirely because of JavaScript's concurrency model and event loop.
Considering all of these, when I first read the title (Parallel JavaScript) I thought it would be somehow related to one of the points above, BUT interestingly it is not.
That's why I found this talk really interesting. Just imagine what “Parallel JavaScript on the GPU” would bring us, even without us having to deal with all the work of making it compatible with different hardware.
I really liked it because I believe it is totally aligned with JavaScript's future path.
It is highly recommended to watch, and in case you would rather read, you can find the transcript here:
Jaswanth Sreeram: Parallel JavaScript - Transcript