Saturday, August 25, 2012

Automatic Github Pages generation from Sphinx documentation

Sphinx is a very common documentation tool which gobbles up reStructuredText and other free-form markup formats and outputs great HTML, PDF and other formats. It is well suited to reference manuals and API documentation due to its good integration with source code (especially Python). The libuv book is written using Sphinx, which is why it looks so good with minimal effort.

Sphinx uses make to generate the HTML, which is great. The only problem is that deploying this to Github Pages requires multiple commands: switch branches to gh-pages, pull in the source text, then clean up the working copy and switch back to master. This gets boring after about one time, so I automated it, and I think other projects can benefit from it as well. Once you follow the instructions, running:

make gh-pages

will take the latest commit, switch to the gh-pages branch, generate HTML, push it to Github, then clean everything up and switch back to master.

NOTE: You need to commit or revert any working copy modifications before running this.

One-time commands

These steps only need to be run the first time you want to generate Github Pages. First set up the branch to have no parents. Let me stress again the importance of making sure all changes are committed! Otherwise they’ll be lost.

$ cd repo
$ git checkout --orphan gh-pages
$ git rm -rf .
$ echo "First commit" > index.html
$ git add .
$ git commit -m "Just to create the branch."
$ git push origin gh-pages

Now the gh-pages branch is set up. We can start generating the actual pages instead of the current index.html.
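
If you want to sanity-check the result, the orphan branch should contain exactly this one commit, with no parents (the hash shown is illustrative):

$ git log --oneline gh-pages
1a2b3c4 Just to create the branch.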

Set the source files

Edit the Sphinx Makefile and add a variable GH_PAGES_SOURCES. This should be a list of the files/directories that contain the documentation sources. Usually this will be only source, which contains the Sphinx reST docs, but if you are embedding external code or images, those directories have to be listed as well. In addition, the Makefile itself has to be in the list. For the libuv book it is:

GH_PAGES_SOURCES = source code libuv Makefile

Add the target

Create a target gh-pages with the following commands (remember to use TABs in Makefiles):

gh-pages:
    git checkout gh-pages
    rm -rf build _sources _static
    git checkout master $(GH_PAGES_SOURCES)
    git reset HEAD
    $(MAKE) html
    mv -fv build/html/* ./
    rm -rf $(GH_PAGES_SOURCES) build
    git add -A
    git commit -m "Generated gh-pages for `git log master -1 --pretty=short --abbrev-commit`" && git push origin gh-pages ; git checkout master

Here is how it goes. The checkout simply switches branches. Then we remove all the old data to prevent any stale build artifacts. Since the gh-pages branch won’t have any of your original data, only the HTML output, we need to pull the sources from the master branch; the git reset then unstages them, since we want them in the working copy but not in the commit. Then we generate the HTML and move it from the build folder to the top level. Then we remove all the sources and the now empty build folder, and stage all the changes. Finally, the last line creates a commit whose message references the latest commit on master, and pushes to Github. The git checkout master command is on the same line, separated by a semicolon, because if the push were to fail for some reason (network error, DAG inconsistency), I still want to be returned to master in my working copy. If you don’t want this to happen, feel free to move it to a new line on its own.
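
One caveat worth knowing, as an extra step beyond the Makefile above: GitHub Pages runs the published branch through Jekyll, and Jekyll ignores directories starting with an underscore, such as the _static and _sources folders Sphinx produces. If your CSS or JavaScript fails to load, commit an empty .nojekyll file to the gh-pages branch to disable that processing:

$ git checkout gh-pages
$ touch .nojekyll
$ git add .nojekyll
$ git commit -m "Disable Jekyll processing" && git push origin gh-pages
$ git checkout master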

Done!

You can now get back to your main task: writing great documentation. Whenever you have an urge to show it to the world, simply run make gh-pages and your latest documentation is served fresh!

Friday, August 17, 2012

Installing node.js on the Raspberry Pi (Archlinux)

My Raspberry Pi arrived a few weeks ago and I had some problems getting node.js to build on it. This post is specifically about building node.js from source on Archlinux.

All activity and commands in this tutorial are run on the Raspberry Pi, either via SSH or physical keyboard. It is possible to cross-compile on your laptop/desktop for ARM, but I preferred not to in this case.

Prerequisites

Install the base-devel group so that you have a working gcc and friends, along with openssl and zlib. Also install python2-virtualenv and git-core.
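
Something like this should pull everything in; the package names are from memory, so check them against the repositories if pacman complains:

$ su -c 'pacman -S base-devel openssl zlib python2-virtualenv git'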

Get the source

I usually stay on the node bleeding edge, so:

$ git clone git://github.com/joyent/node
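
If you would rather build a known release than whatever master happens to be, check out a release tag first (v0.8.8 is just an example; list available tags with git tag):

$ cd node
$ git checkout v0.8.8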

Use Python2

Since Archlinux uses python3 by default, the configure and build scripts screw up. You can edit the individual files, but I find it easier to just use virtualenv, which will set up the shell environment to use python2 and its libraries.

$ virtualenv2 env
$ source env/bin/activate

Here env is the name of the directory virtualenv will use to store scripts. You can use any other name. Sourcing the activate script sets up the environment so that python actually refers to python2.
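
You can quickly verify that the right interpreter is now first in the path (the exact version depends on your install):

(env) $ python --version
Python 2.7.3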

Patch the source

Geoff Flarity has created a patch to tweak the build parameters a bit. You can either follow its instructions to apply the patch, or just comment out the 12 vfp3:on... lines yourself, which is what I did. Also set the environment variables to build for ARM, as described in the README:

export GYP_DEFINES="armv7=0"
export CCFLAGS='-march=armv6'
export CXXFLAGS='-march=armv6'

NOTE: All commands should be run in the same shell that has the virtualenv environment and the above variables set.

Use system zlib and openssl

Attempting to compile the bundled openssl failed on my Archlinux setup with some ARM assembly errors. Using the installed libs also reduces the compilation time. So:

$ ./configure --shared-openssl \
              --shared-openssl-includes=/usr/include/openssl \
              --shared-openssl-libpath=/usr/lib \
              --shared-zlib \
              --shared-zlib-includes=/usr/include \
              --shared-zlib-libpath=/usr/lib

Build

$ make
$ su -c 'make install'

The build will take some time (30 minutes to an hour), so be patient. Once it is done you will have a fully working node runtime on your Raspberry Pi. Use it to power your home automation control server or whatever else. I use it to experiment with my DHT implementation.
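
To confirm everything works, ask the freshly installed runtime about itself (the outputs shown are what I would expect, with the version matching whatever you checked out):

$ node -v
v0.8.8
$ node -e "console.log(process.arch)"
arm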

Using node.js rather than traditional languages like C/C++ to write low-level system services or networking code is very interesting, because it provides a safer, garbage-collected runtime with efficient I/O and a less verbose language. Inexpensive devices like the Raspberry Pi let those on a tight budget experiment with multiple devices or peer-to-peer configurations, rather than being bound to localhost-only testing. So get a Pi and have fun.