These days I get all my Emacs packages from
MELPA. The only problem is
keeping track of them somewhere! Up until today I'd been storing the
~/.emacs.d/elpa directory in Git, but that just won't do:
- Package updates require frequent, unwanted commits
- Frequent commits cause the Git repo to grow quite large
So today I set out to solve this problem by storing a list of required packages, checking if the packages exist on startup, and installing them as necessary.
The first thing to realize is that there are two things whose existence we need to check for:

- The cached archive-contents file for every archive in package-archives
- The required packages
Let's start with the required packages, which is just a list of the packages I want installed (all from MELPA):
    (defvar required-package-names
      (list 'color-theme-solarized
            'ace-jump-mode
            'csharp-mode
            'expand-region
            'gnuplot-mode
            'go-mode
            'haskell-mode
            'htmlize
            'log4j-mode
            'markdown-mode
            'modeline-posn
            'multi-web-mode
            'nrepl
            'paredit
            'powershell-mode
            'smex
            'zencoding-mode)
      "List of package.el packages that should be installed if not present")
Next we need a function to verify that the required packages are installed. It must:

- Call package-initialize to initialize package.el
- Add MELPA to package-archives, the list of archives that packages can be installed from
- Check if the cached package lists exist or need to be fetched
- Loop over required-package-names and install any packages that aren't already present
    (defun verify-required-packages ()
      "Verify that all required package.el packages are installed and install them if necessary"
      (package-initialize)
      (install-package-archives)
      (unless (package-contents-exist-p)
        (package-refresh-contents))
      (dolist (pkg required-package-names)
        (unless (package-installed-p pkg)
          (package-install pkg))))
Pretty straightforward stuff. Notice that tasks #2 and #3 have been
farmed out to install-package-archives and package-contents-exist-p.
The latter has been reproduced below. (The former is super simple and
not germane here)
    (defun package-contents-exist-p ()
      "Determine if cached package.el archive contents exist."
      (let ((exist-p t))
        (dolist (archive package-archives)
          (let* ((dir (concat "archives/" (car archive)))
                 (contents-file (concat dir "/archive-contents"))
                 (filename (expand-file-name contents-file package-user-dir)))
            (unless (file-exists-p filename)
              (setq exist-p nil))))
        exist-p))
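For completeness, install-package-archives could be as simple as the following sketch. (This is my assumption, not the actual definition; the MELPA URL in particular may differ)

```elisp
(defun install-package-archives ()
  "Add MELPA to the list of archives packages can be installed from."
  (add-to-list 'package-archives
               '("melpa" . "https://melpa.org/packages/") t))
```

With all three functions in place, a single call to (verify-required-packages) during startup takes care of everything.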
This solution addresses the shortcomings associated with storing all the packages in Git. Overhead is minimal: we take a one-off hit the first time we start Emacs with a freshly cloned .emacs.d repo. As usual, you can find all the code on GitHub.
My previous attempts at blogging have always been a potpourri of technical posts, political rants, musings, and divulgences. I tagged posts appropriately so that my readers (all two of them) could choose to ignore things they weren't interested in, but in retrospect I found many of the non-technical posts embarrassing and wound up deleting them. What I'd like to do instead is to have separate blogs and keep each one topical. To this end, I've added multi-site support to Coleslaw. (See my forked repo on GitHub)
I'm going to describe the changes made on a per file basis.
Coleslaw is invoked via a Git post-receive hook. (A Bourne shell
script) On startup it reads its config file, coleslawrc, which
contains a single plist of blog parameters:
    (:author "Ralph Moritz"
     :deploy "/home/coleslaw/www/lisp-is-fun/"
     :domain "http://blub.co.za"
     :feeds ("lisp")
     :plugins (mathjax)
     :repo "/home/coleslaw/tmp/lisp-is-fun/"
     :sitenav ((:url "http://twitter.com/ralph_moeritz" :name "Twitter")
               (:url "http://github.com/ralph-moeritz" :name "Code"))
     :title "(lisp :is 'fun)"
     :theme "hyde")
The first thing I realized was that coleslawrc will need to contain multiple plists (one per blog), keyed by some unique value. (title?)
Each blog will have its own Git repo with a post-receive script that invokes Coleslaw, supplying the config key in the process.
Having a look at my post-receive script (customized for CCL), I
realized that TMP_GIT_CLONE would make a good key since it's already
present in both the post-receive script and coleslawrc. (See the
:repo argument in the previous code snippet)
    GIT_REPO=$HOME/lisp-is-fun.git
    # TMP_GIT_CLONE _must_ match the :repo arg in coleslawrc excluding trailing slash
    TMP_GIT_CLONE=$HOME/tmp/lisp-is-fun
    git clone $GIT_REPO $TMP_GIT_CLONE
    echo "(ql:quickload :coleslaw) (coleslaw:main)" | ccl -b
    rm -Rf $TMP_GIT_CLONE
    exit
Brit told me a while ago that he doesn't like having to maintain multiple versions of the post-receive script, so I've consolidated both of our versions into an "überscript" that:

- Invokes the correct command based on the value of $0 (the name used to invoke the script)
- Passes the value of TMP_GIT_CLONE to coleslaw::main, which now accepts a single argument.
I won't make this post any longer by including the script, but if you're interested you can view it here.
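The central trick — dispatching on $0 — can be sketched as follows. (This is a simplified stand-in for the real script; the symlink names and Lisp invocations are assumptions on my part)

```shell
#!/bin/sh
# Sketch: the same script is symlinked as post-receive-ccl,
# post-receive-sbcl, etc. in each blog repo's hooks directory;
# $0 tells us which link was used to invoke it.
lisp_for() {
    case $(basename "$1") in
        post-receive-ccl)  echo "ccl -b" ;;
        post-receive-sbcl) echo "sbcl --script /dev/stdin" ;;
        *)                 echo "unknown" ;;
    esac
}

TMP_GIT_CLONE=$HOME/tmp/lisp-is-fun

# Only dispatch when actually invoked via one of the hook symlinks.
case $(basename "$0") in
    post-receive-*)
        echo "(ql:quickload :coleslaw) (coleslaw:main \"$TMP_GIT_CLONE\")" \
            | $(lisp_for "$0")
        ;;
esac
```

Note how the value of TMP_GIT_CLONE is handed to coleslaw:main as the config key.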
As mentioned above, coleslaw::main takes the config key as an
argument; it passes this value to coleslaw::load-config, which is
responsible for:

- Reading the plist from coleslawrc
- Using the plist as an argument to make-instance in order to create a blog instance
- Calling coleslaw::load-plugins to compile and load the plugins listed in the plist.
Since coleslawrc now potentially contains more than one plist, we need to decide on a new format. I decided to support two distinct scenarios:
- Single site
- Multiple sites
For the former I decided to retain the current coleslawrc format to
avoid breaking existing deployments. For the latter I decided to use
an alist keyed by the cloned repo path. (No need to specify the
:repo property in this case)
    (("/path/to/repo/one/"   . (:plist sans :repo))
     ("/path/to/repo/two/"   . (:plist sans :repo))
     ("/path/to/repo/three/" . (:plist sans :repo))
     ;; ...
     )
As mentioned previously, the meat is in coleslaw::load-config, which
now has to differentiate between single and multi-site setups and load
the correct plist, either by simply reading the plist in as usual, or by
looking it up in the containing alist by the config key (repo path). I
won't bore you with the details when the source says it all:
    (defun load-config (config-key &optional (dir (user-homedir-pathname)))
      "Load the coleslaw configuration for CONFIG-KEY from DIR/.coleslawrc. DIR is ~ by default."
      (with-open-file (in (merge-pathnames ".coleslawrc" dir))
        (let ((config-form (read in)))
          (if (symbolp (car config-form))
              ;; Single site config: ignore CONFIG-KEY.
              (setf *config* (apply #'make-instance 'blog config-form))
              ;; Multi-site config: load config section for CONFIG-KEY.
              (let ((config-key-pathname (cl-fad:pathname-as-directory config-key))
                    (section (assoc config-key-pathname config-form
                                    :key #'(lambda (str) (cl-fad:pathname-as-directory str))
                                    :test #'equal)))
                (if section
                    (progn
                      (setf *config* (apply #'make-instance 'blog (cdr section)))
                      (setf (slot-value *config* 'repo) config-key))
                    (error 'unknown-config-section-error
                           :text (format nil "In ~A: No such key: '~A'." in config-key)))))
          (load-plugins (plugins *config*)))))
The end result is that Coleslaw now supports multiple blogs, and thanks to its succinct codebase and the malleability of Lisp's data structures there really wasn't much work involved.
I've been playing around with cl-dbi, which looks like yet another excellent library by Eitarow Fukamachi of Clack fame. While writing a simple program to get better acquainted with the API, I couldn't help but notice some unnecessary duplication of effort and decided to write a small macro. The purpose of this post is to provide a simple example of macros for blub programmers, and to make the case that judicious use of macros is superior to rote memorization of so-called "best practices".
The cl-dbi sample code on GitHub looks like this:
    (defvar *connection*
      (dbi:connect :mysql
                   :database-name "test"
                   :username "nobody"
                   :password "1234"))

    (let* ((query (dbi:prepare *connection*
                               "SELECT * FROM somewhere WHERE flag = ? OR updated_at > ?"))
           (result (dbi:execute query 0 "2011-11-01")))
      (loop for row = (dbi:fetch result)
            while row
            ;; process "row".
            ))
Pretty standard for a database API: we connect, execute some queries and...oh, wait a minute - we forgot to disconnect! That's okay since this is just sample code, but in the real world we'd be in trouble if we left our connections hanging around. Our code should read:
    (defvar *connection*
      (dbi:connect :mysql
                   :database-name "test"
                   :username "nobody"
                   :password "1234"))

    ;; Execute some queries. (Omitted for brevity)

    (dbi:disconnect *connection*)
Much better, but hang on, aren't we still forgetting something? Of course: error handling! If the code executing the queries signals an error condition then we may never get around to disconnecting. So let's fix that:
    (defvar *connection*
      (dbi:connect :mysql
                   :database-name "test"
                   :username "nobody"
                   :password "1234"))

    (unwind-protect
         (progn
           ;; Execute some queries.
           )
      (dbi:disconnect *connection*))
Now we seem to have all our bases covered: the unwind-protect ensures
that dbi:disconnect gets called no matter what.
So is that it, have we found the high road? Not quite. In most blub languages this is the best we'd be able to do: we'd shrug our shoulders and resign ourselves to writing this bit of boring bookkeeping code over and over again. Thankfully, this is Lisp and we have macros to save us from mundania:
    (defmacro with-connection ((conn-sym &rest rest) &body body)
      `(let ((,conn-sym (dbi:connect ,@rest)))
         (unwind-protect
              (progn ,@body)
           (dbi:disconnect ,conn-sym))))
Neato! Now we can write:
    (with-connection (conn :mysql
                           :database-name "test"
                           :username "nobody"
                           :password "1234")
      (let* ((query (dbi:prepare conn
                                 "SELECT * FROM somewhere WHERE flag = ? OR updated_at > ?"))
             (result (dbi:execute query 0 "2011-11-01")))
        (loop for row = (dbi:fetch result)
              while row
              ;; process "row".
              )))
Which will expand to:
    (let ((conn (dbi:connect :mysql
                             :database-name "test"
                             :username "nobody"
                             :password "1234")))
      (unwind-protect
           (progn
             (let* ((query (dbi:prepare conn
                                        "SELECT * FROM somewhere WHERE flag = ? OR updated_at > ?"))
                    (result (dbi:execute query 0 "2011-11-01")))
               (loop for row = (dbi:fetch result)
                     while row
                     ;; process "row".
                     )))
        (dbi:disconnect conn)))
What we've done here is write some code to generate the boring bits for us. The benefits of this approach over memorizing so-called "best practices" should be fairly obvious.
- More concise: A no-brainer.
- Less error prone: We can't forget to disconnect anymore.
- Flexible: If we later decide to change our abstraction slightly, e.g. add support for ambient transactions to with-connection, this is no problem since we can just update our macro.
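To illustrate that last point, here's a sketch of a transactional variant. (A hypothetical extension, not part of the post's original code; it leans on cl-dbi's dbi:with-transaction macro)

```lisp
(defmacro with-transacted-connection ((conn-sym &rest rest) &body body)
  "Like WITH-CONNECTION, but run BODY inside a transaction."
  `(let ((,conn-sym (dbi:connect ,@rest)))
     (unwind-protect
          (dbi:with-transaction ,conn-sym
            ,@body)
       (dbi:disconnect ,conn-sym))))
```

Callers keep exactly the same shape as before; only the macro changed.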