The Fennel programming language recently celebrated its fifth birthday, and we ran a survey to learn more about the community and what has been working well and what hasn't. Fennel's approach has always been one of simplicity: not just in the conceptual footprint of the language, but in reducing dependencies and moving parts, and relying on a runtime that fits in under 200kb. To reflect this, the Fennel web site is hosted as static files on the same Apache-backed shared hosting account I've been using for this blog since 2005.
Of course, generating HTML from lisp code is one of the oldest tricks in the book[1], so I won't bore anyone with the details there. But what happens when you want to mix in something that isn't completely static, like this survey? Well, that's where it gets interesting.
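If you haven't seen the trick before, the idea is to represent each element as a plain table: the tag name first, then an optional attribute table, then the children, and a small function walks that structure to emit markup. Here's a minimal sketch of the idea; it is not the actual generator used for fennel-lang.org, just an illustration of the shape:

;; Sketch only: render a [:tag {attrs} children...] table into an HTML string.
;; An attribute table is distinguished from a child element by having no
;; positional (sequential) entries.
(fn render [node]
  (if (= :string (type node))
      node
      (let [tag (. node 1)
            attrs? (and (= :table (type (. node 2)))
                        (= nil (. node 2 1)))
            attrs (if attrs? (. node 2) {})
            attr-strs (icollect [k v (pairs attrs)]
                        (string.format " %s=\"%s\"" k (tostring v)))
            first-child (if attrs? 3 2)
            children (icollect [i child (ipairs node)]
                       (when (<= first-child i) (render child)))]
        (string.format "<%s%s>%s</%s>"
                       tag (table.concat attr-strs)
                       (table.concat children) tag))))

;; (render [:p {:class "intro"} "Hello, " [:b "world"] "!"])
;; => "<p class=\"intro\">Hello, <b>world</b>!</p>"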
I put the shotgun in an Adidas bag and padded it out with four pairs of tennis socks, not my style at all, but that was what I was aiming for: If they think you're crude, go technical; if they think you're technical, go crude. I'm a very technical boy. So I decided to get as crude as possible. These days, though, you have to be pretty technical before you can even aspire to crudeness.[2]
When I was in school, I learned how to write and deploy Ruby web programs. The easiest way to get that set up was CGI. A CGI script is just a process launched by the web server: the request comes in via stdin and environment variables, and the response goes out over stdout. But larger Ruby programs tended to have very slow boot times, which didn't fit well with CGI's model of launching a fresh process for every request, and eventually other models replaced CGI. Most people now regard CGI as outmoded, but it fits Fennel's ethos nicely and complements a mostly-static-files approach.
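To make that contract concrete, here's a minimal hedged sketch of a CGI script (not the survey's actual script, which appears below): the server puts request metadata such as REQUEST_METHOD in environment variables, the body arrives on stdin, and whatever is printed to stdout becomes the response, headers first, then a blank line, then the body.

;; Sketch of the CGI contract, not the survey script.
(let [method (os.getenv "REQUEST_METHOD")
      body (if (= method "POST") (io.read "*all") "")]
  ;; print adds a newline, so the \n here produces the blank line
  ;; that separates headers from the response body.
  (print "Content-Type: text/plain\n")
  (print (.. "You sent a " method " request with " (length body) " bytes.")))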
So the survey generates an HTML form in a static file which points to a CGI script as its action. The CGI script looks like this, but it gets compiled to Lua as part of the deploy process to keep the dependencies light.
(let [contents (io.read "*all")
      date (io.popen "date --rfc-3339=ns")
      id (: (date:read "*a") :sub 1 -2)]
  (with-open [raw (io.open (.. "responses/" id ".raw") :w)]
    (raw:write contents))
  (print "status: 301 redirect")
  (print "Location: /survey/thanks.html\n"))
As you can see, all this does is read the request body using io.read, create a file with the current timestamp as the filename (we shell out to date because the built-in os.time function lacks subsecond resolution), and print a canned response redirecting the browser to another static HTML page. We could have printed HTML for the response body, but why complicate things?
At this point we're all set as far as gathering data goes. But what do we do with these responses? Well, a typical approach would be to write them to a database rather than the filesystem, and to create another script which reads from the database whenever it gets an HTTP request and emits HTML which summarizes the results. You could certainly do this in Fennel using nginx and its postgres module, but it didn't feel like a good fit for this. A database has a lot of moving parts and complex features around consistency during concurrent writes which are simply astronomically unlikely[3] to happen in this case.
At this point I think it's time to take a look at the Makefile:
upload: index.html save.cgi thanks.html 2021.html
	rsync -rAv $^ fennel-lang.org:fennel-lang.org/survey/

index.html: survey.fnl questions.fnl
	../fennel/fennel --add-fennel-path "../?.fnl" $< > $@

save.cgi: save.fnl
	echo "#!/usr/bin/env lua" > $@
	../fennel/fennel --compile $< >> $@
	chmod 755 $@

pull:
	rsync -rA fennel-lang.org:fennel-lang.org/survey/responses/ responses/

2021.html: summary.fnl chart.fnl questions.fnl responses/* commentary/2021/*
	../fennel/fennel --add-fennel-path "../?.fnl" $< > $@
So the pull target takes all the raw response files from the server and brings them into my local checkout of the web site on my laptop. The 2021.html target runs the summary.fnl script locally to read thru all the responses, parse them, aggregate them, and emit static HTML containing inline SVG charts. Then the upload target puts the output back on the server. Here's the code which takes that raw form data from the CGI script and turns it into a data structure[4]:
(fn parse [contents] ; for form-encoded data
  (fn decode [str]
    (str:gsub "%%(%x%x)" (fn [v] (string.char (tonumber v 16)))))
  (let [out {}]
    (each [k v (contents:gmatch "([^&=]+)=([^&=]+)")]
      (let [key (decode (k:gsub "+" " "))]
        (when (not (. out key))
          (tset out key []))
        (table.insert (. out key)
                      (pick-values 1 (decode (v:gsub "+" " "))))))
    out))
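For example, with made-up field names rather than the real survey questions, parsing a body like "language=fennel&editor=emacs&editor=vim&name=Phil+W" returns {"language" ["fennel"] "editor" ["emacs" "vim"] "name" ["Phil W"]}: every key maps to a list of answers so checkbox-style questions with multiple selections still work. The summary script then has to turn a pile of parsed responses into per-answer counts, which is the shape the chart code below expects. A rough sketch of that aggregation step might look like this; the count-answers helper and the use of ls are my own invention, not code from the actual summary.fnl:

;; Sketch: tally how many responses gave each answer to one question,
;; producing the answer->count table the bar chart code takes as `data`.
(fn count-answers [question]
  (let [counts {}]
    (each [filename (: (io.popen "ls responses/") :lines)]
      (with-open [f (io.open (.. "responses/" filename))]
        (let [response (parse (f:read "*all"))]
          (each [_ answer (ipairs (or (. response question) []))]
            (tset counts answer (+ 1 (or (. counts answer) 0)))))))
    counts))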
The final piece I want to mention is the charts in the survey results. I wasn't sure how I'd visualize the results, but I had some experience writing SVG from programmatically generating keyboard cases to cut on my laser cutter. If you've never looked closely at SVG before, it's a lot more accessible than you might expect. This code takes the data from the previous function, after it's been aggregated by response count, and emits a bar chart with counts for each response. The results page has plenty of examples of these charts; inspect its source to see how they look if you're curious.
I had never tried putting SVG directly into HTML before, but I found you can just embed an <svg> element like any other. The <desc> elements even allow it to be read by a screen reader.
(fn bar-rect [answer count i]
  (let [width (* count 10)
        y (* 21 (- i 1))]
    [:g {:class :bar}
     [:rect {: width :height 20 : y}]
     [:text {:x (+ 5 width) :y (+ y 12) :dy "0.35em"}
      (.. answer " (" count ")")]]))

(fn bar [i data ?sorter]
  ;; by default, sort in descending order of count of responses, but
  ;; allow sorting to be overridden, for example with the age question
  ;; the answers should be ordered by the age, not response count.
  (fn count-sorter [k1 k2]
    (let [v1 (. data k1) v2 (. data k2)]
      (if (= v1 v2)
          (< k1 k2)
          (< v2 v1))))
  (let [sorter (or ?sorter count-sorter)
        answers (doto (icollect [k (pairs data)] k)
                  (table.sort sorter))
        svg [:svg {:class :chart :role :img :aria-label "bar graph"
                   :aria-describedby (.. "desc-" i)
                   :width 900 :height (* 21 (+ 1 (length answers)))}]
        descs []]
    (each [i answer (ipairs answers)]
      (table.insert svg (bar-rect answer (. data answer) i))
      (table.insert descs (.. answer ": " (. data answer))))
    (table.insert svg [:desc {:id (.. "desc-" i)}
                       (table.concat descs ", ")])
    svg))

{: bar}
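To give a feel for the output, with made-up numbers rather than real survey data, a call like (bar 1 {"yes" 3 "no" 1}) returns a nested table in the same [:tag {attrs} children] shape used elsewhere on the site, which the HTML generator then serializes into markup roughly like this (exact attribute order and formatting depend on the generator):

<svg class="chart" role="img" aria-label="bar graph" aria-describedby="desc-1"
     width="900" height="63">
  <g class="bar"><rect width="30" height="20" y="0"></rect>
    <text x="35" y="12" dy="0.35em">yes (3)</text></g>
  <g class="bar"><rect width="10" height="20" y="21"></rect>
    <text x="15" y="33" dy="0.35em">no (1)</text></g>
  <desc id="desc-1">yes: 3, no: 1</desc>
</svg>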
In the end, other than the actual questions of the survey, all the code clocked in at just over 200 lines. If you're curious to read thru the whole thing you can find it in the survey/ subdirectory of the fennel-lang.org repository.
As you can see from reading the results, one of the things people wanted to see more of with Fennel was some detailed example code. So hopefully this helps with that, and people can learn both about how the code is put together and about the unusual approach to building it out.
[1] In fact, the HTML generator code which is used for Fennel's web site was written in 2018 at the first FennelConf.
[2] from Johnny Mnemonic by William Gibson
[3] If we had used os.time with its second-level granularity instead of date with nanosecond precision then concurrent conflicting writes would have moved from astronomically unlikely to merely very, very unlikely, with the remote possibility of two responses overwriting each other if they arrived within the same second. We had fifty responses over a period of 12 days, so this never came close to happening, but in other contexts it could have, so choose your data storage mechanism to fit the problem at hand.
[4] This code is actually taken from the code I wrote a couple years ago to handle signups for FennelConf 2019. If I wrote it today I would have made it use the accumulate or collect macros.
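For the curious, a hedged sketch of what that might look like with accumulate (my own rewrite, not code from the repository):

;; Sketch: same parsing logic, but threading the output table through
;; the loop with accumulate instead of mutating a let-bound table.
(fn parse [contents]
  (fn decode [str]
    (pick-values 1 (str:gsub "%%(%x%x)" (fn [v] (string.char (tonumber v 16))))))
  (accumulate [out {}
               k v (contents:gmatch "([^&=]+)=([^&=]+)")]
    (let [key (decode (k:gsub "+" " "))
          answers (or (. out key) [])]
      (table.insert answers (decode (v:gsub "+" " ")))
      (doto out (tset key answers)))))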