December 20, 2011

Programming considered harmful

  • gotos make it possible to write bad programs
  • threads make it possible to write bad programs
  • global variables make it possible to write bad programs
  • anonymous functions make it possible to write bad programs
  • macros make it possible to write bad programs
  • mutable variables make it possible to write bad programs
  • continuations make it possible to write bad programs
  • dynamic scoping makes it possible to write bad programs
  • objects make it possible to write bad programs
  • recursion makes it possible to write bad programs
  • ...

Take this argument far enough, and you are left with the S-K combinators, and now it is impossible to write good programs.

Having few features in a programming language is a fault, not a virtue. The bigger fault lies in failing to provide the language with the facilities to be extended with new features.

No amount of language design can force a programmer to write clear programs.

--Guy Steele & Gerald Sussman

August 28, 2011

Programmer myopia

We like to assume that people are basically competent and rational. Computer programmers enjoy pretending they are more competent and rational than people in other professions.

In many cases neither of those assumptions is true.

Two popular memes surrounding programming languages in the 80s and 90s were the assertions that "garbage collection is too slow" and that "dynamic typing doesn't work for large programs."

Many programmers were convinced both those things were true. In hindsight they were completely wrong, as languages such as Tcl (more on the importance of Tcl in the history of programming languages in an upcoming post), Perl, Java and Python "dragged people halfway to Lisp" and changed public perception.

How could so many people who consider themselves above-average in competence and rationality be so wrong? (Assume that every programmer is indeed a special snowflake and the Dunning-Kruger effect doesn't apply).

A hermit spent 10 years writing a program. 'My program can compute the motion of the stars on a 286-computer running MS DOS,' he proudly announced. 'Nobody owns a 286-computer or uses MS DOS anymore,' Fu-Tzu responded.

Eloquent JavaScript, Marijn Haverbeke


The problem is that programmers seem unable to think even a couple of years into the future. People complaining about garbage collection in the 80s were looking back at their existing 8-bit Trash-80s instead of at the machines then being produced, let alone the ones being planned. The idea that computers can be good at automating rote tasks like managing memory and checking and inferring types never occurred to them.

People have trouble imagining the future even if the trends, such as Moore's law, are in front of them. It takes a very long time for people to understand the right ideas. Just ask Alan Kay. Being able to find the appropriate point of view really is better than a high IQ.

Here are some other myths about Lisp concepts that programmers believe out of ignorance, and that will take a long time to dispel:
  • tail recursion is unnecessary and makes debugging difficult
  • macro-based metaprogramming results in unmaintainable programs

August 12, 2011

Smart enough to predict stupidity

I think the root of your mistake is saying that macros don't scale to larger groups. The real truth is that macros don't scale to stupider groups.
--Paul Graham on ll1

People who design programming languages sometimes like to imagine an idealized "average programmer" who will employ their design. The underlying assumption is that the language designer is smarter than the "average programmer," and so must set out to protect the latter from their own incompetence.

The arrogance behind this view is twofold - not only does the language designer deem himself objectively smarter than other people, he also presumes to be able to predict how other people's stupidity will play out. In view of this egotism, the (lack of) quality of the end result should not be surprising.

This objection -- "but bad programmers will make a mess of it" -- is the stock objection everybody makes to every unorthodox programming construct. Since it is an objection to everything, it is an objection to nothing.
--Daniel Gackle on programming language features

April 16, 2011

Programming is a creative pursuit

There is still some debate around whether programming qualifies as a creative endeavor akin to writing, arts, or crafts. Paul Graham attempts to draw analogies between hacking and painting (unconvincingly, some argue).

The answer is a definite yes if you examine the motivational factors (examining motivation to get better insights is something I have emphasized before).

How else can you explain the motivation of people working on Free Software? Of people programming at work, and then going home and programming as a hobby? Of people working on multiple projects, related and unrelated, simultaneously, sometimes for years or decades at a time?

Another obvious but almost never discussed aspect of programming as a creative pursuit is that it is almost impossible to succeed in programming as a career if you do not enjoy your work. This is true for all creative professions, but can you argue the same for plumbers or assembly-line workers or, closer to the idiotic "knowledge worker" label, accountants?

Succeeding as a programmer of course has nothing at all to do with succeeding at being employed as a programmer, amusingly enough because of the widespread belief that programming is a non-creative profession and that 9 women can make 1 baby in 1 month. With perverse incentives such as "lines of code written" (when the only good thing about lines of code is how many you can remove) and no understanding by management of the impact of such things as technical debt, unit testing, or even basic things like quality, hapless code monkeys can stay on the payroll. But how many of them are recognized (in a positive way, mind you) by their peers? How many of them choose to continue to do programming into their 40s? The hapless code monkeys usually switch careers or "advance" themselves into the ultimate bastion of incompetence: management.

March 22, 2011

Abstraction, indirection, and programming languages

A common mistake programmers make is assuming that abstraction means putting a layer of indirection (whether through function calls or data structures) into a program. Indirection is one of the tools most commonly used to implement abstractions, but an extra layer of function calls doesn't by itself make a program more understandable, more maintainable, or a better fit for its domain. Those (admittedly subjective) criteria are much better measures of an effective abstraction than the number of layers of indirection.
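
To make the distinction concrete, here is a minimal sketch (the names and the hash-table representation are invented for illustration):

(defvar *customers* (make-hash-table :test 'equal))

;; Indirection without abstraction: a pass-through layer that adds a
;; function call but no new concept to think in.
(defun lookup (table key)
  (gethash key table))

;; An abstraction: the interface speaks the domain's language and hides
;; a representation decision (customers are keyed by lowercased email).
(defun find-customer (email)
  (gethash (string-downcase email) *customers*))

Both functions add exactly one layer of indirection; only the second one buys you anything.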

Another thing programmers like to do is argue about programming languages and the abstractions (over machine language) they provide. Oftentimes you'll hear "language A has feature X" and the refrain "language B doesn't have feature X and is good enough, so feature X isn't important/is harmful." When iterated enough times, it's easy to see that this becomes a sort of reductio ad assembler argument.

Is a sufficiently powerful assembler good enough? So what makes C better than assembler? Why is C special? And what's better than C?

It is productive to be able to both ask and answer the last question, which is why real metaprogramming is an invaluable thing to have.

December 11, 2010

Programming language evolution

Programming language evolution is sometimes claimed to be a search for greater abstraction.

This is wrong, and mirrors a common misunderstanding of biological evolution. There is no end-goal, no such thing as progress or improvement - only adaptation.

There are different kinds of models, and different kinds of environmental pressures: machine constraints, for example, or the degree to which a language's designers are ignorant of mathematics and of past programming languages.

A good example of this is MapReduce. The programming paradigm goes back to APL. Limited support for parallelism was added in the form of vector co-processors on IBM mainframes. The first "massively" parallel implementation of the idea was StarLisp.

Another example is Objective C. Brad Cox claims that the only reason he developed Objective C was because at the time Xerox wouldn't sell Smalltalk (Masterminds of Programming, p. 258).

In terms of parameters like dynamism and introspection, Lisp already had all the necessary features in 1958. From there the same basic idea of lambda abstraction has been applied to Actors and prototype-based languages like Self and JavaScript, but in terms of other "dynamic" programming languages, almost all of them are a step backward from 1958 Lisp.

The same thing can be said about C and Algol 68.

November 16, 2010

Character encoding is about algorithms, not data structures

One thing you might be aware of is that both SBCL and Clozure represent characters using 4 bytes. There's been significant discussion about this already, but I hope I can offer more insight into how you can apply this to your Lisp applications.

First, the one thing most people seem to agree on is that UTF-16 is evil (it should be noted that both CLISP and CMUCL use UTF-16 as their internal character representation).

The important thing about UTF-32 vs. UTF-8 and UTF-16 is that it is not primarily a question of string size, but of algorithms.

Variable-length encodings work perfectly fine for stream algorithms. But string algorithms are written on the assumption of constant-time random access, and of being able to set parts of a string to new values without consing. Neither assumption holds for variable-length encodings, and most string algorithms would not run over them with any acceptable level of performance.
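
To make the cost concrete, here is a sketch of random access over UTF-8 (assuming OCTETS holds valid UTF-8 and N is in bounds): finding the Nth character means scanning from the start, whereas (char string n) on a fixed-width string is a single array access.

(defun utf8-char-offset (octets n)
  "Return the byte offset of the Nth (0-based) character in OCTETS."
  (loop with i = 0
        repeat n
        ;; The lead byte determines how many bytes the character spans.
        do (incf i (let ((b (aref octets i)))
                     (cond ((< b #x80) 1)  ; ASCII
                           ((< b #xE0) 2)  ; 2-byte sequence
                           ((< b #xF0) 3)  ; 3-byte sequence
                           (t 4))))        ; 4-byte sequence
        finally (return i)))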

What about immutable strings? Random access is still not constant-time for variable-length encodings, and all the mutator operations are gone. In effect most immutable string algorithms actually end up being stream algorithms that cons a lot.

Chances are, you're already using stream algorithms on your strings even if you're not aware of it. Any kind of search over a string really treats that string as a stream. If you're doing string concatenation, you're really treating your strings as though they were immutable - consider using with-output-to-string to cut down on consing and to simplify your code.
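
For example, a toy sketch of the difference:

;; Each CONCATENATE allocates a fresh intermediate string:
;; (concatenate 'string (concatenate 'string "Hello, " name) "!")
;; With a string output stream, only the result is allocated:
(defun greet (name)
  (with-output-to-string (out)
    (write-string "Hello, " out)
    (write-string name out)
    (write-char #\! out)))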

One thing that is special about UTF-8 is that it is the de-facto character encoding standard of the web. When you're reading UTF-8 into a Lisp string, what you're really doing is decoding and copying each character.

Most web application patterns are based around searching for and extracting some string from the HTTP request, and either using that string as a retrieval key in a database or splicing it together with some other strings to form the reply. All these actions can be modeled in terms of streams.

One of the things that makes John Fremlin's tpd2 fast is that it dispenses with the step of decoding and copying the incoming UTF-8 data into a Lisp string (this is also what antiweb does). Using some compiler macros and the cl-irregsexp library, all the searching and templating is done on byte arrays that hold the UTF-8 encoded data (Lisp string literals are converted to UTF-8 byte arrays by compiler macros). The result is a UTF-8 byte array that can be sent directly back to the web browser, bypassing another character encoding and copying step.

I read somewhere that the Exokernel operating system permitted extending this concept further down the software stack by allowing applications to send back pre-formatted TCP/IP packets, although I don't know if that's actually possible or how much of a speedup it would give.

In addition to skipping the overhead of copying and re-encoding characters, working on UTF-8 byte sequences directly means you can use algorithms that depend on working with a small alphabet set to achieve greater performance (an example of this is the Boyer–Moore–Horspool string searching algorithm).
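
A sketch of the idea (not tpd2's actual code): Horspool's skip table needs one entry per alphabet symbol, and over octets that is a fixed 256-element array, no matter what characters the text contains.

(defun bmh-search (needle haystack)
  "Find octet vector NEEDLE in octet vector HAYSTACK; return its position or NIL."
  (let* ((m (length needle))
         (n (length haystack))
         (skip (make-array 256 :initial-element m)))
    (when (zerop m)
      (return-from bmh-search 0))
    ;; Bytes occurring in the needle (except its last byte) permit
    ;; shorter shifts; all other byte values shift by the full M.
    (loop for i below (1- m)
          do (setf (aref skip (aref needle i)) (- m 1 i)))
    (loop with pos = 0
          while (<= (+ pos m) n)
          do (if (loop for i from (1- m) downto 0
                       always (= (aref needle i) (aref haystack (+ pos i))))
                 (return pos)
                 (incf pos (aref skip (aref haystack (+ pos m -1))))))))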

May 2, 2010

Postmodern programming

Before the idea of postmodern programming can be investigated, we first have to ask whether anything like modernist or classical programming even exists.

It's not surprising that the question of how to define postmodern programming is reframed by the OO contingent (reflecting their ignorance as much as their immature self-obsession) as literally "what comes after object-orientation?" That is a silly question to ask when you don't know what object-oriented means or what came before.

James Noble and Robert Biddle attempted to address the issue in their Notes on Postmodern Programming, but focused on the act of writing programs and left the question of programming paradigms unexamined.

One possible interpretation gives surprisingly straightforward definitions: the development of the idea of algorithms constitutes the age of classical programming, while procedural and data abstraction, being the rationalization of the construction and application of algorithms, constitute modern programming.

What then is the narrative of a modern program? The evaluation strategy. Algorithms are executed in steps. Modernist programming promotes rationalization in laying out these steps. Postmodern programming rejects the linear evaluation strategy.

Curiously, we can arrive at the same conclusion by framing the relationship between procedural and functional programming in terms of a dialectic:

The thesis of procedural programming is the description of programs in terms of time (sequential execution of instructions, or branching) and behavior (the semantics of instructions), as entities operating on data - object identity, and state of addressable (nameable) places (registers, variables, arrays, etc.).

Functional programming presents a (quite literal) antithesis: programs are described in terms of data - identity (named functions) and state (first-class functions) - and operate on time (persistent/immutable data structures) and behavior (monads).

The synthesis is a nondeterministic, reified (homoiconic), first-class program. The program exists for the sake of itself, becomes the object of study and center of efforts. The computational narrative of the evaluation strategy escapes control, and so from the point of view of the programmer becomes irrelevant.

How, when, and why certain parts of the program are evaluated becomes subjective.

Perl wasn't the first postmodern programming language; Prolog was.

Non-determinism as the rejection of computational narrative and the reification of time arise naturally from physical constraints when attempting to reason about concurrency. Most of the modernist concurrency techniques concern themselves with maintaining what I call the global state illusion, or quite literally forcing a single narrative on a distributed system. Not only does the current state of the system remain unknown and unknowable, but its past history permits an exponential number of equally valid, relative interpretations.

Rich Hickey's 2009 JVM summit presentation explores these concepts of concurrency and time in a thought-provoking manner.

November 5, 2009

Recruiting Puzzles

One of the most interesting parts of Peter Seibel's Coders at Work (see my review on Slashdot) was Peter Norvig's discussion of the Google programmer hiring process.

The impact of a bad hire on a programming team can be extremely negative - vastly increased code maintenance costs and reduced morale and productivity. "Hire slowly" is now becoming an ingrained mantra in software companies. Despite this, effective processes for evaluating candidate programmers still seem to be little known.

Interview puzzle questions (made (in)famous by Microsoft) as a tool in the hiring process came up in Norvig's interview. Not surprisingly he is against the idea. For a while puzzle questions were a fad in programmer interviews, based on the (unfounded and unverified) assumption that they were a predictor of coding prowess.

To examine why puzzle questions are a dumb idea, you need to look at the objectives you are trying to fulfill when hiring programmers:

  1. Hire someone who is good at programming.

  2. Hire someone who is good at programming as part of your team.


There is no need to resort to unrelated tasks that correlate poorly with programming ability to see whether someone will help fulfill these objectives. A candidate can be assessed directly and efficiently by inviting them for three hours of pair programming.

Why are you still wasting time asking how many golf balls can fit in an airplane?

October 18, 2009

The history of programming language syntax

http://news.bbc.co.uk/2/hi/technology/8306631.stm

If only all the other "programming language designers" had the conscience to apologize for the pain their thoughtlessness has caused.

September 30, 2009

A better way to do screencasts

My friend Paddy Mullen recently released TerminalCast, a sort of online ttyrec player with synced audio (it is in fact based on a full in-browser JavaScript reimplementation of rxvt). This lets you do non-blurry screencast tutorials with full copy-and-paste support. I'm planning to make a Lisp web tutorial sometime in the near future.

Paddy will be giving a talk about TerminalCast at the next LispNYC meeting on October 13.

September 2, 2009

Coders at Work

Slashdot just ran my review of Peter Seibel's Coders at Work. The book comes out next week. Bottom line: pre-order your copy now!

June 8, 2009

Why your language needs macros

A couple of days ago someone posted yet another "why do I need macros?" thread on Hacker News. The usual unconvincing arguments and examples for macros were posted. Notably absent were the arguments against macros (all of which seem to be of the variety "programmers are too dumb to understand someone else's macros" - if you were at the Great Macro Debate at ILC 09 you would have heard it stated more eloquently).

I've been thinking about macros in terms of domain-driven design lately, kicking around my idea of "domain onions" (I'll write more about this later), so I decided to post my current thoughts about why every programming language that aspires to be general-purpose needs macros.

April 22, 2009

Review: Let Over Lambda

So after a lot of not reading Doug Hoyte's Let Over Lambda, I finally did manage to read it all the way through.

My overall impression first: Hoyte styles the book as a successor to PG's On Lisp; I think Let Over Lambda falls short of that goal, although it does contain enough interesting material to make the book worthwhile.

The best parts of the book are the chapter on read macros and the subsection on sorting networks. Great, practical examples and illustrations of good programming style.

The worst parts of the book are the chapter on implementing a Forth interpreter in Lisp and the "caddar accessor generator" (it's ok for me to say this because the name of this blog is ironic).

The chapter on anaphoric macros has finally made me change my mind about those things: it's ok, use them when you need them. All the stuff about "sub-lexical scoping" (i.e., various interesting ways of using macros to capture free variables) didn't really make a deep impression on me - maybe I'm just too dull to see any good uses for it.

As pointed out in other reviews, the book could have used a lot more proofreading (especially of the code) and editing. Hoyte chose to self-publish the book, which I think was a mistake (and just today Tim Ferriss blogged about some other reasons why it's not a good idea).

To cap the review, don't read Let Over Lambda until you've read On Lisp. It's a fun book about some interesting Common Lisp programming techniques, but it could have been shorter.

April 9, 2009

Closure-oriented metaprogramming via dynamically-scoped functions

Today I came across this post from the ll1 mailing list (almost 7 years old now, via Patrick Collison's blog) from Avi Bryant explaining how Smalltalk's message-based dispatch permits a type of metaprogramming with closures, as an alternative to macros.

Of course if you've read Pascal Costanza's Dynamically Scoped Functions as the Essence of AOP (and if you haven't, click the link and do it now; it's one of my favorite CS papers), you will realize that there is no need for message-based dispatch or any kind of object-oriented programming to do that. All we need are dynamically-scoped functions.

Here is how I approached the problem:

(defpackage "BAR"
(:use "COMMON-LISP")
(:shadow #:=))

(in-package "BAR")

(defmacro dflet1 ((fname &rest def) &body body)
(let ((old-f-def (gensym)))
`(let ((,old-f-def (symbol-function ',fname)))
(unwind-protect (progn (setf (symbol-function ',fname) (lambda ,@def))
,@body)
(setf (symbol-function ',fname) ,old-f-def)))))

(defmacro dflet* ((&rest decls) &body body)
(if decls
`(dflet1 ,(car decls)
(dflet* ,(cdr decls)
,@body))
`(progn ,@body)))

(defun first-name (x) (gnarly-accessor1 x))
(defun address-city (x) (gnarly-accessor2 x))
(defun = (&rest args) (apply 'common-lisp:= args))
(defmacro & (a b) `(block-and (lambda () ,a) (lambda () ,b)))
(defun block-and (a b) (when (funcall a) (funcall b)))

(defun some-predicate (x)
(& (= (first-name x) "John") (= (address-city x) "Austin")))

(defun make-parse-tree-from-predicate (predicate-thunk)
(dflet* ((first-name (x) '#:|firstName|)
(address-city (x) '#:|addressCity|)
(= (a b) `(= ,a ,b))
(block-and (a b) `(& ,(funcall a) ,(funcall b))))
(funcall predicate-thunk nil)))


Then (make-parse-tree-from-predicate #'some-predicate) yields (& (= #:|firstName| "John") (= #:|addressCity| "Austin")), which we can manipulate and then pass to a SQL query printer.
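
For instance, a toy printer for such trees (purely illustrative, and assuming the same package; real code would also escape the string literals):

(defun render-sql (tree)
  (if (atom tree)
      (if (stringp tree)
          (format nil "'~A'" tree) ; string literal
          (string tree))           ; column-name symbol
      (destructuring-bind (op lhs rhs) tree
        (format nil "(~A ~A ~A)"
                (render-sql lhs)
                (ecase op ((=) "=") ((&) "AND"))
                (render-sql rhs)))))

With that, (render-sql (make-parse-tree-from-predicate #'some-predicate)) yields "((firstName = 'John') AND (addressCity = 'Austin'))".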

Here I implemented dynamically-scoped functions using unwind-protect, which is not as powerful (or, possibly, efficient) as the implementation presented in Costanza's paper, but is simpler (I also used the same trick to implement dynamically-scoped variables in Parenscript).
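
The variable version of the trick looks something like this (a sketch - in Common Lisp, LET on a special variable already does this, which is exactly the luxury JavaScript lacks and Parenscript has to emulate):

(defmacro dlet1 ((var val) &body body)
  (let ((old (gensym)))
    `(let ((,old ,var))
       (setf ,var ,val)
       (unwind-protect (progn ,@body)
         (setf ,var ,old)))))

;; (defvar *indent* 0)
;; (dlet1 (*indent* 4) ...) ; *indent* is 4 in the body, restored on exit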

The ability of the same Lisp code to mean different things in different contexts is what Doug Hoyte calls duality of syntax in his excellent book Let Over Lambda (almost finished reading, promise to write a review soon). Lisp offers this property both at run-time (via late-binding and closures) and at macro-expansion time (via homoiconicity and the macro-expansion process itself).

Another technique from Let Over Lambda illustrated in the above code is the recursive macro. This one is a personal favorite of mine; I find that the iterative simplification that recursive macros express provides very clean and maintainable code.

This code also provides examples of the two problems that the closure-oriented metaprogramming approach encounters in Common Lisp:

The first is the fact that we had to shadow = in our package. Common Lisp forbids the redefinition of the functions, macros and special forms defined in the standard, so we have to go out of our way if we want to achieve that effect. Barry Margolin provided a rationale for this in a comp.lang.lisp post.

The second is the fact that Common Lisp has so many special forms and macros - and the and operator just happens to be one of them (which is why the code above defines & on top of the block-and function). Smalltalk avoids this problem by doing virtually everything via message passing and closures. In Common Lisp we don't have that straitjacket, but we also don't have the luxury of assuming that everything is an object or a closure.

Another Common Lisp feature that might break this example is function inlining (then again, I did just write about the benefits of late-binding...).

April 8, 2009

Masterminds of Programming

Today my copy of Masterminds of Programming arrived in the mail from O'Reilly; a small reward for giving a lightning talk at ILC.

The book is a series of interviews with programming language designers. Along with some expected atrocities like Stroustrup on C++ and Gosling on Java, and bizarre ones such as a 50-page interview with Jacobson, Rumbaugh and Booch on UML, there are interviews with Adin Falkoff on APL, Moore on Forth, Wall on Perl, and a few others. The functional camp is well-represented with SPJ, Hudak, Wadler, and Hughes interviewed about Haskell, and Milner giving an interview about ML.

It is telling that the book starts off, right in the preface, with an urban legend: "children can learn foreign languages much more easily than adults."

Some of the interviews are very revealing. The discussions present an entertaining window on the cavalier attitudes and biases of many programming language designers, which helps explain some of the dysfunction of the software world today. I don't think this is what the editors intended, but it makes for hilarious reading material.

Some of the interviews can be frustrating to read (every third question in Falkoff's APL interview seems to boil down to "lolz funny syntax"); thankfully this is balanced out by absolutely delightful ones such as Moore on Forth (IMO, the highlight of the book), and the Objective-C interview with Brad Cox and Tom Love. Overall the quality of the interviews varies widely, but not surprisingly mostly seems to correspond to the quality of the language being discussed.

Ultimately Masterminds of Programming is worthwhile not for its insights into programming language design (most of which unsurprisingly boil down to "oops I made a bunch of mistakes because I didn't start with a good model/think/know any better/know enough math"), but into programming and computing history in general.

To finish this review where it started off, here is another unintentionally amusing bit of insight from the preface:

Imagine that you are studying a foreign language and you don't know the name of an object. You can describe it with the words that you know, hoping someone will understand what you mean. Isn't this what we do every day with software?

It comes as no surprise that there is not a single entry for either "macro" or "metaprogramming" in the book's index (although Wadler does make a passing mention of Lisp macros in the Haskell interview).

November 22, 2008

Code

I put up some new code on the aptly titled section of my website:

http://vsedach.googlepages.com/code.html

Included are Jaro-Winkler and Levenshtein string similarity distance algorithms. Levenshtein is a general algorithm based on insertions/deletions/substitutions, while Jaro-Winkler is a tweaked metric specifically suited to short strings such as names. One area where the latter comes in handy is denormalizing manually entered records where, for example, salespersons' names may not be consistently entered. I found that Jaro-Winkler works best if you add the distances of the last name and the first name separately while giving the last name greater weight.
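
For reference, the core of the Levenshtein algorithm fits in a few lines (a single-row dynamic-programming sketch, not the exact code from the site):

(defun levenshtein (a b)
  (let* ((m (length b))
         (row (make-array (1+ m))))
    ;; ROW starts as the distances from "" to each prefix of B.
    (dotimes (j (1+ m)) (setf (aref row j) j))
    (loop for i from 1 to (length a)
          for diag = (1- i) ; row[j-1] from the previous row
          do (setf (aref row 0) i)
             (loop for j from 1 to m
                   for old = (aref row j)
                   do (setf (aref row j)
                            (min (1+ (aref row j))      ; deletion
                                 (1+ (aref row (1- j))) ; insertion
                                 (+ diag                ; substitution
                                    (if (char= (char a (1- i)) (char b (1- j)))
                                        0
                                        1))))
                      (setf diag old)))
    (aref row m)))

;; (levenshtein "kitten" "sitting") => 3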



Also included are implementations of sparse vectors and radix trees (which I blogged about before).

November 9, 2008

Compile-time intra-application URI link checking

Here is a neat hack I came up with to do compile-time intra-application link checking for a web application that I wrote. As you might expect, the mechanism is based on the eval-when facility of CL, but it also uses the *compile-file-pathname* and *load-pathname* variables to provide the names of the files where the offending links reside.

The ASDF definition of the application looks like:

(asdf:defsystem :cct
  :serial t
  :components ((:file "resource-definition")
               ;; other files
               ;; link checker (goes last)
               (:file "uri-reference-checker")))

Where resource-definition.lisp defines the page-definition and link-reference macros:

(in-package :cct)

(eval-when (:compile-toplevel :load-toplevel)
  (defparameter *defined-uri-list* ())
  (defparameter *referenced-uri-list* ()))

(defmacro/ps resolve-resource (resource-identifier)
  (pushnew (cons resource-identifier (or *compile-file-pathname* *load-pathname*))
           *referenced-uri-list*)
  (symbol-to-uri resource-identifier))

(set-dispatch-macro-character #\# #\/
  (lambda (stream subchar arg)
    (declare (ignore subchar arg))
    (let* ((base-uri (with-output-to-string (collector)
                       (loop until (member (peek-char nil stream nil #\Space t)
                                           '(#\Space #\Newline #\Tab #\? #\) #\{))
                             do (princ (read-char stream) collector))))
           (page (read-from-string base-uri)))
      `(concat-url (resolve-resource ,page)
                   ,@(uri-template:read-uri-template stream)))))

(defmacro concat-url (&rest fragments)
  `(format nil "~@{~A~}" ,@fragments))

(defpsmacro concat-url (&rest fragments)
  `(+ ,@fragments))

(defmacro define-page (page-name (&key parameters (default-request-type :both)) &body body)
  (flet ((process-parameter (p)
           (if (atom p) p (list (first p) :parameter-type (list 'quote (second p))))))
    `(progn
       (eval-when (:compile-toplevel :load-toplevel :execute)
         (pushnew '(,page-name) *defined-uri-list*))
       (define-easy-handler (,page-name :uri ,(symbol-to-uri page-name)
                                        :default-request-type ,default-request-type)
           ,(mapcar #'process-parameter parameters)
         (if (and ,@(mapcar (lambda (x) (if (atom x) x (car x))) parameters))
             (progn ,@body)
             (redirect "/cct"))))))

The link-reference mechanism is implemented as a macro character which builds on the uri-template facility. You certainly don't need to do it this way, but I find URI templates to be quite convenient. The only thing I don't like is the special-casing of the termination characters (whitespace, closing paren). If you know of a better way, please let me know.

Also note the defpsmacro - a Parenscript macro definition, which lets the same link-checking mechanism (and uri-template, which is Parenscript-compatible) work with Parenscript code; the Parenscript is compiled to JavaScript that can then construct compile-time-checked URIs dynamically in the browser.

uri-reference-checker.lisp runs after all the page definitions have been made and consists of:

(in-package :cct)

(eval-when (:compile-toplevel :load-toplevel)
  (dolist (unreferenced-uri (set-difference *referenced-uri-list* *defined-uri-list* :key #'car))
    (warn "Reference warning: referencing unknown URI resource ~a in file ~a"
          (car unreferenced-uri) (cdr unreferenced-uri))))

You could also provide warnings for defined URIs that have no references.
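
A sketch of that reverse check, mirroring the reference checker above (note the swapped arguments to set-difference; defined pages don't record a file, so the warning only names the resource):

(eval-when (:compile-toplevel :load-toplevel)
  (dolist (unused-uri (set-difference *defined-uri-list* *referenced-uri-list* :key #'car))
    (warn "Unused URI resource ~a is never referenced" (car unused-uri))))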



The actual application code then looks something like:

(loop for date in dates
      do (htm (:li (:a :href #/bank-rec-report?date={date} (str date)))))

This is a pattern that I think can be applied to most web applications.

September 30, 2007

Learning to Lisp

Answering Peter Seibel's appeal for a Google bombing, here are two links: one for a Lisp tutorial, the other for an illustrated Lisp tutorial. I am not providing these purely out of altruism, though. It's part of my insidious campaign of making Lisp programmers instead of trying to find them. Due to a particular set of circumstances that has developed recently, I'm putting together a team of web application developers that will do fun things with Hunchentoot and Parenscript. If you're in Calgary and would like to get involved, get in touch. If things go according to plan, we will be looking for people outside of Calgary in a few months as well.