Blog

Don't Spare the Low-Level Details

Abstractions are wonderful things, right up until they leak. At that point, I tend to wish someone hadn’t spared me the low-level details.

Recently, I was tasked with developing a system that continually logs temperature readings from 12 hotplates. The plates use an RS232 communication interface, which is very easy to negotiate.

With only those high-level details available, I declared that the logging software would be an “afternoon, in ’n’ out job” and created a design appropriate for that timescale and effort (sketched below):

  • Hard-code the 12 plates’ COM/ttyS port identifiers and output paths
  • Loop through each port/path pair
  • Send the GET_TEMPERATURE request
  • Write the response to the output file
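
Roughly, that design amounts to the sketch below. It assumes a pyserial-style serial interface; the port names, output paths, baud rate, and the exact GET_TEMPERATURE request format are placeholders for whatever the plates actually speak.

import time
import serial  # pyserial

# Assumption: hard-coded (port, output file) pairs; real values depend on the setup.
PLATES = [
    ("/dev/ttyS0", "plate01.log"),
    ("/dev/ttyS1", "plate02.log"),
    # ... one entry per hotplate, 12 in total
]

while True:
    for port, path in PLATES:
        # Open the plate's RS232 port, request a reading, and append it to the log.
        with serial.Serial(port, baudrate=9600, timeout=2) as plate:
            plate.write(b"GET_TEMPERATURE\r\n")  # placeholder request format
            reading = plate.readline().decode(errors="replace").strip()
        with open(path, "a") as out:
            out.write(reading + "\n")
    time.sleep(60)  # one reading per plate per minute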

Job done.

Port identifiers can change.

Wait, who are you?


Turn any Long-Running Command-Line Application into a FIFO Server

I’ve been using a web scraper—named scrape-site, for the sake of this blog post—that takes around 5 minutes to recursively scrape a website. During one of my scrape sessions, I’ll continually look for more sites to scrape. Because it would be annoying to wait, I’d like to be able to immediately queue any site I find; however, scrape-site is just a plain-old command-line application. It wasn’t designed to support queueing.

If scrape-site were a UI-driven commercial product, I’d be furiously writing emails of displeasure to its developers: what an oversight to forget a queueing feature! Luckily, though, the fact that scrape-site is only a single-purpose console application is its biggest strength: it means we can implement the feature ourselves.
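
The whole trick is to put a named pipe (FIFO) in front of the application: the pipe acts as the queue, and a small wrapper feeds each submitted URL to scrape-site one at a time. Below is a minimal sketch of the idea; the pipe’s path is an arbitrary choice, and I’m assuming scrape-site takes the target URL as its only argument.

import os
import subprocess

QUEUE_PATH = "/tmp/scrape-queue"  # assumption: any writable path will do

# Create the named pipe if it doesn't already exist.
if not os.path.exists(QUEUE_PATH):
    os.mkfifo(QUEUE_PATH)

while True:
    # Opening a FIFO for reading blocks until a writer connects.
    with open(QUEUE_PATH) as queue:
        for line in queue:
            url = line.strip()
            if url:
                # Run one scrape at a time; later submissions wait in the pipe.
                subprocess.run(["scrape-site", url])

Queueing a site is then just a write to the pipe (echo a URL into /tmp/scrape-queue from any shell); the write returns immediately while the wrapper works through the backlog.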


IECapt for Corporate Website Slideshows

Big companies tend to use a variety of webapps to show their news, stats, and announcements. Some locations in the company—usually, the tearoom—might contain displays showing some of that material. A clean way to automate these displays might be to use a relevant API or script for each webapp. However, this assumes two things: the site has an API, and you have enough time to use it.

Another approach is to automatically take screenshots of the webapp. A little dirtier, but much easier to implement and much more amenable to change. Here, I’ve written up that approach.
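
The core of that write-up is just a loop that shells out to IECapt for each webapp page and drops the screenshots where a basic slideshow viewer can cycle through them. Here’s a rough sketch, assuming IECapt’s usual --url/--out invocation; the URLs, filenames, and refresh interval are placeholders.

import subprocess
import time

# Assumption: placeholder dashboard pages and output image names.
PAGES = {
    "news.png": "https://intranet.example.com/news",
    "stats.png": "https://intranet.example.com/stats",
}

while True:
    for filename, url in PAGES.items():
        # IECapt renders the page in Internet Explorer and writes a screenshot.
        subprocess.run(["IECapt", f"--url={url}", f"--out={filename}"])
    time.sleep(300)  # refresh the slideshow's images every five minutes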


Netcat: The best tool for playful backdooring

Just because it made me giggle so much, I thought I’d write up a classic shell prank: pumping messages into someone else’s shell terminal.

If you can ssh into the computer, then you can write messages to other users’ terminals with wall:

adams-ssh-account@computer $ echo "You suck!" | wall
target-user@computer $

Broadcast Message from adams-ssh-account@computer
        (/dev/pts/1) at 21:48 ...

You suck!

However, that’s making it far too easy for the target: the message itself gives the game away! You’d also need an account on the target computer, which means getting sudo access to create one. People might leave their computers unlocked, but it’s unlikely their session will have root access (at least, not without you knowing their password).


Fakeonium

I work with large research data systems. One of those systems—let’s call it Choogle, for the sake of this post—is nearly two decades old, which is practically forever in the IT world and impressive in its own right. Choogle has been around for so long that much of the lab’s analysis equipment is tightly integrated with it. For example, a researcher can enter a Choogle ID into an analysis instrument to automatically link their analysis with the sample’s history. This is neat, provided the researcher has made Choogle a central component of their workflow.

From a top-down viewpoint, having researchers submit their samples’ information to Choogle is a better situation than each researcher keeping a collection of loosely formatted lab notes. Designing lab equipment to require Choogle is a way of encouraging that conversion, which is the intention.

What happens, though, if researchers don’t particularly want to use Choogle? Maybe they’ve already adopted a similar (non-Choogle) research system, or maybe they just don’t like the UI. When those researchers want NMR plots, the Choogle requirement becomes a barrier.


Pretty Molecules

I have created a few scientific journal covers and renders. Eager to make their own designs, a few colleagues asked me about the process. It’s not a particularly fancy process; it was refined from the designs I’ve done over the years, and it attempts to go “from nothing to done” while accounting for changing requirements, rollbacks, and tweaks.


Complicated HTTP APIs

I occasionally have to write HTTP clients for third-party APIs. One thing that really bugs me is when an otherwise useful HTTP API layers custom authentication and encryption on top. Custom encryption is especially annoying when SSL could’ve been used instead.


Card bingo

This writeup illustrates how almost anything can become a project if you read into it too much.

A few weeks ago, I was in the pub playing a very simple bingo game. The game works as follows:

  • Each player receives two random and unique playing cards from each suit to form a hand of eight cards
  • A host sequentially draws cards from a randomly shuffled deck, announcing each card as it is drawn
  • The first player to have all eight of their cards announced by the host wins

I was terrible at the pub quiz, so I decided to focus my mental efforts on two seemingly simple questions about the game, which eventually led to me getting ahead of myself:

  • What’s the probability of having a “perfect” game, i.e. a game where you win after the 8th card is announced?

  • In a game containing n players, how many cards are called out by the announcer before someone wins?

I thought I’d get my answer in under ten minutes but it took a little longer than that.
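
As a sanity check, both questions can be brute-forced in a few lines. The sketch below assumes the host draws from a full, separate 52-card deck and, for simplicity, that each player’s hand is dealt independently (so two players may hold the same card). Under those assumptions, a “perfect” game just means your specific eight cards are the first eight drawn, so its probability is 1 in C(52, 8); the second question is estimated with a Monte Carlo simulation.

import random
from math import comb
from statistics import mean

SUITS, RANKS = 4, 13

# Question 1: a "perfect" game means your specific 8 cards are the first 8 drawn.
print(f"P(perfect game) = 1/{comb(52, 8)} = {1 / comb(52, 8):.2e}")

def deal_hand(rng):
    # Two distinct ranks from each of the four suits -> a hand of eight cards.
    return {(suit, rank) for suit in range(SUITS)
            for rank in rng.sample(range(RANKS), 2)}

def calls_until_win(n_players, rng):
    # Assumption: hands are dealt independently, so players may share cards.
    hands = [deal_hand(rng) for _ in range(n_players)]
    deck = [(s, r) for s in range(SUITS) for r in range(RANKS)]
    rng.shuffle(deck)
    remaining = [len(hand) for hand in hands]
    for call, card in enumerate(deck, start=1):
        for i, hand in enumerate(hands):
            if card in hand:
                remaining[i] -= 1
                if remaining[i] == 0:
                    return call
    raise AssertionError("every hand comes from the deck, so someone must win")

rng = random.Random(0)
for n in (1, 2, 4, 8):
    trials = [calls_until_win(n, rng) for _ in range(20_000)]
    print(f"{n} player(s): mean calls before a winner = {mean(trials):.1f}")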


My future self appreciates a simple codebase

At the moment, I regularly have to develop Treegrid UI components. It’s been quite a lesson in API design, and it’s made me realize that design patterns exist to create great architecture, not a great API.

My recent work focuses on making treegrid components work with Crown’s backend. The backend contains a variety of data structures which, while quite diverse, need to be manipulated through a common programmatic interface. To that end, I designed and implemented an adaptor, DDSTreeGrid, around EasyUI’s Treegrid component and a declarative data representation of our backend views (view).


Language Agnosticism

I initially learnt javascript because I was desperate to have marquee effects on my Microsoft FrontPage website, actionscript to build menus in a basic flash game I tried to make, C++ for a half-life mod, and so on.

Jobs seem a little more focused than my approach was. When I was jobhunting, most programming job postings were language- or framework-centric. They weren’t looking for someone generally experienced in full-stack web development; they wanted someone who specifically had at least two years of angularjs experience, or specifically had Rails4 JSON API coding experience. I’m guessing this is a consequence of reality: commercially established applications are architected on—and have accrued technical debt in—a particular language or framework.