3D printing and other software curiosities
by Clinton Freeman

BK Precision 1550 Review

02 Aug 2014

So for a long while, powering prototypes was a bit of a chore for me. I was constantly wrestling a desktop Kraken of cables: a giant mess crawling out of the umbilical of an old ATX computer power supply, plus a couple of wall-wart style USB chargers (for powering Arduinos, phones or Raspberry Pis).

The BK Precision 1550 is a low-cost benchtop power supply capable of generating 1 to 36 volts, with a maximum output of 3 amps. It retails for about $135 on Amazon. However, people outside the U.S. need to be a little careful: the standard model only accepts 100-120V. There is a 220V model (compatible with the mains voltage here in Australia), you just need to hunt around for it; I picked mine up from Mouser Electronics.

Picture of BK Precision 1550 power supply.

I was pleasantly surprised when I unpacked the 1550: it has a really small footprint and is incredibly light. The industrial design of the faceplate is a direct riff on a black iPod Classic, featuring click-wheel inspired control buttons and a large backlit LCD screen.

However, just about everywhere else the BK Precision 1550 is good enough, yet agonisingly falls shy of being an amazing power supply. The team at BK had some great ideas with this one, targeting the maker/hobbyist/artist/hacker market, but it is ultimately held back by little details and niggles.

For example, the USB charging port drew my attention first; hopefully I would be able to ditch a few of those wall warts. Unfortunately it has a maximum output of 0.5 amps. Enough to charge a phone or run an Arduino, but not enough for a Raspberry Pi or BeagleBone Black. I would be doing my best Jim Cramer 'buy' impersonation right now if BK had sprung for that extra amp and brought the output of the USB charger to 1.5 amps. Running a Pi off the charging port, and simultaneously being able to power a 12 volt circuit the Pi was controlling? Yes please.

BK also didn't include any display of the current going out on the charger port. No big deal really, and it certainly won't stop me from using the USB charging port. But getting a breakdown, and being able to see total power usage across everything? Hells yes.

For a cheap power supply, it is also accurate enough at +/- 50mV. However, the controls only allow 100mV increments, and the LCD display rounds the voltage going out on the banana plugs. This effectively made it impossible for me to configure the 1550 to read 12.0 volts. Set it to 11.9, and the old Fluke gave a readout of 11.94V. Cool. Press the + button once to add 100mV, and the unit read 12.1 while the Fluke read 12.05 volts. No matter what I did, I couldn't get the 1550 to land on 12.0V. The outputs at the 11.9 and 12.1 settings were close enough to 12 volts for my purposes, but it did trip my OCD out a little not being able to get the display to read what I wanted.

Rather than rounding the displayed output voltage, BK could have used the floor (or ceiling) value instead. This would create a greater feeling of control over the output, letting you hit every 100mV increment between 1 and 36V on the display with no appreciable loss in accuracy. There is still 12.05 volts coming out, totally close enough to the 12.0 I am trying to get out of the unit. Why not let the LCD display say: hey, this is 12.0 volts... ish?
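For the curious, here is a quick sketch of the difference (written in Go purely for illustration; the two voltages are the readings my Fluke reported at the 11.9 and 12.1 settings):

package main

import (
	"fmt"
	"math"
)

func main() {
	// The actual outputs at the two settings either side of 12V.
	for _, v := range []float64{11.94, 12.05} {
		rounded := math.Round(v*10) / 10 // what the 1550 appears to do
		floored := math.Floor(v*10) / 10 // what I am suggesting instead
		fmt.Printf("output %.2fV -> rounds to %.1fV, floors to %.1fV\n", v, rounded, floored)
	}
}

With flooring, the 12.05 volt output gets the 12.0 label that rounding can never produce.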

These are tiny details that don't make this a bad power supply at all. It is a solid unit that is vastly better than the Kraken of crap that was sitting on my bench. Overall? 3.5/5.

3.5 stars out of 5.

 

How accurate are Estimote iBeacons?

07 Jul 2014

Hardware for my next project has started to arrive, and last week a preview kit of Estimote iBeacons landed on my doorstep. Packed like a fine box of Belgian chocolates, the Polish-made iBeacons looked delicious.

Picture of estimote ibeacons in original packaging.

In site-specific theatre performances, I often use virtual beacons: I nominate the coordinates of a target location in software (often to trigger audio or video) and then use GPS to work out the proximity of the participant to the virtual beacons. Naturally, using GPS in the measurements means it is not very accurate, and the proximity calculations could be out by as far as 15 metres.
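The proximity check itself is nothing exotic. Here is a minimal sketch of the idea in Go (the coordinates and the 15 metre trigger radius are hypothetical, not code from an actual show):

package main

import (
	"fmt"
	"math"
)

// haversine returns the great-circle distance in metres between two
// latitude/longitude coordinates.
func haversine(lat1, lon1, lat2, lon2 float64) float64 {
	const earthRadius = 6371000.0 // metres
	toRad := func(d float64) float64 { return d * math.Pi / 180.0 }

	dLat := toRad(lat2 - lat1)
	dLon := toRad(lon2 - lon1)
	a := math.Sin(dLat/2)*math.Sin(dLat/2) +
		math.Cos(toRad(lat1))*math.Cos(toRad(lat2))*math.Sin(dLon/2)*math.Sin(dLon/2)

	return 2 * earthRadius * math.Atan2(math.Sqrt(a), math.Sqrt(1-a))
}

func main() {
	// A virtual beacon, and the participant's latest GPS fix.
	beaconLat, beaconLon := -27.4698, 153.0251
	userLat, userLon := -27.4699, 153.0252

	if haversine(userLat, userLon, beaconLat, beaconLon) < 15.0 {
		fmt.Println("participant near the virtual beacon, trigger the audio")
	}
}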

To test how the Estimotes stacked up, I wrote a little ranging test app with their SDK so I could measure the distance between a phone and one of their iBeacons.

I laid out a tape measure, put my phone (a Sony Xperia M2) at one end, and the beacon at the 1 metre mark. I took 30 distance readings from the test app, then repeated the process at 1 metre increments until I ran out of tape measure. *

Picture estimote, tape measure and phone in experimental setting.

I then used a common GPS accuracy measure, the Distance Root Mean Squared (DRMS), to work out the ranging accuracy. If you are after the raw measurements and stats, you can check out the full dataset and calculations here.
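For a single set of readings taken at a known distance, the DRMS boils down to the root mean squared ranging error. A minimal sketch of the calculation in Go (the sample readings are made up; see the linked dataset for the real ones):

package main

import (
	"fmt"
	"math"
)

// drms returns the Distance Root Mean Squared error for a set of
// range readings taken at a known actual distance.
func drms(readings []float64, actual float64) float64 {
	var sum float64
	for _, r := range readings {
		e := r - actual
		sum += e * e
	}
	return math.Sqrt(sum / float64(len(readings)))
}

func main() {
	// Hypothetical readings taken 1 metre from the beacon.
	readings := []float64{0.9, 1.1, 1.0, 0.95, 1.08}
	fmt.Printf("DRMS: +/- %.1fm\n", drms(readings, 1.0))
}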

As hinted by the Estimote documentation, I was expecting precision to drop as distance between the phone and the beacon increased.

Chart showing estimote error envelope and how it changes over distance.

Which is exactly what I got: the further away (and the weaker the Bluetooth signal), the less accurate the distance measurement became. The above results are probably a 'best case' scenario as well; I imagine obstacles and anything else that interferes with the Bluetooth signal would lead to an even more radical drop-off in accuracy. You can see that at 8 metres the Estimote gave a mean reading of about 7 metres, but could be off by as much as 2 metres, with the calculated distance varying from 5 to 9 metres.

Estimote DRMS Results

Actual Distance    Mean Measured Distance    DRMS
1.0m               1.0m                      +/- 0.1m
2.0m               2.2m                      +/- 1.1m
3.0m               3.2m                      +/- 0.9m
4.0m               3.3m                      +/- 0.9m
5.0m               3.6m                      +/- 1.6m
6.0m               6.7m                      +/- 2.3m
7.0m               6.0m                      +/- 1.7m
8.0m               7.1m                      +/- 2.0m

What was startling was how accurate the Estimote is at very close range: around 10cm when a metre from the iBeacon. And even being accurate to within 2 metres at a decent distance from the iBeacon is vastly better than what I was getting with GPS and virtual beacons. So I'm definitely going to continue pushing ahead with the Estimote on the next project.

* The broadcast power for the Estimote iBeacon was left at the factory default (Weak, -12 dBm), which is good to about 15m. I only had 8m of tape measure, so it seemed like a pretty good place to start.

 

What are the best SD cards to use in a Raspberry Pi?

02 Jul 2014

SD cards have a limited life, and the more you read and write to them, the shorter their lifespan. In a Raspberry Pi this makes things a little tricky, the SD card gets a much tougher workout than it normally would in something like a digital camera.

A photo of a SanDisk extreme SD card installed in a Raspberry Pi.

Up until now, I had always just picked up an SD card that was sold bundled with a Raspberry Pi. Unfortunately these SD cards are often low quality and don't last very long, with some failing in as little as a month.

So now I get SD cards separately, and only ones that feature wear levelling. The cheap SD cards don't have any wear levelling, and the Raspberry Pi gets into situations where certain areas of the SD card get written to over and over again until they wear out and fail. The Pi then comes along, tries to use the same worn-out area again, and promptly chokes. All despite other areas of the SD card being hardly (or never) used at all!

The more expensive cards with wear levelling won't just keep pummelling the same spot on the disk over and over again. Instead, they will try to spread wear out over the whole disk. A little like rotating the tyres on a car, wear levelling ensures that each part of the disk decays at about the same rate.

An Illustration showing difference in SD card deterioration with and without wear levelling

I also get SD cards that have way more space than I need, at least 8GB. This further increases the longevity of the card by increasing the total ‘surface area’ that will eventually wear away. With wear levelling, more free space means a longer lasting SD card.
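To get a feel for why wear levelling and extra free space both matter, here is a toy simulation in Go (the block counts and the 10,000 write endurance figure are made-up numbers for illustration; real cards vary):

package main

import (
	"fmt"
	"math/rand"
)

// writesUntilFirstFailure models an SD card where each block survives
// `endurance` writes. Without wear levelling, every write hammers the
// same hot block; with it, writes are spread across all blocks.
func writesUntilFirstFailure(blocks, endurance int, levelled bool) int {
	wear := make([]int, blocks)
	for total := 0; ; total++ {
		b := 0 // the hot block
		if levelled {
			b = rand.Intn(blocks) // spread the wear around
		}
		wear[b]++
		if wear[b] >= endurance {
			return total
		}
	}
}

func main() {
	const endurance = 10000
	fmt.Println("no levelling, 1000 blocks:", writesUntilFirstFailure(1000, endurance, false))
	fmt.Println("levelled, 1000 blocks:", writesUntilFirstFailure(1000, endurance, true))
	fmt.Println("levelled, 2000 blocks:", writesUntilFirstFailure(2000, endurance, true))
}

The levelled card survives orders of magnitude more writes before its first block dies, and doubling the number of blocks roughly doubles that again.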

The following SD-Cards all feature some form of wear levelling and won’t break the piggy bank:

However, if you have more than $150 to burn, you can reach for the very top shelf and have a browse around Panasonic's industrial grade SD cards (some feature RAID for even greater data protection).

The impressive range and features of the Panasonic industrial line have me a bit smitten with the Panasonic gold series. Some of that industrial goodness has to be rubbing off on the top end of their consumer line, right?

For a host of other tips and tricks on how to extend the life of an SD card inside a Raspberry Pi, see this excellent thread on StackExchange.

 

How to Bootstrap a Raspberry Pi from your Laptop

02 Jun 2014

Raspberry Pis are cheap, and I really like the Raspbian operating system: I can work in just about any programming language I want, and I can configure it from my laptop with standard networking gear. No extra keyboards, monitors or serial adapters needed.

Jack a cable from the Pi's ethernet port straight into your network router, insert an SD card prepared with Raspbian, and power up the Raspberry Pi. 98% of the time your network will automatically assign the Pi an IP address (the other 2% being cases where you have explicitly configured your network otherwise).

Picture of Raspberry Pi plugged into ethernet

Next, fire up a network analysis tool and find the address that was automatically assigned to the Pi. I use the excellent Angry IP Scanner (open source) to scan my network and get an address list of all the devices that are currently connected.

screenshot of Angry IP scanner
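If you would rather not install anything, the same discovery can be done with a few lines of Go: a quick-and-dirty scan that tries the SSH port on every address in the subnet (the 10.1.1.x range matches my network; adjust to suit yours):

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Try to open the SSH port on every address in the subnet.
	// Anything that answers is worth attempting to log into.
	for i := 1; i < 255; i++ {
		addr := fmt.Sprintf("10.1.1.%d:22", i)
		conn, err := net.DialTimeout("tcp", addr, 50*time.Millisecond)
		if err != nil {
			continue
		}
		conn.Close()
		fmt.Println("ssh listening on", addr)
	}
}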

Now that you know the address of your Pi, you can crank open a terminal and SSH into it with the default credentials that come with Raspbian:

laptop$ ssh pi@10.1.1.5

(The default password is raspberry)

Now that we are in, the Raspberry Pi can be customised for your project. I upload and run a little script that installs the tools I use (OpenCV, vim and Go), ensures the operating system is set up to handle regular USB webcams, and finally configures the timezone.

#!/usr/bin/env bash

# Update package lists and upgrade any installed packages.
sudo apt-get update
sudo apt-get upgrade -y

# Install opencv & vim.
sudo apt-get install -y libopencv-dev
sudo apt-get install -y vim

# Ensure webcam module is loaded with necessary settings.
sudo rmmod uvcvideo
sudo modprobe uvcvideo nodrop=1 timeout=5000 quirks=0x80

# Download golang
if [ ! -f go1.2.2.linux-arm~multiarch-armv6-1.tar.gz ]; then
	wget http://dave.cheney.net/paste/go1.2.2.linux-arm~multiarch-armv6-1.tar.gz
fi

# Install golang
sudo tar -C /usr/local -xzf go1.2.2.linux-arm~multiarch-armv6-1.tar.gz

# Add Go to the PATH for future logins. Single quotes stop $PATH
# expanding now, while the script is running.
echo 'export PATH=$PATH:/usr/local/go/bin' >> /home/pi/.bashrc

# Configure the local timezone.
sudo bash -c 'echo "Australia/Brisbane" > /etc/timezone'
sudo dpkg-reconfigure -f noninteractive tzdata

Often I will tweak this base configuration with some project-specific changes, customising the networking setup and tweaking /etc/rc.local to run a custom application when the Raspberry Pi is booted.

Uploading the bootstrapping script from your laptop is pretty straightforward with scp:

laptop$ scp bootstrap.sh pi@10.1.1.5:.

Running the script on the Pi is simply a case of marking it executable and kicking it off:

pi@raspberrypi$ chmod +x bootstrap.sh
pi@raspberrypi$ ./bootstrap.sh

You can pick up a Raspberry Pi from Amazon, and coupled with an Arduino it is a very flexible platform to power all those creative computing projects you are cooking up.

 

Non-blocking control of stepper motors on Arduino

29 Apr 2014

Stepper motors are ideal for 3D printers, robots, mills and lathes; you can program them to rotate by very precise amounts. Push the right signal (“I will have 36 degrees please”) into the motor driver and it will spin or ‘step’ by the nominated amount.

Wait! I know I just said signal, but this won't devolve into a mind-melting Fourier Transform nightmare, I promise. In fact, drivers such as the 'EasyDriver' by Sparkfun, or the 'A4988' by Pololu, have a step pin that can be wired to a digital pin on an Arduino. This pin can be used to 'twitch' a stepper motor a single step at a time. Just set the pin high, and your stepper motor will rotate by either 1.8 or 0.9 degrees (depending on the step size of your stepper motor).

image

The code for this is pretty straightforward:

#define STEP_PIN 3

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  // The driver advances the motor one step on this rising edge.
  digitalWrite(STEP_PIN, HIGH);
}

void loop() {
}

To smoothly drive a stepper motor, just set the pin high, wait a little while, set it back low again and repeat, like so:

#define STEP_PIN 3

void setup() {
    pinMode(STEP_PIN, OUTPUT);
}

void step(long stepDelay) {
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(stepDelay);
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(stepDelay);
}

void loop() {
    // rotate by 100 steps.                
    for (int i = 0; i < 100; i++) {
        step(200);    
    }
}

By the way, you just generated a signal: a 2.5kHz square wave (200 microseconds high plus 200 microseconds low gives a 400 microsecond period). It looks like this:

image

To spin the motor faster, just decrease the delay between the high and low pulses, i.e. change step(200) to step(80). This increases the frequency of the square wave (to 6.25kHz) and the driver spins the motor faster. To spin the stepper motor slower, just increase the delay between the high and low pulses, i.e. change step(200) to step(400). This decreases the frequency (to 1.25kHz) and the driver spins the motor slower.

image
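Picking the delay for a target speed is simple arithmetic: the step rate is 1 / (2 x delay), and a 1.8 degree motor takes 200 steps per revolution. A quick sketch of the sums (in Go, purely to show the numbers; the RPM figure is arbitrary):

package main

import "fmt"

// stepDelayMicros returns the half-period delay (in microseconds) to
// pass to step() for a target speed, given the motor's steps per
// revolution (200 for 1.8 degree steps, 400 for 0.9 degree steps).
func stepDelayMicros(rpm float64, stepsPerRev int) float64 {
	stepsPerSecond := rpm / 60.0 * float64(stepsPerRev)
	return 1e6 / (2.0 * stepsPerSecond)
}

func main() {
	// A 1.8 degree motor at 375 RPM works out to step(400).
	fmt.Printf("delay: %.0f microseconds\n", stepDelayMicros(375, 200))
}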

The problem with the example code above is that it blocks: your Arduino can't do anything else while it is generating the signal for the stepper driver. You have to wait until the motor has rotated by the desired amount before your Arduino is free to do something else. This is usually fine for small movements, but becomes a bigger problem over longer distances.

I ran into this longer-distance problem when building a robotic control system for an artwork that Keith Armstrong was creating. The piece features a linear rail (a few metres long) and a drive system powered by a stepper motor, allowing the robot to crawl from one end to the other. It takes a few minutes for the robot to complete its end-to-end journey, and all the while it needs to be doing other things.

image

For my first attempt at non-blocking stepper control, I shifted the step control directly into the Arduino loop, and wrapped it with a check to see if we had reached the desired location. I did the other things that needed to get done in the same main loop, a little bit like this:

#define STEP_PIN 3

void setup() {
    pinMode(STEP_PIN, OUTPUT);
}

void step(long stepDelay) {
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(stepDelay);
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(stepDelay);
}

void loop() {
    DoOtherStuff();

    if (!thereYet()) {
        step(200);
    }
}

It was a clunky disaster. The motor rattled and drove very poorly. The problem was that 'DoOtherStuff' took a variable amount of time to complete; sometimes it would take 183 microseconds, sometimes 191. It doesn't sound (or look) like a lot, but those subtle timing differences resulted in a noisy square wave (see below), which was enough to make the motor rattle.

image

The trick to solving this problem was to 'hide' the computation of the other stuff in part of the wave. Rather than just calling delayMicroseconds and wasting ATmega cycles, I put them to good use executing 'DoOtherStuff'. I then added some padding to ensure that the low part of the wave was a consistent length each time.

#define STEP_PIN 3

unsigned long t = 0;

void setup() {
    pinMode(STEP_PIN, OUTPUT);
}

void step(long stepDelay) {
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(stepDelay);
    digitalWrite(STEP_PIN, LOW);

    // Pad out the low part of the wave so the total period stays a
    // consistent (2 * stepDelay), however long DoOtherStuff took.
    unsigned long dt = micros() - t;
    if (dt < (unsigned long)(2 * stepDelay)) {
      delayMicroseconds((2 * stepDelay) - dt);
    }

    t = micros();
}

void loop() {
    DoOtherStuff();

    if (!thereYet()) {
        step(200);
    }
}

image

The only downside was that this imposed a limit on how fast the motor could be driven. The time it took for 'DoOtherStuff' to run had to 'fit' inside the square wave, meaning the frequency couldn't be any higher than the computation allowed.

The end result was a much smoother drive for the stepper motor, and an Arduino sketch that could also do other things at the same time.

The full example of non-blocking stepper control can be found on GitHub.

EDIT: Also found a post on driving stepper motors directly from a Raspberry Pi. It would be interesting to see how it compares with the Arduino.

 

Image Stabilised Motion Detection

28 Jan 2014

The Gasworks project is an interactive art installation I've been involved with, which loosely mimics brain cells as clusters of lights. Webcams are used to detect motion and organically alter the lighting sequences of ten different sculptures (or neurones), each suspended on steel cabling above a public amphitheatre.

image

Artist Michael Candy wanted the installation and lighting sequence to look as analog as possible, with the whole thing reacting according to the speed of the movements detected. Unfortunately this ruled out the simple PIR (passive infrared) sensors typically found in security systems; these have a single output pin that is either off (no motion) or on (motion detected), and weren't capable of giving us any insight into the amount of activity associated with any detected motion.

I eventually settled on using a webcam and a computer vision algorithm called optical flow, a technique also found in optical computer mice. I used the implementation found in OpenCV, which was really easy to integrate into Golang with cgo.

Optical flow returns an array of vectors, one for each pixel captured by the webcam. The magnitude and direction of each vector indicate how that pixel 'moved' compared to the previous frame in the video stream.

image

To calculate the magnitude of a detected movement, I simply summed all the movement vectors that came out of the optical flow algorithm, calculated the overall length (magnitude), and scaled it down so that frames with loads of movement had an 'energy' of 0.1 while frames with no movement had an 'energy' of 0.0.

func calcDeltaEnergy(flow *C.IplImage, config *Configuration) float64 {
    var i C.int
    var dx, dy float64

    // Accumulate the change in flow across all the pixels.
    totalPixels := flow.width * flow.height
    for i = 0; i < totalPixels; i++ {
            value := C.cvGet2D(unsafe.Pointer(flow), i/flow.width, i%flow.width)
            dx += math.Abs(float64(value.val[0]))
            dy += math.Abs(float64(value.val[1]))
    }

    // average out the magnitude of dx and dy across the whole image.
    dx = dx / float64(totalPixels)
    dy = dy / float64(totalPixels)

    // The magnitude of accumulated flow forms our change in energy for the frame.
    deltaE := math.Sqrt((dx * dx) + (dy * dy))
    fmt.Printf("INFO: f[%f] \n", deltaE)

    // Clamp the energy to start at 0 for 'still' frames with little/no motion.
    deltaE = math.Max(0.0, (deltaE - config.MovementThreshold))

    // Scale the flow to be less than 0.1 for 'active' frames with lots of motion.
    deltaE = deltaE / config.OpticalFlowScale

    return deltaE
}

It was here that we ran into a little problem. The sculptures are suspended on steel cable rigging, and sway in the wind. The algorithm was getting confused: a gentle sway in the wind would be falsely detected as people moving around, changing the lighting sequence.

I first tried sticking an accelerometer to the webcam and using its readings to compensate for when the camera was swaying in the wind. This turned out to be a 'Bad Idea'™, mostly because of the latency between getting accelerometer readings and matching them with the right frame of video. It also added a considerable amount of complexity; needless to say, everyone was relieved when I worked out a software approach that didn't need any additional hardware.

I realised that when the webcams and sculptures are still, only parts of the image have vectors indicating detected motion. However, when the sculptures and cameras are swaying in the wind, the whole image has vectors, indicating a general trend: the direction in which the camera is moving.

image

To work out the general direction in which the camera was moving, and to image-stabilise the optical flow algorithm, I worked out the mean movement vector for the frame and subtracted it from each movement vector (clamping at zero).

func calcDeltaEnergy(flow *C.IplImage, config *Configuration) float64 {
    var i C.int
    var dx, dy, mx, my float64

    totalPixels := flow.width * flow.height

    // Determine mean movement vector.
    for i = 0; i < totalPixels; i++ {
            value := C.cvGet2D(unsafe.Pointer(flow), i/flow.width, i%flow.width)
            mx += float64(value.val[0])
            my += float64(value.val[1])
    }
    mx = math.Abs(mx / float64(totalPixels))
    my = math.Abs(my / float64(totalPixels))

    // Accumulate the change in flow across all the pixels.
    for i = 0; i < totalPixels; i++ {
            // Remove the mean movement vector to compensate for a sculpture that might be swaying in the wind.
            value := C.cvGet2D(unsafe.Pointer(flow), i/flow.width, i%flow.width)
            dx += math.Max((math.Abs(float64(value.val[0])) - mx), 0.0)
            dy += math.Max((math.Abs(float64(value.val[1])) - my), 0.0)
    }

    // average out the magnitude of dx and dy across the whole image.
    dx = dx / float64(totalPixels)
    dy = dy / float64(totalPixels)

    // The magnitude of accumulated flow forms our change in energy for the frame.
    deltaE := math.Sqrt((dx * dx) + (dy * dy))
    fmt.Printf("INFO: f:%f m:[%f,%f]\n", deltaE, mx, my)

    // Clamp the energy to start at 0 for 'still' frames with little/no motion.
    deltaE = math.Max(0.0, (deltaE - config.MovementThreshold))

    // Scale the flow to be less than 0.1 for 'active' frames with lots of motion.
    deltaE = deltaE / config.OpticalFlowScale

    return deltaE
}

It took a bit of tweaking, but in the end the stabilising approach worked great and compensates for all but the most violent gusts of wind. The structural engineers have predicted the sculptures will experience 80cm of lateral movement in 100km/h wind gusts (a 1-in-5-year storm event). I'm actually really keen to see how the sculptures sense and react to a big subtropical storm; I reckon it would be a pretty awesome light show!

 

Using Golang to connect Raspberry Pis and Arduinos over serial

12 Jan 2014

The code running on the Raspberry Pis within the Gasworks project (an art installation that loosely mimics brain cells as clusters of lights) is all written in Golang, while the hardware architecture has each neurone's Raspberry Pi sending commands to an Arduino over serial. This communication link was one of the first things I prototyped for the project.

image

The venerable Dave Cheney maintains unofficial ARM builds of Go that are compatible with the Raspberry Pi. So the first step is to grab one of those and follow along with the Golang installation instructions.

For serial communication I used the huin fork of the goserial library, mainly because the code had a far more idiomatic Go style than the others I looked at.

Opening up a connection to the Arduino is a case of hunting around for the USB device that is most likely the Arduino:

package main

import (
	"github.com/huin/goserial"
	"io/ioutil"
	"log"
	"strings"
)

// findArduino looks for the file that represents the Arduino
// serial connection. Returns the fully qualified path to the
// device if we are able to find a likely candidate for an
// Arduino, otherwise an empty string if unable to find
// something that 'looks' like an Arduino device.
func findArduino() string {
	contents, _ := ioutil.ReadDir("/dev")

	// Look for what is most likely the Arduino device
	for _, f := range contents {
		if strings.Contains(f.Name(), "tty.usbserial") ||
			strings.Contains(f.Name(), "ttyUSB") {
			return "/dev/" + f.Name()
		}
	}

	// Have not been able to find a USB device that 'looks'
	// like an Arduino.
	return ""
}

func main() {
	// Find the device that represents the arduino serial
	// connection.
	c := &goserial.Config{Name: findArduino(), Baud: 9600}
	s, err := goserial.OpenPort(c)
	if err != nil {
		log.Fatal(err)
	}
	_ = s // s gets put to work in the snippets below.
}

The thing that tripped me up when prototyping the communication code was that I wasn’t able to immediately pump data down the serial connection to the Arduino, unless I had the Arduino serial monitor open.

When making a serial connection to an Arduino, it automatically resets (similar to what happens when you press the reset button), unless it is a newer Arduino Leonardo. It then takes about a second for the bootloader on the Arduino to do its thing and get into a state where it is able to accept data over the serial port.

I worked around this in Golang a little inelegantly by sleeping for a second; however, it is possible to disable the Arduino reset on serial connection with a simple hardware hack.

func main() {
	// Find the device that represents the Arduino serial connection.
	c := &goserial.Config{Name: findArduino(), Baud: 9600}
	s, _ := goserial.OpenPort(c)

	// When connecting to an older revision Arduino, you need to wait
	// a little while it resets.
	time.Sleep(1 * time.Second)
}

The communication protocol I used between the Raspberry Pi and Arduino is very simple. Each command is five bytes: the first byte is the command identifier, with the four remaining bytes reserved for a single mandatory float argument (which the Arduino can ignore if necessary).

Packaging up commands and sending them over the wire was pretty easy with the binary encoding package bundled in the Golang standard library. It is a case of encoding the argument into a byte buffer, then looping over the bytes of both the command and the argument buffer, writing them to the serial port:

// sendArduinoCommand transmits a new command over the nominated serial
// port to the arduino. Returns an error on failure. Each command is
// identified by a single byte and may take one argument (a float).
func sendArduinoCommand(command byte, argument float32,
	serialPort io.ReadWriteCloser) error {
	if serialPort == nil {
		return nil
	}

	// Package argument for transmission
	bufOut := new(bytes.Buffer)
	err := binary.Write(bufOut, binary.LittleEndian, argument)
	if err != nil {
		return err
	}

	// Transmit command and argument down the pipe.
	for _, v := range [][]byte{[]byte{command}, bufOut.Bytes()} {
		_, err = serialPort.Write(v)
		if err != nil {
			return err
		}
	}

	return nil
}

Putting it all together within the main function becomes:

func main() {
	// Find the device that represents the arduino serial connection.
	c := &goserial.Config{Name: findArduino(), Baud: 9600}
	s, _ := goserial.OpenPort(c)

	// When connecting to an older revision Arduino, you need to wait
	// a little while it resets.
	time.Sleep(1 * time.Second)
	sendArduinoCommand('a', 1.0, s)
}

Picking this data up on the Arduino side of the serial connection is done by reading the first command byte and then using a union to decode the four argument bytes back into a float:

typedef struct {
  char instruction; // The instruction that arrived by serial connection.
  float argument;   // The argument that came with the instruction.
} Command;

/**
 * ReadCommand sucks down the latest command from the serial port,
 * returns {'*', 0.0} if no new command is available.
 */
Command ReadCommand() {
  // Not enough bytes for a command, return an empty command.
  if (Serial.available() < 5) {
    return (Command) {'*', 0.0};
  }

  union {
    char b[4];
    float f;
  } ufloat;

  // Read the command identifier and argument from the serial port.
  char c = Serial.read();
  Serial.readBytes(ufloat.b, 4);

  return (Command) {c, ufloat.f};
}

Now, just make sure you set the same baud rate on the Arduino side of the connection, and start reading commands off the serial connection:

/**
 * Arduino initialisation.
 */
void setup() {
  Serial.begin(9600);
}

/**
 * Main Arduino loop.
 */
void loop() {
  Command c = ReadCommand();

  // Do something awesome with the command. Like represent the state of a
  // simulated neurone as a lighting sequence.
}

For a full example of how this all works, you can check out the Raspberry Pi code and Arduino code for the Gasworks project on GitHub. Enjoy!