Google I/O 2013 – Cognitive Science and Design, and how it applies to Android apps

[youtube https://www.youtube.com/watch?v=z2exxj4COhU&w=560&h=315]

This is an excellent talk by Alex Faaborg at Google I/O 2013 about cognitive science principles and how they apply to interface design. Here’s a summary of some of the main points and how they could be used to improve your apps:

  • We can search for objects of the same colour much faster than searching for objects of the same shape [18:26]
  • We can scan a group of faces for one we recognise in parallel rather than sequentially. This could be taken advantage of in messaging and address book apps, for example [10:13]
  • Objects in our periphery are recognised much faster than those in our central field of vision (tiger example in the video). You can put a small notification icon in the corner of the screen away from the user’s focal point and it will still be noticed [6:50]
  • Colour-deficiency: you can get away with using green and red as long as the contrast is significantly different. Best approach is to test your interface with filtering tools to see how it would actually look (e.g. Photoshop) [13:50]
  • Our brains are very good at recognising patterns. It’s not necessary to group objects together in a box; just having whitespace between groups will do [3:24]
  • You’ll recognise a silhouette of an object that just shows its basic geometry faster than you will recognise a more photo-realistic depiction of the object. This principle is used in the Holo icon set [9:10]
  • Notifications/interruptions wipe the contents of our working memory and make us lose the state of “creative flow” if we were in it. Takeaway: use notifications carefully [22:22]
  • “Chunking” optimizes for our working memory. Examples are the groups of digits in credit card and phone numbers. Make sure your interface supports these chunks and ignores user-entered whitespace (see the sketch after this list)! [21:17]
  • We make trust decisions quickly and once made they are slow to change, even to the point of us explaining away new information that goes against them. First impressions matter – make sure you have a quality application icon [24:16]
  • You don’t *have* to be consistent with existing interfaces and interaction paradigms when designing your app. Combining innovation with teaching the user (e.g. with a quick example video) can work well. Example: collaborating on documents via email attachments vs. using Google Docs [31:21]
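
As a concrete illustration of the chunking point, here’s a minimal sketch of how an app might accept a card number however the user chunks it. This is my own example rather than something from the talk, and the helper names are hypothetical:

[sourcecode language="java"]
/**
 * Hypothetical helpers: let users type a card number with or without
 * their own chunking ("4111 1111 1111 1111", "4111-1111-1111-1111",
 * "4111111111111111") and treat all three as the same value.
 */
public class CardNumberUtils {

    /** Strip whitespace and dashes before validating or storing. */
    public static String normalize(String userInput) {
        return userInput.replaceAll("[\\s-]", "");
    }

    /** Re-chunk into groups of four for display, matching how people read card numbers. */
    public static String formatForDisplay(String cardNumber) {
        return normalize(cardNumber).replaceAll("(.{4})(?=.)", "$1 ");
    }
}
[/sourcecode]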

Android: 9 patching a family of images the easy way

9-patch images in Android are great, but if you happen to have a whole family of graphics to convert, it can get pretty tedious. I had a collection of button graphics that needed converting to 9-patches using the same stretchable regions.

Rather than do it all by hand in Photoshop or GIMP (and inevitably redo them all later when something changed), I wrote a small Bash script to do it.

To use the script, first use the draw9patch tool to create the 9 patch info for one of your graphics – this will become the template. Once you’re done, go:

[code language="bash" light="true"]
./9batch.sh template.9.png button2.png button3.png …
[/code]

to copy the 1 pixel border from the template to your remaining graphics and save a .9.png version of each of them.

Note that you’ll need to install ImageMagick to use the 9batch script:

[code language="bash" light="true"]
sudo apt-get install imagemagick
[/code]

Apparently WordPress won’t let me upload the script itself so here’s the source code:

[code language="bash" light="true" collapse="true"]
#!/bin/bash

if [ "$#" -lt 2 ]; then
    echo "Usage: 9batch.sh template image1 image2 ..." >&2
    echo >&2
    echo "Applies 9 patch info to a family of images using one image as the template" >&2
    echo "Template image should be 2 pixels wider and higher than source images" >&2
    exit 1
fi

# 9 patch image to use as template
src=$1

for i in "${@:2}"
do
    # use sed to change extension from .png to .9.png and assign result to 'out'
    out=$(echo "$i" | sed -e 's:\(....\)$:.9\1:')
    composite -gravity center "$i" "$src" "$out"
done
[/code]

Android Device Nudge Detection Helper Class

I recently added a feature to StarCraft 2 Build Player to start playing build orders when the user’s phone is nudged. The idea is that you don’t have to waste precious seconds looking down at your phone to tap the “Play” button; instead you can just mindlessly bump your phone on your desk and you’re off.

Anyway, it turned out to be pretty easy to factor this into a reusable class so here it is:

[sourcecode language="java"]
package com.kiwiandroiddev.sc2buildassistant;

import java.util.ArrayList;

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Handler;

/**
 * Class for reporting when the device's acceleration (excluding gravity) exceeds
 * a certain value. Compatible with all Android versions as it uses Sensor.TYPE_ACCELEROMETER
 * rather than Sensor.TYPE_LINEAR_ACCELERATION.
 *
 * NudgeDetector objects are initially disabled. To use, implement
 * the NudgeDetectorEventListener interface in your class, then register it
 * to a new NudgeDetector object with registerListener(). Finally, call
 * setEnabled(true) to start detecting device movement. You should add a call
 * to stopDetection() in your Activity's onPause() method to conserve battery
 * life.
 *
 * @author kiwiandroiddev
 *
 */
public class NudgeDetector implements SensorEventListener {

    private ArrayList<NudgeDetectorEventListener> mListeners;
    private Context mContext;
    private SensorManager mSensorManager;
    private Sensor mAccelerometer;
    private boolean mEnabled = false;
    private boolean mCurrentlyDetecting = false;
    private boolean mCurrentlyChecking = false;
    private int mGraceTime = 1000;        // milliseconds
    private int mSampleRate = SensorManager.SENSOR_DELAY_GAME;
    private double mDetectionThreshold = 0.5f;    // ms^-2
    private float[] mGravity = new float[] { 0.0f, 0.0f, 0.0f };
    private float[] mLinearAcceleration = new float[] { 0.0f, 0.0f, 0.0f };

    /**
     * Client activities should implement this interface and register themselves using
     * registerListener() to be alerted when a nudge has been detected
     */
    public interface NudgeDetectorEventListener {
        public void onNudgeDetected();
    }

    public NudgeDetector(Context context) {
        mContext = context;
        mListeners = new ArrayList<NudgeDetectorEventListener>();
        mSensorManager = (SensorManager) mContext.getSystemService(Context.SENSOR_SERVICE);
        mAccelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    // Accessors follow

    public void registerListener(NudgeDetectorEventListener newListener) {
        mListeners.add(newListener);
    }

    public void removeListeners() {
        mListeners.clear();
    }

    public void setEnabled(boolean enabled) {
        // mEnabled must be set before startDetection() and cleared after
        // stopDetection(), as both methods check it before doing anything
        if (!mEnabled && enabled) {
            mEnabled = true;
            startDetection();
        } else if (mEnabled && !enabled) {
            stopDetection();
            mEnabled = false;
        }
    }

    public boolean isEnabled() {
        return mEnabled;
    }

    /**
     * Returns whether this detector is currently registered with the sensor manager
     * and is receiving accelerometer readings from the device.
     */
    public boolean isCurrentlyDetecting() {
        return mCurrentlyDetecting;
    }

    /**
     * Sets the amount of acceleration needed to trigger a "nudge".
     * Units are metres per second per second (ms^-2)
     */
    public void setDetectionThreshold(double threshold) {
        mDetectionThreshold = threshold;
    }

    public double getDetectionThreshold() {
        return mDetectionThreshold;
    }

    /**
     * Sets the minimum amount of time between when startDetection() is called
     * and when nudges are actually detected. This should be non-zero to avoid
     * false positives straight after enabling detection (e.g. at least 500ms)
     *
     * @param milliseconds_delay
     */
    public void setGraceTime(int milliseconds_delay) {
        mGraceTime = milliseconds_delay;
    }

    public int getGraceTime() {
        return mGraceTime;
    }

    /**
     * Sets how often accelerometer readings are received. Affects the accuracy of
     * nudge detection. A new sample rate won't take effect until stopDetection()
     * then startDetection() is called.
     *
     * @param rate must be one of SensorManager.SENSOR_DELAY_UI,
     * SensorManager.SENSOR_DELAY_NORMAL, SensorManager.SENSOR_DELAY_GAME,
     * SensorManager.SENSOR_DELAY_FASTEST
     */
    public void setSampleRate(int rate) {
        mSampleRate = rate;
    }

    public int getSampleRate() {
        return mSampleRate;
    }

    /**
     * Starts listening for device movement after an initial delay specified
     * by the grace time attribute (change this using setGraceTime()).
     * Client Activities might want to call this in their onResume() method.
     *
     * The actual sensor code uses a moving average to remove the
     * gravity component from acceleration. This is why readings
     * are collected and not checked during the grace time
     */
    public void startDetection() {
        if (mEnabled && !mCurrentlyDetecting) {
            mCurrentlyDetecting = true;
            mSensorManager.registerListener(this, mAccelerometer, mSampleRate);

            Handler myHandler = new Handler();
            myHandler.postDelayed(new Runnable() {
                @Override
                public void run() {
                    if (mEnabled && mCurrentlyDetecting) {
                        mCurrentlyChecking = true;
                    }
                }
            }, mGraceTime);
        }
    }

    /**
     * Deregisters accelerometer sensor from the sensor manager.
     * Does nothing if nudge detector is currently disabled.
     * Client Activities should call this in their onPause() method.
     */
    public void stopDetection() {
        if (mEnabled && mCurrentlyDetecting) {
            mSensorManager.unregisterListener(this);
            mCurrentlyDetecting = false;
            mCurrentlyChecking = false;
        }
    }

    // SensorEventListener callbacks follow

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // alpha is calculated as t / (t + dT)
        // with t, the low-pass filter's time-constant
        // and dT, the event delivery rate

        final float alpha = 0.8f;

        // isolate the gravity component with a low-pass filter...
        mGravity[0] = alpha * mGravity[0] + (1 - alpha) * event.values[0];
        mGravity[1] = alpha * mGravity[1] + (1 - alpha) * event.values[1];
        mGravity[2] = alpha * mGravity[2] + (1 - alpha) * event.values[2];

        // ...then subtract it out to get linear acceleration
        mLinearAcceleration[0] = event.values[0] - mGravity[0];
        mLinearAcceleration[1] = event.values[1] - mGravity[1];
        mLinearAcceleration[2] = event.values[2] - mGravity[2];

        // find length of linear acceleration vector
        double scalarAcceleration = mLinearAcceleration[0] * mLinearAcceleration[0]
                + mLinearAcceleration[1] * mLinearAcceleration[1]
                + mLinearAcceleration[2] * mLinearAcceleration[2];
        scalarAcceleration = Math.sqrt(scalarAcceleration);

        if (mCurrentlyChecking && scalarAcceleration >= mDetectionThreshold) {
            for (NudgeDetectorEventListener listener : mListeners)
                listener.onNudgeDetected();
        }
    }
}

[/sourcecode]
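
For reference, here’s a rough sketch of how you might wire this up in an Activity. The Activity name and threshold value are just for illustration:

[sourcecode language="java"]
import android.app.Activity;
import android.os.Bundle;

public class PlayerActivity extends Activity
        implements NudgeDetector.NudgeDetectorEventListener {

    private NudgeDetector mNudgeDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mNudgeDetector = new NudgeDetector(this);
        mNudgeDetector.registerListener(this);
        mNudgeDetector.setDetectionThreshold(2.0);    // ms^-2, tune to taste
        mNudgeDetector.setEnabled(true);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mNudgeDetector.startDetection();
    }

    @Override
    protected void onPause() {
        super.onPause();
        mNudgeDetector.stopDetection();    // conserve battery while backgrounded
    }

    @Override
    public void onNudgeDetected() {
        // e.g. start playing the build order
    }
}
[/sourcecode]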

The reason I stuck with Sensor.TYPE_ACCELEROMETER is that I want to support Froyo with my app. If you’re only targeting 2.3 (API level 9) and higher, you could use Sensor.TYPE_LINEAR_ACCELERATION instead and simplify this code a fair bit by stripping out the gravity calculation in onSensorChanged().
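
In that case, onSensorChanged() collapses to something like the following untested sketch (assuming the constructor requests Sensor.TYPE_LINEAR_ACCELERATION and the rest of the class stays as above):

[sourcecode language="java"]
// Drop-in replacement for onSensorChanged() above, API level 9+ only.
// With Sensor.TYPE_LINEAR_ACCELERATION the sensor framework has already
// removed the gravity component, so the event values can be used directly.
@Override
public void onSensorChanged(SensorEvent event) {
    double scalarAcceleration = Math.sqrt(
            event.values[0] * event.values[0]
            + event.values[1] * event.values[1]
            + event.values[2] * event.values[2]);

    if (mCurrentlyChecking && scalarAcceleration >= mDetectionThreshold) {
        for (NudgeDetectorEventListener listener : mListeners)
            listener.onNudgeDetected();
    }
}
[/sourcecode]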

Feel free to use this in your projects. Drop me a comment if you spot bugs or have any suggestions.

Data on Android device supported features

I’ve recently been experimenting with OpenGL ES 2.0 on Android for a graphical app (some excellent guides can be found at http://www.learnopengles.com/). So far so good. It turns out that gone are the days of countless fixed-function calls like glBegin(), glVertex3f() and glColor4f() for sending vertex data; nowadays you use shaders for everything and send your vertex data to OpenGL in large chunks. Supposedly this makes the graphics driver software a lot simpler to write and leads to better performance overall. Keeping track of all of those calls and their corresponding closing calls could end up a bit of a headache, so it seems like it provides some benefit to application developers too.
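
To give a taste of the new style, here’s a minimal drawing sketch using android.opengl.GLES20. It assumes mProgram is an already compiled and linked shader program with a vertex attribute named "a_Position" (both hypothetical names):

[sourcecode language="java"]
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import android.opengl.GLES20;

public class TriangleSketch {

    private int mProgram;    // assumed: a compiled and linked ES 2.0 shader program

    public void drawTriangle() {
        float[] triangle = {
                 0.0f,  0.5f, 0.0f,
                -0.5f, -0.5f, 0.0f,
                 0.5f, -0.5f, 0.0f,
        };

        // pack all vertex data into one native-order buffer: a single chunk,
        // no per-vertex glVertex3f()-style calls anywhere
        FloatBuffer vertexBuffer = ByteBuffer.allocateDirect(triangle.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        vertexBuffer.put(triangle).position(0);

        GLES20.glUseProgram(mProgram);
        int positionHandle = GLES20.glGetAttribLocation(mProgram, "a_Position");
        GLES20.glEnableVertexAttribArray(positionHandle);
        GLES20.glVertexAttribPointer(positionHandle, 3, GLES20.GL_FLOAT,
                false, 0, vertexBuffer);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 3);
        GLES20.glDisableVertexAttribArray(positionHandle);
    }
}
[/sourcecode]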

Before diving in and using ES 2.0 exclusively (well, at first anyway – code for ES 1.x support can always be added later) I wanted to get an idea of how widely ES 2.0 is supported across Android devices because it could have a big effect on the market size for my app.

After filtering through some anecdotal evidence on Stack Overflow, I found (not surprisingly) that the best place for this data was straight from the horse’s mouth: the Android Dashboards page.

According to the data, ES 2.0 support is over 90%, and it seems reasonable to assume it will only increase over time. So that settles it: OpenGL ES 2.0 it is.
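
For the devices in the remaining sliver, it’s worth failing gracefully. Here’s a sketch of the standard runtime check (the Activity name is hypothetical; declaring <uses-feature android:glEsVersion="0x00020000" android:required="true" /> in the manifest is the other half of this, filtering unsupported devices out of your store listing entirely):

[sourcecode language="java"]
import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;
import android.content.pm.ConfigurationInfo;
import android.os.Bundle;

public class GraphicsActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        ActivityManager am = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
        ConfigurationInfo info = am.getDeviceConfigurationInfo();

        // reqGlEsVersion encodes the major version in the high 16 bits,
        // so 0x20000 corresponds to OpenGL ES 2.0
        boolean supportsEs2 = info.reqGlEsVersion >= 0x20000;
        if (!supportsEs2) {
            // fall back to ES 1.x rendering, or exit politely
            finish();
        }
    }
}
[/sourcecode]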

The Dashboards page also has data on the installation base of each Android version, which may be very useful during the research phase of developing your app.

Linux: Fixing an unreliable network connection with ASUS P8Z68-V onboard LAN

Recently I got a new ASUS P8Z68-V motherboard and CPU, and I’d been having some strange network issues with it when running Gentoo Linux. The problems included connection failures after random periods of time and generally slow download speeds. The only way to get the connection running again after it failed (which was every few minutes at times) was to run:

ifconfig eth0 down
ifconfig eth0 up

On top of that, running ifconfig (with no args) was reporting 100% RX packets dropped for the interface.

This is the info on the network adapter:

$ lspci
...
07:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 06)

At first I suspected there might be something wrong with my (relatively new) Gentoo setup, so I did some testing running off an Ubuntu 11.10 live CD. Interestingly, this gave the same unreliable behaviour, which wasn’t present when running Windows 7.

After further digging it turned out the problem was that the Linux kernel was loading the wrong module for the network adapter. This was confirmed by the presence of “r8169” in the output of lsmod.

The solution was to remove and blacklist the r8169 module (on most distros, adding a “blacklist r8169” line to a file under /etc/modprobe.d/ will do it) and install Realtek’s official r8168 Linux kernel module from their website. On Gentoo I had compiled the kernel with the r8169 module built-in, so this meant first deselecting it and recompiling the kernel. After that, all that was left was to extract Realtek’s driver package and run ./autorun.sh as root.

Solution source: http://askubuntu.com/questions/46942/how-do-i-stop-my-ethernet-network-connection-from-dropping

Update:
In Linux Mint 12, you first need to run sudo apt-get install build-essential linux-headers-3.0.0-12-generic before running autorun.sh.

Nosetests: Capturing log messages written to stderr

This is a tip for using the Python logging module in conjunction with unit tests.
Say you’re using the root logger to write debug messages, e.g.

import logging
...
logging.debug('x = %s' % x)

To capture these messages and write them to the console when running nosetests, pass ‘root’ to the --log-debug nosetests option. E.g.

nosetests test_module1.py -s --log-debug=root

Ubuntu/Debian: see files installed by package

Checking the files installed by a package is easy enough with the Synaptic package manager (right-click a package > Properties > Installed Files tab).

Here’s how to get that information without leaving the comfort (cough) of your bash terminal:

$ dpkg-query -L package_name

For example:

$ dpkg-query -L flashplugin-installer
/.
/usr
/usr/lib
/usr/lib/xulrunner
/usr/lib/xulrunner/plugins
/usr/lib/mozilla
/usr/lib/mozilla/plugins
/usr/lib/iceape
/usr/lib/iceape/plugins
...

Vim: pasting text without it cascading

If you’re like me and use highlight + middle click pasting in Linux a lot, you may have tried to paste text in a Vim window at some point… with surprising results.

For example, highlight the following text:

import re
for test_string in ['555-1212', 'ILL-EGAL']:
    if re.match(r'^\d{3}-\d{4}$', test_string):
        print test_string, 'is a valid US local phone number'
    else:
        print test_string, 'rejected'

Pasting it into a Vim terminal window when in insert mode will give you something like this:

mport re
for test_string in ['555-1212', 'ILL-EGAL']:
    if re.match(r'^\d{3}-\d{4}$', test_string):
                print test_string, 'is a valid US local phone number'
                                else:
                                                        print test_string, 'rejected'

Eh?

To make it paste as expected, first use the following command in Vim:

:set paste

Paste your text and it will come out formatted correctly. When you’re done, unset it with:

:set nopaste

(Python code example from http://wiki.python.org/moin/SimplePrograms)

MySQL + ODBC + Python

How to connect Python programs to a MySQL database using ODBC on Ubuntu 10.04 LTS (Lucid)

This guide assumes you already have a MySQL server set up somewhere.

  1. Install needed packages:

    sudo apt-get install unixodbc unixodbc-dev python-dev libmyodbc

    (libmyodbc is the MySQL driver for ODBC)

  2. Get current version of pyodbc:
    If you have python-setuptools installed:

    sudo easy_install pyodbc

    Or with pip (from python-pip):

    sudo pip install pyodbc

    Or if all else fails, download the latest source archive from https://code.google.com/p/pyodbc/downloads/list
    (I used v2.1.8), extract it somewhere on disk, cd into the directory, and run

    sudo python setup.py install

  3. Add a reference to the MySQL driver to the ODBC config file /etc/odbcinst.ini:

    [MySQL]
    Description = ODBC for MySQL
    Driver = /usr/lib/odbc/libmyodbc.so
    FileUsage = 1

  4. Test it:

    python
    import pyodbc
    cn = pyodbc.connect('DRIVER={MySQL};SERVER=localhost;DATABASE=test;UID=root;PWD=abc;')
    cursor = cn.cursor()
    cursor.execute('SELECT 1')
    print cursor.fetchone()

For more examples of pyodbc usage the official documentation is very good: https://code.google.com/p/pyodbc/wiki/GettingStarted

This was pieced together from a number of sources which I’ll credit when I find the links again…