Don’t use the Force, Luke

(tl;dr: force unwrapping in Swift with ! is bad)

At our last iOS Developer meetup in Limerick, my Worst Case Scenario podcast co-host Dave Sims gave an excellent talk on Optionals in Swift. I’m hoping that Dave will put the content online some day as he really managed to provide a simple yet powerful overview of what Optionals are. He also ended up being invited to give his talk to the CocoaHeads meetup in Indianapolis via Skype – check out Worst Case Scenario Episode 36 for the details about how that came about.

Dave mentioned in his talk that force unwrapping Optionals is generally a bad idea. Xcode doesn’t help matters, though: its fix-it suggestion when you use an optional without unwrapping it is to simply insert a !.

This is terrible advice for programmers new (or not-so-new…) to Swift. Force unwrapping a nil optional will cause a runtime crash. The whole point of optionals is that they make an entire class of bug impossible: the kind where a nil creeps in and causes havoc somewhere down the line. Force unwrapping throws all of this away, with the dubious upside of saving a line or two (and shutting up the compiler). You can argue that the Apple documentation is pretty clear about force unwrapping being a bad idea, but I feel the tools should be encouraging best behaviour. Dave mentioned three common unwrapping techniques:

  1. if let
  2. guard let
  3. The ?? nil coalescing operator to provide a default value

Dave also mentioned Chris Lattner’s recent appearance on Accidental Tech Podcast, where Swift’s creator explained that the purpose of the guard statement is to allow early exit (Overcast link that jumps directly to that segment), a style he is personally in favour of. I also like this style: it reduces the indentation for the main bit of the code. You just have to remember to actually exit the scope in the else branch, whether by returning (perhaps a default value), throwing or asserting. There’s a quick sketch of all three unwrapping techniques below.
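
Here’s a minimal sketch of the three techniques (and the guard early-exit style), using a made-up optional called name:

let name: String? = "Dave"

// 1. if let: the unwrapped value only exists inside the braces
if let unwrapped = name {
    print("Hello, \(unwrapped)")
}

// 2. guard let: unwrap once, exit early on nil, and keep the happy path unindented
func greet(_ name: String?) {
    guard let name = name else {
        return // you must leave the scope here: return, throw or assert
    }
    print("Hello, \(name)")
}
greet(name)  // prints "Hello, Dave"
greet(nil)   // returns early, prints nothing

// 3. ??: supply a default value instead of unwrapping at all
let displayName = name ?? "Anonymous"
print(displayName)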

Hey Siri, turn the Christmas lights on

My fellow podcast hosts on Worst Case Scenario had both got smart home gear for Christmas. I was feeling a bit left out so I wanted to hack together a low-budget version. Here it is in action:

Now I can use Siri (or the iOS Home app) to turn on my Christmas lights from anywhere!

Ingredients:

  • Raspberry Pi
  • Relay shield from the gear we bought for the hackathon last year (it’s marked FE_SR1y)
  • Christmas lights

The relay shield has a transistor and a diode built in, so I didn’t need to do any fandaglement with a circuit, just hooked up the + to 5v on the Pi, – to Ground, and the switching pin to one of the GPIO pins.

On the Pi, I installed the excellent Homebridge which allowed me to create a HomeKit accessory that Siri can talk to. I used the homebridge-gpio-wpi plugin to talk to the GPIO pins.

It’s obviously super fast on my home wifi network but it’s also surprisingly speedy going through the phone network: iPhone -> phone network -> Apple’s servers -> Apple TV -> Raspberry Pi -> lights takes about 1.3 seconds.

I am toying with the idea of creating a low-budget connected home using 433MHz RF remote controlled sockets which I could control from the Pi with a transmitter board. The other two lads on Worst Case Scenario are also expanding their systems based on the Amazon Echo – subscribe and keep up to date with how we’re getting on!

iOS – checking to see if a word with blank letters is valid using NSSet, NSArray, Core Data and SQLite

Annoying Scrabble words like azo are even more annoying when you use a blank

An interesting problem I had recently was to check whether a string was a valid word or not, comparing against a word list with over 170,000 entries.

Checking the string is easy: after loading the words into an NSArray, you can just call containsObject:

NSString *wordToFind = @"the";
BOOL wordIsValid = [wordArray containsObject:wordToFind];

The code above takes 0.0310 seconds on my iPhone 5s.

It’s even faster with an NSSet:

NSString *wordToFind = @"the";
BOOL wordIsValid = [wordSet containsObject:wordToFind];

Using an NSSet is over 300x faster than NSArray in this instance: 0.00001s! (NSSet can check membership with a hash lookup, where NSArray has to scan through its elements.)

Blankety blank

What if you wanted to search to see if the word was valid using blanks, like in Scrabble? NSPredicate makes this very easy:

NSString *wordToFind = @"th?";
NSPredicate *predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@", wordToFind];
NSArray *filteredArray = [wordArray filteredArrayUsingPredicate:predicate];

But then I looked at how long it took: 0.44 seconds. Ouch! It’s not acceptable to block the main thread for that long.  There’s a similar method for NSSet:

NSString *wordToFind = @"th?";
NSPredicate *predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@", wordToFind];
NSSet *filteredSet = [wordSet filteredSetUsingPredicate:predicate];

which took 0.47 seconds – even worse!
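
Either way, if one of these slower searches had to stay in an app, the least you could do is run it off the main thread. A minimal sketch (in Swift, just for brevity; findMatch(for:) is a hypothetical wrapper around whichever search you use):

DispatchQueue.global(qos: .userInitiated).async {
    // findMatch(for:) is an assumed helper that wraps one of the predicate searches above
    let isValid = findMatch(for: "th?")
    DispatchQueue.main.async {
        // back on the main thread, so it's safe to update the UI with the result
        print("word is valid: \(isValid)")
    }
}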

Benchmarking

At this point I decided to benchmark the results by running the test on 500 randomly selected words, where 5% of the letters were turned into blanks. Here’s a sample from my test word list of 500 words:

unclu?tered
succumbs
piggeries
pseu?opregnant
combat?ve

(I have no idea what ‘pseudopregnant’ means…).

Running the test with 500 different words and measuring the time it took to do each test enabled me to record the mean, minimum, maximum and standard deviation of each method. Unfortunately I had to run these tests on the iOS simulator – turns out there’s a subtle bug when repeatedly running an NSPredicate which causes a huge number of NSComparisonPredicate objects to be malloc’ed and not freed, causing the tests above to blow up due to memory pressure on my 5S.
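
The timing itself was nothing fancy. Here’s a minimal sketch of the idea (in Swift for brevity; the actual tests were Objective-C, and checkWord(_:) is a stand-in for whichever matching method is being measured):

import Foundation

// stand-in for the matching method under test
func checkWord(_ word: String) -> Bool {
    // e.g. run filteredArrayUsingPredicate:, the SQLite query, etc.
    return true
}

let testWords = ["unclu?tered", "succumbs", "piggeries", "pseu?opregnant", "combat?ve"]
var timings: [Double] = []

for word in testWords {
    let start = CFAbsoluteTimeGetCurrent()
    _ = checkWord(word)
    timings.append(CFAbsoluteTimeGetCurrent() - start)
}

let mean = timings.reduce(0, +) / Double(timings.count)
let variance = timings.map { ($0 - mean) * ($0 - mean) }.reduce(0, +) / Double(timings.count)
print("mean \(mean)s min \(timings.min()!)s max \(timings.max()!)s stddev \(variance.squareRoot())s")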

Using the simulator, here are the results of the test with filteredArrayUsingPredicate: and filteredSetUsingPredicate:

Method                        Average (s)  Min (s)  Max (s)  Std dev (s)
filteredArrayUsingPredicate:  0.15         0.12     0.34     0.03
filteredSetUsingPredicate:    0.16         0.14     0.54     0.03

Other in-memory methods

I then tried a litany of various methods on NSSet and NSArray to see if I could come up with something faster.

Using blocks: indexOfObjectPassingTest

    NSPredicate *predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@",wordToCheck];
    NSUInteger index = [wordArray indexOfObjectPassingTest:^BOOL(id obj, NSUInteger idx, BOOL *stop) {
        return [predicate evaluateWithObject:obj];
    }];
    if(index != NSNotFound)
    {
        //can return the found word using index
    }

Prefiltering NSArray

This method pre-filters the word list, narrowing it down to only those words which have the same first letter as the word we’re trying to match (unless the first letter is a blank).

    NSPredicate *predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@",wordToCheck];
    
    NSString *firstLetter = [wordToCheck substringToIndex:1];
    NSArray *smallerArray;
    //prefiltering on first letter will only work if first letter is not blank
    if([firstLetter isEqualToString:@"?"])
    {
        smallerArray = wordArray;
    }
    else
    {
        NSPredicate *firstLetterPredicate = [NSPredicate predicateWithFormat:@"SELF BEGINSWITH %@",firstLetter];
        smallerArray = [wordArray filteredArrayUsingPredicate:firstLetterPredicate];
    }
    
    NSArray *filteredArray = [smallerArray filteredArrayUsingPredicate:predicate];
    if([filteredArray count] > 0)
    {
        //return matched word here if needed
    }

Prefiltering NSSet

    NSPredicate *predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@",wordToCheck];
    
    NSString *firstLetter = [wordToCheck substringToIndex:1];
    //prefiltering on first letter will only work if first letter is not blank
    NSSet *smallerSet;
    if([firstLetter isEqualToString:@"?"])
    {
        smallerSet = wordSet;
    }
    else
    {
        NSPredicate *firstLetterPredicate = [NSPredicate predicateWithFormat:@"SELF BEGINSWITH %@",firstLetter];
        smallerSet = [wordSet filteredSetUsingPredicate:firstLetterPredicate];
    }
    
    NSSet *filteredSet = [smallerSet filteredSetUsingPredicate:predicate];
    if([filteredSet count] > 0)
    {
        //return matched word
    }

Using a compound predicate on NSArray

What if we combined the above two methods into a compound predicate, where we check that the first letter is the same AND the word is LIKE the word we’re searching for? This might result in a faster match, as we can avoid evaluating the expensive LIKE predicate when the first letters don’t match:

    NSString *firstLetter = [wordToCheck substringToIndex:1];
    NSPredicate *predicate;
    //if first letter is blank we have to fall back to using normal predicate
    if([firstLetter isEqualToString:@"?"])
    {
        predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@",wordToCheck];
    }
    else
    {
        predicate = [NSPredicate predicateWithFormat:@"SELF BEGINSWITH %@ AND SELF LIKE %@",firstLetter,wordToCheck];
    }
    
    NSArray *filteredArray = [wordArray filteredArrayUsingPredicate:predicate];
    if([filteredArray count] > 0)
    {
        //retrieve matched word here
    }

Using a compound predicate on NSSet

    NSString *firstLetter = [wordToCheck substringToIndex:1];
    NSPredicate *predicate;
    //if first letter is blank we have to fall back to using normal predicate
    if([firstLetter isEqualToString:@"?"])
    {
        predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@",wordToCheck];
    }
    else
    {
        predicate = [NSPredicate predicateWithFormat:@"SELF BEGINSWITH %@ AND SELF LIKE %@",firstLetter,wordToCheck];
    }
    
    NSSet *filteredSet = [wordSet filteredSetUsingPredicate:predicate];
    if([filteredSet count] > 0)
    {
        //match word here
    }

NSArray concurrent enumeration

NSArray also has a method:

- (void)enumerateObjectsWithOptions:(NSEnumerationOptions)opts 
                         usingBlock:(void (^)(ObjectType obj, NSUInteger idx, BOOL *stop))block;

which allows concurrent enumeration when you pass the NSEnumerationConcurrent option:

__block NSString *stringToReturn;
    NSPredicate *predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@",wordToCheck];
    [wordArray enumerateObjectsWithOptions:NSEnumerationConcurrent
                                usingBlock:^(id obj, NSUInteger idx, BOOL *stop)  {
                                    if([predicate evaluateWithObject:obj])
                                    {
                                        stringToReturn = obj;
                                        *stop = YES;
                                    }
                                }];
    if(stringToReturn)
    {
        //stringToReturn holds the matched word
    }

NSSet concurrent enumeration

NSSet offers an almost identical method, except there is no index parameter as NSSets are unordered:

    __block NSString *stringToReturn;
    NSPredicate *predicate = [NSPredicate predicateWithFormat:@"SELF LIKE %@",wordToCheck];
    [wordSet enumerateObjectsWithOptions:NSEnumerationConcurrent
                                usingBlock:^(id obj, BOOL *stop)  {
                                    if([predicate evaluateWithObject:obj])
                                    {
                                        stringToReturn = obj;
                                        *stop = YES;
                                    }
                                }];
    if(stringToReturn)
    {
        //stringToReturn holds the matched word
    }

Results

Method                          Average (s)  Min (s)  Max (s)  Std dev (s)
indexOfObjectPassingTest        0.07         0.0001   0.31     0.04
Prefiltering NSArray            0.07         0.04     0.24     0.03
Prefiltering NSSet              0.08         0.05     0.41     0.03
NSArray compound predicate      0.09         0.06     0.33     0.03
NSSet compound predicate        0.10         0.08     0.26     0.03
NSArray concurrent enumeration  0.10         0.004    0.36     0.06
NSSet concurrent enumeration    0.10         0.01     0.31     0.06

We have managed to do a bit better with some of these methods, but we’re still nowhere near acceptable performance. Maybe Core Data has some under-the-hood optimisations that can help us:

Core Data

I set up a simple Core Data stack, with one entity “Word” which had one attribute “word” (naming things is hard…). Here’s the method to retrieve a matching word in my data controller:

-(Word *)wordMatchingString:(NSString *)stringToMatch
{
    Word *wordToReturn;
    NSFetchRequest *request = [[NSFetchRequest alloc]init];
    //get the entity description for the Word entity
    NSEntityDescription *e = [[[persistenceController mom]entitiesByName]objectForKey:@"Word"];
    //set the entity description to the fetch request
    [request setEntity:e];
    NSPredicate *predicate = [NSPredicate predicateWithFormat:@"word LIKE %@",stringToMatch];
    [request setPredicate:predicate];
    //limit to one result
    [request setFetchLimit:1];
    
    NSError *error;
    //execute the fetch request and store it in an array
    NSArray *result = [[persistenceController mainContext] executeFetchRequest:request error:&error];
    //if not successful, throw an exception with the error
    if(!result)
    {
        [NSException raise:@"Fetch failed" format:@"Reason: %@",[error localizedDescription]];
    }
    
    //return the only object in the array
    wordToReturn = [result lastObject];
    return wordToReturn;
}

And here’s the method to match the word:

    Word *foundWord = [[WSDataController sharedController] wordMatchingString:wordToCheck];
    if(foundWord)
    {
        //the word exists in the store
    }

The results for this were all over the place:

Method     Average (s)  Min (s)  Max (s)  Std dev (s)
Core Data  0.10         0.001    0.76     0.06

I had high hopes for Core Data but it’s just not fast or consistent enough for this application.

Descent into SQLite

I was about to give up at this point. I asked Dave Sims if he had any advice and he suggested implementing a Directed Acyclic Word Graph (DAWG) which would be very efficient for finding matches – despite the opportunity to make “yo dawg” jokes I reckoned it might be a bit above my pay grade for some weekend pottering.
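
As an aside, here’s roughly what that kind of structure buys you. This is a minimal sketch of a plain trie in Swift (the uncompressed cousin of a DAWG, so not quite what Dave suggested): a pattern with ? blanks only has to walk branches that can still lead to a word, rather than being tested against all 170,000 entries.

final class TrieNode {
    var children: [Character: TrieNode] = [:]
    var isWord = false
}

final class Trie {
    private let root = TrieNode()

    func insert(_ word: String) {
        var node = root
        for ch in word {
            if node.children[ch] == nil {
                node.children[ch] = TrieNode()
            }
            node = node.children[ch]!
        }
        node.isWord = true
    }

    // true if any stored word matches pattern, where "?" matches any single letter
    func matches(_ pattern: String) -> Bool {
        return matches(Array(pattern), index: 0, node: root)
    }

    private func matches(_ pattern: [Character], index: Int, node: TrieNode) -> Bool {
        if index == pattern.count { return node.isWord }
        if pattern[index] == "?" {
            // blank: try every branch at this level
            for child in node.children.values {
                if matches(pattern, index: index + 1, node: child) {
                    return true
                }
            }
            return false
        }
        guard let child = node.children[pattern[index]] else { return false }
        return matches(pattern, index: index + 1, node: child)
    }
}

// Usage: build the trie once from the word list, then query it
// let trie = Trie()
// wordList.forEach { trie.insert($0) }
// trie.matches("th?")   // true if any matching word exists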

I’d read about some iOS developers preferring to deal directly with SQLite for their persistent storage, rather than using Core Data (which uses SQLite as one of its storage options). Relishing the opportunity to type in all caps, I followed a tutorial to get a basic SQLite setup going (the tutorial has a small error in it, which Xcode will catch with a warning; the fix is to replace the offending line with if (sqlite3_step(compiledStatement) == SQLITE_DONE) { ). Here’s my SQLite query:

    //SQLite uses underscores as the single character wildcard
    NSString *underscoreString = [wordToCheck stringByReplacingOccurrencesOfString:@"?" withString:@"_"];
    NSString *query = [NSString stringWithFormat:@"SELECT word FROM words WHERE word LIKE '%@'",underscoreString];
    NSArray *resultsArray = [[NSArray alloc] initWithArray:[self.dbManager loadDataFromDB:query]];
    if([resultsArray count] > 0)
    {
        //the word exists in the SQLite table
    }

This gave the following results:

Method  Average (s)  Min (s)  Max (s)  Std dev (s)
SQLite  0.017        0.0014   0.056    0.004

Success!

Using SQLite not only resulted in faster matching, it was also much more reliable than any of the other methods, with a very small standard deviation. Obviously matching a string in an array of 170,000 strings is an edge case, and for most cases any of the other methods would have sufficed.

It’s also worth noting that I was able to run the Core Data and SQLite tests on my own iPhone 5S as they do not suffer the same NSPredicate memory leak as the other methods. Here are the results on the phone:

Method                 Average (s)  Min (s)  Max (s)  Std dev (s)
Core Data (iPhone 5S)  0.35         0.002    0.78     0.19
SQLite (iPhone 5S)     0.037        0.036    0.042    0.0006

Full results

Method                          Average (s)  Min (s)  Max (s)  Std dev (s)
filteredArrayUsingPredicate:    0.15         0.12     0.34     0.03
filteredSetUsingPredicate:      0.16         0.14     0.54     0.03
indexOfObjectPassingTest        0.07         0.0001   0.31     0.04
Prefiltering NSArray            0.07         0.04     0.24     0.03
Prefiltering NSSet              0.08         0.05     0.41     0.03
NSArray compound predicate      0.09         0.06     0.33     0.03
NSSet compound predicate        0.10         0.08     0.26     0.03
NSArray concurrent enumeration  0.10         0.004    0.36     0.06
NSSet concurrent enumeration    0.10         0.01     0.31     0.06
Core Data                       0.10         0.001    0.76     0.06
SQLite                          0.017        0.0014   0.056    0.004
Core Data (iPhone 5S)           0.35         0.002    0.78     0.19
SQLite (iPhone 5S)              0.037        0.036    0.042    0.0006

How I edit podcasts

I do much of the post-production editing for the two podcasts that I’m on: Worst Case Scenario with Baz Taylor and David Sims, and Atlantic 302 with Pat Carroll. Some people have been kind enough to compliment the production quality of the podcasts but I’m still frustrated when I hear how good some of my favourite podcasts sound. I’m still learning about this stuff, and I know I have a long way to go, but I thought I would document my current method. Critiques welcome!

I’m indebted to Jason Snell of SixColors who wrote a very detailed guide on how he edits his podcasts. Much of this guide is based on Jason’s recommendations. Like Jason, I use Logic Pro X for editing, specifically for its Strip Silence function which isn’t available in GarageBand. I add an extra pre-production step of noise reduction in Audacity, as an unacceptable level of hiss was creeping into the recordings without this.

Recording

We sometimes record in person, but most of our episodes are recorded remotely. We sit in our own houses and chat over Skype. We don’t actually record the Skype call, instead we each record our own audio locally and then I mix it together later. We have a pre-recording ritual which consists of three steps:

  1. I count us in to hit Record in GarageBand. This means that the audio tracks will start at roughly the same point.
  2. I ask for five seconds of silence at the start of recording. This provides a convenient place at the start of each track to get a sample of background noise for noise reduction.
  3. I turn up the volume on my headphones, hold the headphones to my mic, and ask each host to talk into their microphone separately. This marks a point where the same audio appears both on my audio track (through the headphones) and on the host’s track; this identical audio lets me sync up the tracks later. Of course it’s not truly simultaneous as Skype introduces latency, but it’s good enough.

Sharing

The other hosts on the podcast then send their GarageBand files to me. Baz and Dave are users of the native macOS Mail app, which has a feature for sending large files called Mail Drop. For Atlantic 302, Pat and I have a shared Dropbox folder that he copies his file into. I mention this because the files involved are large: Atlantic 302 is a 30-minute show and our separate audio files are about 270MB. Editing podcasts takes up a lot of disk space! (As a side note, I save my working podcast files in a subfolder of Downloads, and exclude the Downloads folder from Time Machine, to prevent backups getting too big.)

Export

GarageBand has a default voice recording setting, which adds lots of effects such as reverb. The GarageBand interface is a little confusing, so I created a blank ‘patch’ with no effects whatsoever. I apply this patch in the Library before exporting. I’ve uploaded this blank patch in case you want to use it: it needs to be copied to
~/Music/Audio Music Apps/Patches
– you’ll need to create this folder if it doesn’t exist already.

I export each GarageBand file using the Share > Export Song to Disk menu command, and I save as an uncompressed 16-bit AIFF file. Rather annoyingly, this step creates a stereo AIFF file even though the recorded track from the microphone is mono. I haven’t bothered to figure out how to change this, so I just continue the rest of my workflow in stereo and convert to a mono MP3 file at the end.

Noise reduction

Our recordings tend to contain some hiss. If this hiss isn’t removed, it can be amplified at a later stage when applying compression. I use Audacity’s noise reduction function to pretty aggressively reduce noise in the AIFF files before dragging them in to Logic. I open a blank audio file, hit Cmd-Shift-I and select the AIFF for editing. I find the section of the audio at the start where everyone is silent, and select most of it, checking that there isn’t any breathing or noise that isn’t present through the whole recording. Then I select Effect > Noise Reduction and click Get Noise Profile to sample the background noise. I deselect the audio previously selected (important to do this, or the next step will only reduce the noise on the selection) and select Effect > Noise Reduction again. You’ll notice there are lots of parameters to adjust; here are the settings that I use, chosen by trial and error:

Noise reduction (dB): 24; Sensitivity: 11.50; Frequency smoothing (bands): 10. The last option, Noise:, should be set to Reduce, not Residue.

This takes a long time, typically nearly two minutes on my 2015 MacBook Pro. I use this time to start exporting the other tracks from GarageBand, and dragging completed files into Logic Pro X. Once the noise reduction has finished, I choose File > Export Audio, and export as another 16-bit AIFF file.

Logic Pro X

Logic Pro X costs €200 in the Irish Mac App Store. I know this is a huge amount – which is why I edited the first few episodes of Worst Case Scenario in Audacity. But Logic has one feature called Strip Silence which saves a huge amount of time, and I became much quicker at editing once I switched.

To start, create a project in Logic. It doesn’t matter what input settings you use, as you will be importing the previously-recorded tracks. Drag your tracks in (you can even drag them all in one go; if you do, just make sure you choose ‘create separate tracks’ in the dialog). Once you have done this, you can delete the original empty track that Logic added for you.

Syncing audio tracks

By now you should have your three audio tracks in Logic. To sync them up, I look for the audio at the start of the file where each host speaks into my mic via my headphones (described earlier). I drag each track left and right around this point until the audio is roughly in sync. I find it helpful to zoom in (by pinching the trackpad) to do this; you can sometimes align visually using the waveforms on the track.

Audio effects and EQ

The Inspector sidebar in Logic, showing the Compressor, EQ and Noise Gate applied to Dave’s audio (left) with the master Compressor on the right

I apply previously saved patches for my co-hosts. You can download these patches: Thomas, Dave, Baz, Pat (you add these to the same folder as the blank patch explained above – Logic and GarageBand use the same plugins in the same folders). These patches contain an EQ (to emphasise and de-emphasise different sound frequencies), a compressor (to make quiet noises louder) and a noise gate (to ignore sound below a certain level). They differ based on a few factors: Baz, Pat and I use the same Pyle dynamic microphone, but Baz and I both have fairly quiet voices, so they need boosting. Dave uses a Rode condenser microphone, which provides a lovely loud sound (the Pyle mics are very quiet) but tends to pick up more background noise. For my own patch, I added an extra DeEsser plug-in, as my ‘s’ sound is very hissy. Finally I add a master compressor on the output (it’s listed under Dynamics > Compressor) which affects all tracks; I turn up the compression to 3 and leave everything else at the default.

I try to avoid any clipping on the output meter (any positive number is clipping and will be shown in red), and will tweak the individual track compressor if someone’s audio is clipping, normally using the Make Up knob.

Strip Silence

Strip silence has been applied on Dave and Baz’s tracks, but not on mine

Now we get to where Logic earns its €200: the Strip Silence feature. This essentially translates a single continuous audio track into ‘blocks’ where someone is actually talking. This makes editing so much easier. The keyboard shortcut for Strip Silence is Ctrl-X (make sure you have the track highlighted), the settings I use are a tweaked version of Marco Arment’s recommendations – 2%/0.6/0.2/0.3 for those of us using dynamic microphones. I change the threshold to 4% for Dave’s microphone: as a condenser it picks up more background noise. As Marco points out, annoyingly Logic doesn’t remember your choices so you have to manually enter these in for every editing session.

Select following

At this point I delete all the chatter before the show starts, and I start editing proper. The one disadvantage of using Strip Silence is that you are left with a lot of gaps where nobody is talking. This can be noticeable to the listener (depending on the background noise), so a lot of my editing work involves moving audio ‘back’ to close these gaps. When you are faced with a period of silence, you want to select the next ‘block’ after the silence, and then use Shift-F to select all blocks of audio following it. You can then drag the audio back so that it cuts in just after the last person has finished talking.

Editing

As well as removing silences, you’ll often want to cut bits out. In the early days of Worst Case Scenario, I was obsessive about cutting every last ‘um’ and ‘ah’, especially with my own voice. I’ve calmed down a bit now that I’ve got used to listening to my own voice (this took quite a while) and realised that these artefacts appear in everyday speech and the brain is incredibly good at filtering them out.

I do remove any bumps, clicks or any other background noise that I think might distract the listener. Often the noise is in a single block so I can just click and delete. Sometimes you need to divide up a block of audio; the fastest way to do this is to set Logic’s secondary editing tool to be the marquee tool (see graphic). The primary tool should be the pointer tool. The secondary editing tool is invoked by holding down Cmd. You can use the marquee tool to drag over a selection of audio you want to delete, or, by clicking once and hitting backspace, split a block into two pieces.

Intros

On Atlantic 302 we use a spoken word intro and outro that Pat and I take turns doing, recorded at the same time as each episode. For Worst Case Scenario, we use a short noise that Dave created while messing around with cfxr, a little app that generates various sound effects. I turn the input volume down to -13dB because otherwise the master compressor will make it too loud.

Export to iTunes

I prefer to export the song to iTunes as an AIFF (File > Share > Song to iTunes). The only metadata I add is the track name. In iTunes I have the import format set to 64 kbps mono MP3 (iTunes > Preferences > CD Import Settings > MP3 Encoder > Custom, with a 128kbps stereo bit rate and the channels set to mono; iTunes halves the bit rate for mono), so I just convert to MP3 after the AIFF is imported from Logic. Finally I right-click the MP3 and choose Show In Finder to locate the file.

 

Announcing Direach, a new WordPress theme aimed at readers

As I was discussing on the last episode of Worst Case Scenario, I’m a bit of a luddite when it comes to the web. I like reading, and I dislike anything that distracts me from that task. Fortunately the Safari web browser comes with a great Reader mode which strips all the cruft from a page and simply displays the text of the main article, nicely formatted for reading (Firefox has this feature too). I use it constantly.

I decided a few years ago that I wanted my own website to be as readable as the special Reader mode in my browser. Every four months or so I would check to see if anyone had created a WordPress theme that could do the job, but I was never able to find anything suitable. So I wrote my own!

Direach (from díreach in Irish, meaning direct) is the fruit of my efforts, and it is the theme running on this site today. It is my effort to create a WordPress theme that is reader-friendly and accessible.

Typography

Direach doesn’t download a custom font; it uses the Georgia typeface, a serif font with a relatively large x-height. Created by Matthew Carter for Microsoft specifically for screen use, it is the same font used by Safari’s Reader mode and is installed on the vast majority of computers, phones and tablets. A default serif font is declared as a fallback for operating systems that do not ship with Georgia. Body text renders at 18px on most browsers, increasing to 20px on wider screens. Elements that are not part of the main body of the page have a reduced font size.

Reduced clutter

Almost all WordPress sites have a header at the top containing the site title, subtitle, and the navigation menu. This can take up a large proportion of the screen, especially if the reader is on a mobile device. Direach shifts this content down to the bottom of the article when it is only showing a single post or page. For index pages and the home page, this content is placed at the top. All content is displayed in a single column, apart from two columns of ‘widgets’ at the bottom of the front page. On single item pages/posts, the site title is appended to the article heading.

Small and fast

Not only does Direach not load any external fonts, it doesn’t add any JavaScript either. The markup is relatively clean. Even on my fairly slow US-based shared host, it loads relatively quickly.

Accessibility

Direach is based on the Underscores base theme by Automattic, and so it has excellent support for screen readers.

Future developments

Direach is licensed as GPL v2 or later and the source is available on GitHub. There are a few issues I still need to address: the image in the masthead does not deliver a retina size for devices with high-density screens; despite poking around in the source of get_header_image_tag, I couldn’t find a way to get it to emit srcset attributes, even though this functionality was added in WordPress 4.4.0. I’m not 100% happy with the final design and there are still a few rough edges.

I may submit this theme to wordpress.org in the future after a bit of dog fooding, although I think the design may break one or two of WordPress’s theme guidelines.

Reflections

It took quite a bit of work getting Direach to this state, in particular to get the theme working with some of the weird cases in the WordPress test data set. At various stages I worried whether this theme is too stark – almost Brutalist in nature. I also worry that it gives this site a slightly-too-authoritative tone. But overall I’m happy with the results.

Installing

To install this theme on your own site, go to Direach’s home on GitHub and under Clone or Download choose Download ZIP (your browser may automatically decompress the theme, in which case you will have to recompress it). Upload this theme to your WordPress install via Appearance > Themes > Add New > Upload Theme. You can upload a custom image beside the site title under the Appearance tab of the admin interface.

Three podcasters fix a WordPress plugin bug (with chat logs)

On Worst Case Scenario, the podcast I host with Baz Taylor and Dave Sims, we use a WordPress plugin called Seriously Simple Podcasting to host our podcast feed. It’s a fantastic minimalist plugin with little bloat and works very well for us.

We also installed an add-on called Seriously Simple Stats which would give us download counts for each episode of the podcast.

However the plugin wasn’t showing the client information properly and this was annoying Dave.

The three of us poked at it at various points during the day, and despite not being WordPress experts (and in my case, being very far from a PHP expert), we managed to get it working.


Podcast app User agent Strings

Here are some user agent strings reported by a few podcast apps and clients:

  • “AppleCoreMedia/1.0.0.13F69 (iPhone; U; CPU OS 9_3_2 like Mac OS X; en_ie)” – native iOS podcast app, amongst other things
  • “Pocket Casts” a podcast app that is particularly popular on Android
  • “Overcast/1.0 Podcast Sync (x subscribers; feed-id=y; +http://overcast.fm/)” – from Marco Arment’s Overcast app, which helpfully tells you how many subscribers you have in the agent string
  • “iTunes/12.4.2 (Macintosh; OS X 10.11.5) AppleWebKit/601.6.17” – downloads from the Mac version of iTunes

Here’s the bit of the plugin that detects what client you’re using:

if ( stripos( 'itunes', $user_agent ) !== false ) {
	$referrer = 'itunes';
}

Hm. There was no entry for AppleCoreMedia, which was why downloads from the native podcast app were being detected as ‘Other’. But we tried downloading episodes from iTunes on the Mac and that was still recorded as Other.

In fact, apart from plays from the website, every client was being recorded as ‘Other’ – except for Pocket Casts. What was going on?

When your needle and your haystack are the same size

The stripos($haystack, $needle) function in PHP looks to see if the string $needle appears in the string $haystack: if it does, it returns an integer with the position of $needle in $haystack; otherwise it returns false.

Absentmindedly, I typed ‘php’ in a Mac terminal expecting to get an error, but instead got a blank prompt, indicating I could type PHP code enclosed in <?php ?> and hit Ctrl-D to make it run (why does OS X ship with PHP? I have no idea). So I tried a smaller test case with the plugin’s stripos() call and a real user agent string, and it found nothing.

Those of you reading have probably had the aha! moment by now, but it took a bit of probing from Baz before I got it: the haystack and the needle were the wrong way around. The plugin was passing 'itunes' as the haystack and the whole user agent string as the needle, so it was searching for the entire user agent inside the word 'itunes', which is never going to match a real user agent.

And that was also the reason why Pocket Casts was showing up correctly – because the entirety of the user agent string is “Pocket Casts”!  Anyhow, changing the function around to read

if ( stripos( $user_agent, 'itunes' ) !== false ) {
	$referrer = 'itunes';
}

and adding AppleCoreMedia to the list got everything working fine. Well, almost.

Double hits from the iOS podcasts app

We then hit another problem.


Turns out any downloads from the iOS Podcasts app were being counted twice. It seems like Podcasts first sends a HEAD request for the mp3 file, presumably to get size information so it can populate the progress indicator, with the user agent “Podcasts/2.4”. The app then sends a normal GET request for the mp3 file with the AppleCoreMedia user agent mentioned earlier.


Being lazy I just added a return statement if we detected the Podcasts user agent:

if ( stripos( $user_agent, 'podcasts/' ) !== false ) {
	return;
}

I submitted it as a pull request and the maintainer merged it within a few hours. It was fun chatting away with Baz and Dave on iMessage while we tried to fix it.

Identifying Irish addresses by county

My friend Emmet called me with an interesting problem today. He had a spreadsheet with 28k rows. One of the columns was an address, sort of separated by commas. The address column was very inconsistent. Some ended in Ireland, others just had the county name, some used the form “County Limerick”, others “Co. Limerick”, and others still just “Limerick”. He needed to do some calculations by county so he needed to extract the county name for each row. As I learned doing the pubs research, Irish addresses are a pain!

I’d dealt with this sort of problem before (not with 28k rows mind!) when I used to work for national charities and it used to bug me terribly.

Emmet found a way to anonymise the data so he could send me a subset. I had a play around with a spreadsheet-based solution before breaking out the Python for a quick hack.

The script below reads in a file called “input.csv” and writes to a file called “output.csv” with the same data, but with the county name and a comma added to the start of each line (or Unknown if the script couldn’t work it out).

The script is case-insensitive, and matches the rightmost county on each line, so an address in Omeath, Co. Louth is correctly identified as being in Louth, and Dublin Rd., Athlone, Co. Westmeath matches Westmeath, not Dublin.

The script is fairly flexible about the structure of the input file, so the address data can be in different columns, or all in one.

Here’s the script:

###Usage: python parse-csv.py###
###Input file must be called input.csv###


#function definition: input_text is a string, all_ireland is a bool
def prepend_address_with_county(input_text,all_ireland):
    #prepare the county list
    counties_list = ['Carlow','Cavan','Clare','Cork','Donegal','Dublin','Galway','Kerry','Kildare','Kilkenny','Laois','Leitrim','Limerick','Longford','Louth','Mayo','Meath','Monaghan','Offaly','Roscommon','Sligo','Tipperary','Waterford','Westmeath','Wexford','Wicklow']
    #add on the six counties if we want all-Ireland
    if all_ireland == True:
        counties_list.extend(['Antrim','Armagh','Derry','Down','Fermanagh','Tyrone'])
    outfile = ''
    errorcount = 0
    linecount = 0
    #loop over each line
    for line in input_text:
        linecount += 1
        #keep track of the county we're going to feed in
        county_match = ''
        #let's keep track of the index of what we have found, we want the rightmost match
        old_find_index = 0
        #loop over all counties
        for county in counties_list:
            #look for the county, starting from the RHS
            # also convert to uppercase first
            find_index = line.upper().rfind(county.upper())
            #have we found anything? (find_index will be -1 if we haven't found anything)
            if find_index > old_find_index:
                #keep the county match
                county_match = county
                #update the rightmost index count
                old_find_index = find_index
        #have we got any matches?
        if old_find_index != 0:
            outfile += county_match+","+line
        else:
            outfile += "Unknown,"+line
            errorcount += 1
    return {"output":outfile,"errors":errorcount,"lines":linecount}




input_file = open("input.csv", 'rU')
output_csv = open("output.csv", 'w')
result_dict = prepend_address_with_county(input_file, True)
#use 100.0 so the percentage isn't truncated by integer division
percentage_error = 100.0*result_dict["errors"]/result_dict["lines"]
print "%d lines processed, %d Unknown counties (%.2f%%)" % (result_dict["lines"], result_dict["errors"], percentage_error)
output_csv.write(result_dict["output"])
input_file.close()
output_csv.close()

Escape to the pub with a Raspberry Pi, a pressure mat and push notifications

My father-in-law moved in with us two weeks ago. He is a lovely kind man in his late eighties and it’s a pleasure having him around.

He has Alzheimer’s and needs someone around him all the time during the day. We’re managing this OK while juggling the needs of the business. Unfortunately he can sometimes wake up in the middle of the night in a state of anxiety because he doesn’t know where he is.  This means that Sheila and I can’t get out to the local for a quick pint even after he’s gone to bed.

The mat of freedom, with the white cable on the top left going in to our bedroom

Enter the lazy programmer…

There’s nothing like the prospect of a pint to stir the motivations of a hacker!  I had a Raspberry Pi lying around the house. I’d been looking for an excuse to mess around with push notifications on the iPhone for a while, and I thought I could rig something up that would send a push notification to our phones if he left his bedroom during the night.

Internet of things mats

The pressure mat

Our local electronics retailer Maplin sells a pressure mat – it’s a simple switch designed to be placed under a doormat. Hooking this up to a GPIO pin on the Pi would allow me to send a notification to our phones using Apple’s Push Notification Service. Here’s the general flow:

Pressure mat -> Raspberry Pi -> Apple Servers -> our phones

Not the most complicated of hacks by any means, but if it was going to allow us to get out for a pint it would be worth it.

Setting up the Pi

The Raspberry Pi beside our bed, with wires from the pressure mat outside the door connected to GND and GPIO pin 18 via an old telephone cable

I bought the Pi two years ago with the aim of doing a bit of tinkering over Christmas, before realising that, although the Pi is an amazing platform at an amazing price, it’s still a Linux box and I didn’t want to be tinkering with .conf files to get things working over my Christmas holidays. But I’d been feeling vaguely guilty that this credit-card sized wonder was sitting gathering dust while kids a quarter of my age around the world were using it to achieve amazing things.

I downloaded the Raspbian Linux distribution from the Raspberry Pi website, choosing the Raspbian Jessie Lite distribution as I didn’t need any graphical niceties. I was hoping I could just boot up the Pi and get its IP address from the local network, but ping 255.255.255.255 didn’t show anything up, and my Apple AirPort Extreme router, although rock solid and wonderful in many ways, doesn’t show a list of ethernet clients, so I had to hook up a HDMI cable to finish the setup. Setting a static IP was a bit of a pain as the network stack changed significantly last year and most of the links I found had outdated information, but there’s a very comprehensive thread on StackExchange which explains what you need to do to dhcpcd.conf. (Note to self: recent distros include a zeroconf implementation, so you should be able to do something like ssh pi@raspberrypi.local and get in to a Pi with a fresh OS install.)

Processing a push notification with an iOS device

This was actually the easiest bit of the whole process. Apple’s docs on push notifications are really good.

Here’s the entirety of the code of the iOS app, pretty much taken verbatim from Apple’s example code:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Override point for customization after application launch.
    // bitwise OR to include all notification types
    UIUserNotificationType types = (UIUserNotificationType) (UIUserNotificationTypeBadge |
                                                             UIUserNotificationTypeSound | UIUserNotificationTypeAlert);
    //register these notification types
    UIUserNotificationSettings *mySettings = [UIUserNotificationSettings settingsForTypes:types categories:nil];
    [[UIApplication sharedApplication] registerUserNotificationSettings:mySettings];
    // Register for remote notifications -- this will prompt the user to allow notifications
    [[UIApplication sharedApplication] registerForRemoteNotifications];
    
    return YES;
}

-(void)application:(UIApplication *)application didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken
{
    //triggered when user allows notifications
    //Log the token
    NSLog(@"device token: %@",deviceToken);
    // Also store the token in NSUserDefaults
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    [defaults setObject:deviceToken forKey:@"deviceToken"];
    [defaults synchronize];
}

I wrote the device token to NSUserDefaults in case I wanted to display it in the app, but I took the lazy way out and just printed it with an NSLog statement, so I could copy it into my Python script.
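
(If you’d rather log a clean hex string than the raw NSData description, something like the Swift sketch below would do it; the Objective-C app above just logs the data as-is.)

import Foundation

// converts the APNs device token into a plain hex string, handy for pasting into a script
func hexString(from deviceToken: Data) -> String {
    return deviceToken.map { String(format: "%02x", $0) }.joined()
}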

Sending a push notification with a Python script

Here’s my Python script that detects someone stepping on to the mat, and which sends a request to Apple’s APNS server:

import RPi.GPIO as GPIO
import time
import datetime
import socket
from apns import APNs, Frame, Payload

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)

# our pressure mat is connected to GPIO pin 18
button = 18

# setup the pin as input, and enable the pull-up resistor
# this means when the switch is pressed, it will read GND, i.e. False
GPIO.setup(button, GPIO.IN, GPIO.PUD_UP)

# insert the tokens you NSLogged from your iOS app here - long lines of hex
token_hex = ''
sheila_token = ''
# setting the payload here - we just want the default sound, and we don't care
# what the badge says so we'll just set it to 1 each time
payload = Payload(alert="Joe is on the move!", sound="default", badge=1)

# run in an infinite loop
while True:
    # if switch is pressed (reading False - see note on GPIO.setup() above)
    if GPIO.input(button) == False:
        #logging for debugging
        print str(datetime.datetime.now()), "button pressed"
        #create the push notification object
        apns = APNs(use_sandbox=True, cert_file='push-notification-cert.pem', key_file='push-notification-key.pem')
        time.sleep(1)
        # wrapping this in a try/except block as the script was occasionally
        # barfing on a socket error
        try:
            apns.gateway_server.send_notification(sheila_token, payload)
        except socket.error, (value, message):
            print str(datetime.datetime.now()),"Socket Error", message
            #recreate the apns object
            apns = APNs(use_sandbox=True, cert_file='push-notification-cert.pem', key_file='push-notification-key.pem')
        time.sleep(1)
        #we're done with Sheila's phone, now let's do the same for mine
        try:
            apns.gateway_server.send_notification(token_hex, payload)
        except socket.error, (value, message):
            print str(datetime.datetime.now()),"Socket Error", message
        time.sleep(10)

Apple recently changed their push notification protocol to a brand new version, which I tried to connect to using a Python HTTP/2 library called Hyper. It’s labelled as a ‘very early alpha’ and it may work very well, but after a few hours of trying to create the HTTP/2 request I was still getting errors, so I used a library called PyAPNS that uses the old protocol. That worked fine, although I had to insert a few sleep() statements to avoid some socket errors. I also had to do some fandangling with the push notification certificate, splitting up the .p12 file into a certificate and key in Keychain Access to use with the script.

My Linux skills are a bit rusty, so it took a while to remember to run the script inside screen, then hit <CTRL-A> d to detach the session so it would keep running even after I logged out of SSH.

It works!

Notification on my phone’s lock screen when the mat is stepped on

For all the ugly hacking, it seems to work! Sheila and I can nip down for a quick pint to our local knowing that we can quickly return home if her dad starts wandering.

Further work

I spliced in an old telephone cable to connect the pressure mat outside the door with the Raspberry Pi beside my bed. I was in a rush so I used sellotape! Aside from tidying that up into something a little less ugly, here are the things I’d like to do next with the project:

  1. Replace the Raspberry Pi with something like an ESP8266 module that could run off a battery, so I don’t need trailing wires. This post from Stavros Korokithakis, which shows how he recreated an Amazon Dash button using an ESP8266, looks promising.
  2. Using the ESP8266 would mean that I would have to handle the push notification logic on an external server: the ESP8266 would send an HTTP request to a Flask server, which would then forward the request to Apple.
  3. Having the logic on the server means that I could log the data to a database, which means we could track how often the night interruptions were happening.
  4. Which reminds me: I should add some logic so that the script only sends push notifications and logs at night; sometimes when we’re going in and out of his room to do washing, our phones go crazy!

Getting generic Arduino boards with CH340G chips to work with OS X 10.11 El Capitan

This evening my project was to get a generic Arduino Mega board that Baz ordered from China working with my MacBook Air running El Capitan. Plugging the board in and launching the Arduino IDE yielded nothing in Tools -> Port.

After a lot of searching, it seems the problem is that the USB-to-serial chip used on this Arduino clone is a CH340G, not the usual FTDI chip that ships with official Arduino boards. This chip handles the communication between the USB connection and the serial interface to the Arduino board. Unfortunately it is not recognised by OS X.

Note the CH340G chip on the top left

There is a driver online, provided by a Chinese manufacturer, which provides a kernel extension (kext) to recognise the chip. Unfortunately the kext is not signed. Previous versions of OS X would let you disable the requirement for kext signing by typing a sudo command in Terminal, but with El Capitan you have to boot to the recovery partition to disable System Integrity Protection (SIP), which prevents unsigned kexts from loading.

Disabling SIP is a really bad idea! Fortunately there is a company called CodeBender which produces a plugin for Firefox and Chrome that gives you an in-browser Arduino IDE. They have released a signed kext for the CH340G chips, so you don’t have to mess around with the security settings on your Mac. It’s not possible to download the drivers directly, but if you install their add-on in Firefox or Chrome (it doesn’t work in Safari), you’ll get a link to install the drivers. When you next launch the Arduino IDE, you’ll see your generic Arduino listed in the Tools > Port menu.

Here’s the link to install the add-on.

 

Using the KYX-5461AS 4-digit 7-segment LED display with Arduino

This part came with an Arduino starter kit bought in preparation for the first Limerick Hackathon. It was really difficult to find information on the web about this part.

KYX-5461AS

The part description said that this was a common anode display, which is wrong: it’s a common cathode. This means that you need to connect resistors to the pins driving each digit (rather than to each segment pin, as you would with a common anode) to avoid blowing the LEDs.

Here’s the pinout:

1 Segment E
2 Segment D
3 Decimal point
4 Segment C
5 Segment G
6 Digit 4
7 Segment B
8 Digit 3
9 Digit 2
10 Segment F
11 Segment A
12 Digit 1

Segment labels (I can never seem to remember these): segment A is the top bar, then B, C, D, E and F run clockwise around the outside, with G as the middle bar.
So to display a 1 on the third digit you would drive segment pins 4 and 7 high and pull digit pin 8 low.

To display numbers on all 4 digits, you need to introduce a delay in between powering each digit in your loop() block.

To test it I rigged up an LM35 temperature sensor to an Arduino board and displayed the temperature on the display. The display was a bit dim, as the lowest-value resistors I had were 1kΩ, which was too high.


Here’s the quick code I wrote to read the temperature from a LM35 temperature sensor and display it on the KYX-5461AS display.

//reading temperature and outputting to display
int sensorPin = 0;

//display pins
int segA = 5;
int segB = 13;
int segC = 10;

int segD = 8;
int segE = 7;
int segF = 4;
int segG = 11;
int segPt = 9;

int d1 = 6;

int d2 = 3;
int d3 = 2;
int d4 = 12;

int delayTime = 900;

int counter = 0;

float temperature = 77.7;

void setup() {
// put your setup code here, to run once:
//start serial communications
Serial.begin(9600);

//set up outputs
pinMode(12, OUTPUT);
pinMode(11, OUTPUT);
pinMode(10, OUTPUT);
pinMode(9, OUTPUT);
pinMode(8, OUTPUT);
pinMode(7, OUTPUT);
pinMode(6, OUTPUT);
pinMode(5, OUTPUT);
pinMode(4, OUTPUT);
pinMode(3, OUTPUT);
pinMode(2, OUTPUT);
pinMode(13, OUTPUT);
pinMode(0,INPUT);
delay(1000);
}

void loop() {

// put your main code here, to run repeatedly:

//only read temp every 500 cycles
if(counter%500 == 0)
{
// read the pin
int reading = analogRead(sensorPin);
//convert reading to volts
float volts = (reading * 5.0);
volts /= 1024.0;
// Serial.print(volts);
// Serial.println(" v");
temperature = volts * 100.0;
Serial.print(temperature);
Serial.println(" degrees Celsius");

//test output to display
// allHigh();
//reset our counter
counter = 0;
}

counter ++;

selectDigit(1);
sendDigit(tens(temperature));
delayMicroseconds(delayTime);

digitalWrite(d1, HIGH);
selectDigit(2);
sendDigit(ones(temperature));
point();
delayMicroseconds(delayTime);

digitalWrite(d2, HIGH);
selectDigit(3);
sendDigit(points(temperature));
delayMicroseconds(delayTime);
//turn point off before the last digit

digitalWrite(d3, HIGH);
digitalWrite(segPt, LOW);
selectDigit(4);
cee();
delayMicroseconds(delayTime);
digitalWrite(d4, HIGH);



}

void allLow() {
//set every display pin low
digitalWrite(segA, LOW);
digitalWrite(segB, LOW);
digitalWrite(segC, LOW);
digitalWrite(segD, LOW);
digitalWrite(segE, LOW);
digitalWrite(segF, LOW);
digitalWrite(segG, LOW);
digitalWrite(segPt, LOW);
digitalWrite(d1, LOW);
digitalWrite(d2, LOW);
digitalWrite(d3, LOW);
digitalWrite(d4, LOW);
}

void allHigh() {
//set every display pin high
digitalWrite(segA, HIGH);
digitalWrite(segB, HIGH);
digitalWrite(segC, HIGH);
digitalWrite(segD, HIGH);
digitalWrite(segE, HIGH);
digitalWrite(segF, HIGH);
digitalWrite(segG, HIGH);
digitalWrite(segPt, HIGH);
digitalWrite(d1, HIGH);
digitalWrite(d2, HIGH);
digitalWrite(d3, HIGH);
digitalWrite(d4, HIGH);
}

void one()
{
digitalWrite(segA, LOW);
digitalWrite(segB, HIGH);
digitalWrite(segC, HIGH);
digitalWrite(segD, LOW);
digitalWrite(segE, LOW);
digitalWrite(segF, LOW);
digitalWrite(segG, LOW);
digitalWrite(segPt, LOW);

}
void two()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, HIGH);
digitalWrite(segC, LOW);
digitalWrite(segD, HIGH);
digitalWrite(segE, HIGH);
digitalWrite(segF, LOW);
digitalWrite(segG, HIGH);
digitalWrite(segPt, LOW);

}
void three()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, HIGH);
digitalWrite(segC, HIGH);
digitalWrite(segD, HIGH);
digitalWrite(segE, LOW);
digitalWrite(segF, LOW);
digitalWrite(segG, HIGH);
digitalWrite(segPt, LOW);

}
void four()
{
digitalWrite(segA, LOW);
digitalWrite(segB, HIGH);
digitalWrite(segC, HIGH);
digitalWrite(segD, LOW);
digitalWrite(segE, LOW);
digitalWrite(segF, HIGH);
digitalWrite(segG, HIGH);
digitalWrite(segPt, LOW);

}
void five()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, LOW);
digitalWrite(segC, HIGH);
digitalWrite(segD, HIGH);
digitalWrite(segE, LOW);
digitalWrite(segF, HIGH);
digitalWrite(segG, HIGH);
digitalWrite(segPt, LOW);

}
void six()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, LOW);
digitalWrite(segC, HIGH);
digitalWrite(segD, HIGH);
digitalWrite(segE, HIGH);
digitalWrite(segF, HIGH);
digitalWrite(segG, HIGH);
digitalWrite(segPt, LOW);

}
void seven()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, HIGH);
digitalWrite(segC, HIGH);
digitalWrite(segD, LOW);
digitalWrite(segE, LOW);
digitalWrite(segF, LOW);
digitalWrite(segG, LOW);
digitalWrite(segPt, LOW);

}
void eight()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, HIGH);
digitalWrite(segC, HIGH);
digitalWrite(segD, HIGH);
digitalWrite(segE, HIGH);
digitalWrite(segF, HIGH);
digitalWrite(segG, HIGH);
digitalWrite(segPt, LOW);

}
void nine()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, HIGH);
digitalWrite(segC, HIGH);
digitalWrite(segD, HIGH);
digitalWrite(segE, LOW);
digitalWrite(segF, HIGH);
digitalWrite(segG, HIGH);
digitalWrite(segPt, LOW);

}
void zero()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, HIGH);
digitalWrite(segC, HIGH);
digitalWrite(segD, HIGH);
digitalWrite(segE, HIGH);
digitalWrite(segF, HIGH);
digitalWrite(segG, LOW);
digitalWrite(segPt, LOW);

}

void cee()
{
digitalWrite(segA, HIGH);
digitalWrite(segB, LOW);
digitalWrite(segC, LOW);
digitalWrite(segD, HIGH);
digitalWrite(segE, HIGH);
digitalWrite(segF, HIGH);
digitalWrite(segG, LOW);
digitalWrite(segPt, LOW);
}

void point()
{
digitalWrite(segPt, HIGH);
}

void selectDigit(int d)
{
/*
digitalWrite(d1,HIGH);

digitalWrite(d2,HIGH);
digitalWrite(d3,HIGH);
digitalWrite(d4,HIGH);
*/

switch (d)
{
case 1:
digitalWrite(d1, LOW);
break;
case 2:
digitalWrite(d2, LOW);
break;
case 3:
digitalWrite(d3, LOW);
break;
default:
digitalWrite(d4, LOW);
break;
}
}

void sendDigit(int x)
{
switch(x)
{
case 1:
one();
break;
case 2:
two();
break;
case 3:
three();
break;
case 4:
four();
break;
case 5:
five();
break;
case 6:
six();
break;
case 7:
seven();
break;
case 8:
eight();
break;
case 9:
nine();
break;
case 10:
cee();
break;
default:
zero();
break;
}
}

int tens(float x)
{
float divided = x/10.0;
return (int)divided;
}

int ones(float x)
{
float divided = x - (10.0 * tens(x));
// Serial.print(divided);
// Serial.println(" ***ones***");
return (int)divided;
}

int points(float x)
{
float divided = x - (10.0 * tens(x)) - ones(x);
divided *= 10;
return (int)divided;
}