Chris Ladd

Selected Portfolio

Apps

I design, program, and publish apps with my one-person company, Better Notes. Here are a few I’m most proud of:

I think of my engineering work as gardening: tending and refining modules and frameworks for the long haul. Each of my apps is built on top of dozens of internal frameworks, many of which will likely outlast the apps that use them.

Below are a few of my favorite libraries, with code examples.

Rohr

Rohr is a music theory library, named after my college music professor. It understands notes, intervals, chords, scales, and more.

import Rohr

// Notes & Intervals
let note: Note = .G
let note2 = note.transposed(up: .P5) // .D

// Chords
let chord = Chord.maj7

chord.displayName // "maj7"
chord.intervals // [.P1, .M3, .P5, .M7]

let chordInstance = chord.instance(with: .C)
chordInstance.notes  // [.C, .E, .G, .B]

// Progressions
let progression = Progression([.ii, .V, .I])
let tonic = Chord.major.instance(with: .C)
let progInstance = progression.instantiate(chord: tonic, as: .I)
progInstance?.chords // Dm, G, C

Having an internal library that understands the building blocks of music, and can grow with the needs of my products, gives me a stable foundation for my family of music apps.

Hendrix

Hendrix is to the guitar what Rohr is to music. It understands strings and frets, can generate voicings for chords, and can encode and decode chords to and from a bespoke compressed format called gtx.

import Hendrix

let chord = Hendrix.chord(.Bm)
let frets = chord.fingerings.map { $0.fret } // [-1, 2, 4, 4, 3, 2]

if chord.isTransposable {
    let gChord = chord.transpose(to: .G) // result: Gm, at 10th fret
}

let collection = Hendrix.canonicalChords(root: .D, kind: .minor)
let openDm = collection?.preferred?.first // returns an open Dm chord
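The gtx round trip might look something like this. This is only a sketch: the `gtx` property and the `chord(gtx:)` decoder are hypothetical names standing in for whatever the real encode/decode API looks like.

```swift
import Hendrix

// encode a chord into its compact gtx representation.
// `gtx` is a hypothetical property name.
let bm = Hendrix.chord(.Bm)
let encoded = bm.gtx

// decode the compressed representation back into a full chord.
// `chord(gtx:)` is likewise hypothetical.
let decoded = Hendrix.chord(gtx: encoded)
```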

The library can generate chords in any tuning, and comes with a built-in canon of over 14,000 hand-picked chords. I worked for over a year with a Berklee professor to build internal tools to generate, sort, and filter voicings for each of more than 80 chord types in standard tuning.

Along the way, the rules we built into the library to pre-filter chords got good enough that my alternate tunings app was able to generate high-quality voicings on the fly.
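As a sketch of what that on-the-fly generation might look like (the `Tuning` type and the `voicings(for:tuning:)` call are my own hypothetical names, not the library's confirmed API):

```swift
import Hendrix

// an open-D tuning, low string to high.
// `Tuning` is a hypothetical type for illustration.
let openD = Tuning(strings: [.D, .A, .D, .Fsharp, .A, .D])

// ask the library to generate and pre-filter voicings on the fly.
// `voicings(for:tuning:)` is likewise hypothetical.
let voicings = Hendrix.voicings(for: .D, tuning: openD)

// the pre-filtering rules rank the results, so the first voicing
// is already a playable, high-quality shape
let best = voicings.first
```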

GuitarUI

GuitarUI is a series of components to represent chords and scales visually. Fretboards, diagrams, controls, and more.

A Fretboard, for example, is an extensible view that represents a guitar fretboard:

import GuitarUI

// create a fretboard
let fretboard = Fretboard(frame: bounds)
addSubview(fretboard)

Layout providers allow arbitrary visual information to be placed above specific strings and frets, similar to UICollectionView.

let fingeringLayout = FingeringLayout()
fingeringLayout.dotContentStyle = .fingerInterval
fingeringLayout.displayStyle = .monochromeRound
fretboard.addLayoutProvider(fingeringLayout)

let capoLayout = CapoLayout()
fretboard.addLayoutProvider(capoLayout)

To show a chord, for example, you update the providers, then ask the fretboard to lay itself out:

// here, a Gmaj7 chord, capo'd at the 5th fret
let chord = Hendrix.chord(.Gmaj7).capo(5)

// update our providers
capoLayout.fret = chord.capo

// chord conforms to the `FingeringCollection` protocol. 
// the same layout can be used for scale patterns,
// or other arbitrary collections of fingerings
fingeringLayout.collection = chord 

// lastly, update the fretboard
fretboard.setNeedsLayout()
fretboard.scrollTo(fret: chord.lowestFingeredFret)

Flexible components like Fretboard allow component reuse within and across apps. And lower-level frameworks like Rohr and Hendrix allow clients to easily integrate and depend on GuitarUI.

Holland

Named for my music tech professor, Holland provides tools for audio playback and analysis.

Among its tools is a sampler that plays sf2 soundfont files. This powers the guitar components in GuitarUI, and the piano in PianoUI, for example:

let sampler = Sampler()
sampler.soundFontURL = Bundle.main.url(forResource: "guitar", withExtension: "sf2")

let note = NoteInstance(note: .C, octave: 4)
sampler.play(note: note.pitch)

Looper makes it trivial to loop audio, and keep UI elements in sync.

let looper = Looper()
looper.setItemWith(url: URL(string: "http://www.example.com/mygreatloop.mp3")!)
looper.onSecondsElapsed = { seconds in
    // simple seconds handler
}

looper.periodicUpdateInterval = 0.1
looper.onPeriodicUpdate = { totalTime, positionInLoop in
    // update UI for custom, fine-tuned intervals
}

looper.play()

AudioGraph vastly simplifies real-time audio processing, maintaining an internal AVAudioEngine and handling setup, interruptions, and tear-down for clients.

let audioGraph = AudioGraph()
audioGraph.startEngineIfNecessary()

audioGraph.getInput(clientId: "my_client") { pcmBuffer, time in
    // process audio buffers...
}

Perfect

Perfect builds on Holland by supplying configurable pitch and chord detection. These power features like the tuners inside my apps, as well as chord flashcards and games that listen as students play.

For example, to find the fundamental frequency of the current microphone input:

// set up a pitch detector
let pitchDetector = MonoPitchDetector()

// get input
audioGraph.getInput(clientId: "pitch_detector") { pcmBuffer, time in
    
    // append the buffer
    pitchDetector.append(buffer: pcmBuffer)

    // and check for a detected frequency
    if let freq = pitchDetector.frequency,
        let note = pitchDetector.note {
        
        // pitch detector has detected a fundamental frequency
        // and found the closest musical note to that frequency!
        //
        // generally, detection is run on a timer, and not each time 
        // buffers arrive to both smooth the visual result, 
        // and conserve resources
    }
}

Perfect also includes support for polyphonic pitch detection, matching chords of interest:

let chordDetector = ChordDetector()

let targetChords = [
    Hendrix.chord(.C),
    Hendrix.chord(.F),
    Hendrix.chord(.G),
    Hendrix.chord(.Am)
]

// listen for the target chords.
chordDetector.listen(for: targetChords) { results in
    guard let results = results else { return }
    
    // sometimes, clients simply want to know if
    // a given chord was detected.
    if results.contains(chord: Hendrix.chord(.C)) {
        // C was detected!
    }
    
    // sometimes clients want to inspect all matched
    // chords for themselves
    for result in results {
        print("\(result.chord.name) - \(result.confidence)")
    }
}

// get mic input, and process buffers as they come in
audioGraph.getInput(clientId: "chord_detector") { pcmBuffer, time in
    // process pcm buffers. the detector's handler will be called
    // with any matches
    chordDetector.process(buffer: pcmBuffer)
}

NYTimes for iOS

I started on the newsreader and grew to be the primary engineer responsible for our iPad app. I wrote the app's first rich-text layout in Core Text, worked with the newsroom to integrate new interactive modules for election night, and built deeply (perhaps too deeply) complex logic for laying out news packages.

iOS 7 Redesign

We had very little time to turn around a major overhaul of the NYTimes apps when Apple announced iOS 7 in June of 2013. I was the lead engineer on iPad at the time.

We were able to transform the iPad app into a continuous, scrolling section front, with expandable sections that aimed for the serendipity of the printed paper alongside the depth and immediacy of an online news source.

I was proud of the flexibility we were able to provide the newsroom—news packages could grow or shrink to fit major news events, and the graphics and interactives teams were able to build completely custom modules to fit seamlessly with the native section fronts.

Cooking

In 2014, I joined Brian Hamman’s new Beta group as the first iOS engineer on a new app that would be called Cooking, digitizing The Times’ archive of (then) 16,000 recipes. We were a very small team; for several months, Cooking for iOS consisted of just me and a designer, which let us iterate very quickly. A big part of our mandate was to embrace native interactions and animations.

We were able to design, prototype, test, build, and ship the app in just over eight months.

Mobile 2

Mobile 2 was a project led by Alex Hardiman and Sam Dolnick. Our mandate was, through user research, interviews, design, and prototypes, to find what The Times’ mobile apps could and should be.

We interviewed users, ran fully newsroom-staffed prototypes, and met with product owners at Google, Facebook, Airbnb, Twitter, and others.

I built a system called “Bricks” to power our prototypes—feed-driven, atomic pieces of layout that could be recombined and configured from the server.
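A brick might be modeled roughly like this. The schema below is mine, invented to illustrate the idea; the project's actual format surely differed.

```swift
import Foundation

// a hypothetical sketch of a "brick": an atomic, server-configured
// piece of layout. Field names are illustrative, not the real schema.
struct Brick: Codable {
    enum Kind: String, Codable {
        case text, image, video
    }
    let kind: Kind
    let style: String?      // e.g. "headline", "summary"
    let content: String     // text, or a URL for media
    let children: [Brick]?  // bricks compose into screens
}

// bricks arrive in a feed from the server...
let json = """
{ "kind": "text", "style": "headline", "content": "Example headline" }
""".data(using: .utf8)!

// ...and are decoded and mapped to native views on each platform
let brick = try JSONDecoder().decode(Brick.self, from: json)
```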

Bricks

Following the Mobile 2 project, I was inspired by the technology I had built our prototypes with: JSON building blocks for individual components like text, images, and video, composed to make screens and apps.

I was given a small team of developers (myself on iOS, a backend engineer, an Android engineer, and a mobile-web engineer), and we ran a “tiger team,” competing against React Native and hybrid (e.g., HTML-based) apps as ways The Times could stay agile and innovate cross-platform.

We built prototypes that recreated much of The Times’ existing newsreader across iOS, Android, and the mobile web.

Design

Design work had always been an interest, and a challenge, for me. Throughout my favorite projects at The Times, the design and definition of the product had always been my favorite part of the process.

Improbably, after mobile2 and the Bricks project wrapped up, I asked for, and was given, a trial period as a Product Designer.

I turned out to be coachable and capable. I became the product designer for the newly formed Mobile Homescreen team, designing components for our mobile newsreader platforms across iOS, Android, and the mobile web. I designed features balancing requests from product, advertising, and the newsroom, and drank in feedback from other designers during our daily “show and tells,” learning as much from the design problems that my colleagues struggled with as I did from their feedback on my own.

For my last assignment on that team, I was drafted to help redesign The Times’ desktop homepage, joining another small interdisciplinary team with members from the newsroom, product, project management, design, and engineering.