Christopher Stoll

Minimum Functions in Swift Compared

So, Apple has just released a new programming language called Swift, and naturally I wanted to check it out. I started by writing example programs in the playground as I read through the Swift iBook. That was fun and all, but I wanted to compile something and create an actual program. Apple said that Swift was fast, so I also wanted to benchmark it against C, which is my de facto point of reference for fast. I recently wrote a post where I compared minimum functions in C, so I thought it would be trivial to port that code to Swift and test it out. The porting was easy, and the functions ran about 20 times slower than their C equivalents (which is actually not bad; in almost all cases that is a good trade-off for reduced development time), but I found something else that surprised me: the built-in min function was dramatically slower than the ones I hand coded. Whereas the test function using my hand-coded min3a() took, on average, 3,446 microseconds in C and 68,085 microseconds in Swift, the test function using the built-in min() took 16.87 seconds to run.

First, it’s important to remember that this is still beta software; I’m sure Apple will make optimizations before it reaches gold master. Second, each test calls its minimum function 1,200,000 times (100,000 loops of twelve calls each), which is far more than anyone is likely to do in production. Even with those caveats, I was still curious about what was taking so much time.

I thought of two possible reasons. The first possibility is that the use of generics (templates, in C++ parlance) was costing extra clock cycles; as you would expect from code ported from C, my version can only handle a single data type, integers. The second possibility I considered was that the overhead of handling a variable number of parameters was causing the slowdown; my hand-coded solution takes exactly three parameters, no more, no less. So, I wrote additional functions that find the minimum using generics and using variadic parameters.
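As a quick illustration of what the generics hypothesis is about (a throwaway example, not part of the benchmark code): a function written against the Comparable protocol covers any conforming type with a single definition, whereas my hand-coded version accepts only Int.

// One generic definition covers Int, Double, and any other Comparable type
func smaller<T: Comparable>(a: T, b: T) -> T {
    if a < b {
        return a
    }
    return b
}

println(smaller(2, 7))        // prints 2
println(smaller(3.5, 1.25))   // prints 1.25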

It turns out that using generics roughly doubles the amount of time required, and using variadic parameters makes the amount of processing time skyrocket. The variadic version of the minimum function always takes O(n) time, since it must check every value it is passed, and Swift also makes variadic arguments available inside the function as an array, so every call pays for building that array as well; even so, that probably does not account for the entire increase. Below is a table comparing the results.

Method                   Time (microseconds)
C, min3a                 3,446
C, min3b                 4,128
Swift, min3a             68,085
Swift, min3b             73,437
Swift, min3a_generics    130,801
Swift, min3a_variadic    26,936,408
Swift, built-in min()    16,871,645

My conclusion is that if a function is going to be called frequently, for example inside a hot loop, then it might be worthwhile to avoid variadic functions, even built-in ones. In general, though, this performance difference, as drastic as it looks, is probably not going to be noticed. Again, all of this should be re-tested once Xcode 6 is out of beta.
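If the built-in function really does need to be used in a hot loop, one possible workaround is to lean on the two-argument form of min, which as far as I can tell is a separate, non-variadic overload: a three-way minimum can then be composed from two nested calls. A quick sketch of the idea (I have not timed this variant):

let a = 3, b = 1, c = 2
// Nesting two-argument calls should sidestep the variadic overload of min()
let smallest = min(a, min(b, c))
println(smallest)   // prints 1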

Below is the code I used to perform the tests. I compiled it into a standalone command-line program with the following command (after installing the Xcode 6 beta and switching to the latest command line tools); the full source of min.swift follows the command:

xcrun swift min.swift \
-sdk $(xcrun --show-sdk-path --sdk macosx)

import CoreFoundation

// Each test is timed TEST_ITERATIONS times; each test function loops TEST_LOOPS times
let TEST_ITERATIONS = 100
let TEST_LOOPS = 100_000

// Method A: minimum of three Ints via nested comparisons
func min3a(a: Int, b: Int, c: Int) -> Int {
    if a < b {
        if a < c {
            return a
        } else {
            return c
        }
    } else {
        if b < c {
            return b
        } else {
            return c
        }
    }
}

// Method A again, but generic over any Comparable type
func min3a_generics<T: Comparable>(a: T, b: T, c: T) -> T {
    if a < b {
        if a < c {
            return a
        } else {
            return c
        }
    } else {
        if b < c {
            return b
        } else {
            return c
        }
    }
}

// Variadic version: linear scan over however many Ints are passed (returns Int.max if none)
func min3a_variadic(possibleValues: Int...) -> Int {
    var minValue = Int.max
    for possibleValue in possibleValues {
        if possibleValue < minValue {
            minValue = possibleValue
        }
    }
    return minValue
}

// Method B: minimum of three Ints by tracking a running result
func min3b(a: Int, b: Int, c: Int) -> Int {
    var result = a
    if b < result {
        result = b
    }
    if c < result {
        return c
    }
    return result
}

// Exercises min3a: twelve calls per loop iteration, with sanity checks on the results
func testA() {
    var result = 0
    for i in 0..TEST_LOOPS {
        result = 0
        result += min3a(1, 2, 3)
        result += min3a(1, 3, 2)
        result += min3a(2, 1, 3)
        result += min3a(2, 3, 1)
        result += min3a(3, 1, 2)
        result += min3a(3, 2, 1)
        
        if (result != 6) {
            println("Bad min (method A, group 1, loop \(i)")
        }
        
        result = 0
        result = min3a(Int.max-2, Int.max-1, Int.max)
        result = min3a(Int.max-2, Int.max, Int.max-1)
        result = min3a(Int.max-1, Int.max-2, Int.max)
        result = min3a(Int.max-1, Int.max, Int.max-2)
        result = min3a(Int.max, Int.max-2, Int.max-1)
        result = min3a(Int.max, Int.max-1, Int.max-2)
        
        if (result != Int.max-2) {
            println("Bad min (method A, group 2, loop \(i)")
        }
    }
}

// Exercises min3b: same twelve calls per loop as testA
func testB() {
    var result = 0
    for i in 0..TEST_LOOPS {
        result = 0
        result += min3b(1, 2, 3)
        result += min3b(1, 3, 2)
        result += min3b(2, 1, 3)
        result += min3b(2, 3, 1)
        result += min3b(3, 1, 2)
        result += min3b(3, 2, 1)
        
        if (result != 6) {
            println("Bad min (method B, group 1, loop \(i)")
        }
        
        result = 0
        result = min3b(Int.max-2, Int.max-1, Int.max)
        result = min3b(Int.max-2, Int.max, Int.max-1)
        result = min3b(Int.max-1, Int.max-2, Int.max)
        result = min3b(Int.max-1, Int.max, Int.max-2)
        result = min3b(Int.max, Int.max-2, Int.max-1)
        result = min3b(Int.max, Int.max-1, Int.max-2)
        
        if (result != Int.max-2) {
            println("Bad min (method B, group 2, loop \(i)")
        }
    }
}

// Exercises the built-in min(): same twelve calls per loop as testA and testB
func testC() {
    var result = 0
    for i in 0..TEST_LOOPS {
        result = 0
        result += min(1, 2, 3)
        result += min(1, 3, 2)
        result += min(2, 1, 3)
        result += min(2, 3, 1)
        result += min(3, 1, 2)
        result += min(3, 2, 1)
        
        if (result != 6) {
            println("Bad min (method C, group 1, loop \(i)")
        }
        
        result = 0
        result = min(Int.max-2, Int.max-1, Int.max)
        result = min(Int.max-2, Int.max, Int.max-1)
        result = min(Int.max-1, Int.max-2, Int.max)
        result = min(Int.max-1, Int.max, Int.max-2)
        result = min(Int.max, Int.max-2, Int.max-1)
        result = min(Int.max, Int.max-1, Int.max-2)
        
        if (result != Int.max-2) {
            println("Bad min (method C, group 2, loop \(i)")
        }
    }
}

// Time each test TEST_ITERATIONS times and report the average
var totalTime = 0.0
for i in 0..TEST_ITERATIONS {
    let timeStart = CFAbsoluteTimeGetCurrent()
    testA()
    let timeStop = CFAbsoluteTimeGetCurrent()
    // CFAbsoluteTimeGetCurrent() returns seconds, so convert to microseconds
    let fullTime = (timeStop - timeStart) * 1_000_000
    totalTime += fullTime
    println("Method A: \(fullTime) us")
}
println("A Avg: \(totalTime / Double(TEST_ITERATIONS)) us")

totalTime = 0.0
for i in 0..TEST_ITERATIONS {
    let timeStart = CFAbsoluteTimeGetCurrent()
    testB()
    let timeStop = CFAbsoluteTimeGetCurrent()
    let fullTime = (timeStop - timeStart) * 1_000_000
    totalTime += fullTime
    println("Method B: \(fullTime) us")
}
println("B Avg: \(totalTime / Double(TEST_ITERATIONS)) us")

totalTime = 0.0
for i in 0..TEST_ITERATIONS {
    let timeStart = CFAbsoluteTimeGetCurrent()
    testC()
    let timeStop = CFAbsoluteTimeGetCurrent()
    let fullTime = (timeStop - timeStart) * 1_000_000
    totalTime += fullTime
    println("Method C: \(fullTime) us")
}
println("C Avg: \(totalTime / Double(TEST_ITERATIONS)) us")
Published: 2014-06-05
Tags: C, Swift, Minimum