
Add / subtract characters as Int in Swift

I need to implement an algorithm to check if an input is valid by calculating a modulo of a String.

The code in Kotlin:

private val facteurs = arrayOf(7, 3, 1)

private fun modulo(s: String): Int {
    var result = 0
    var i = -1
    var idx = 0
    for (c in s.toUpperCase()) {
        val value:Int
        if (c == '<') {
            value = 0
        } else if (c in "0123456789") {
            value = c - '0'
        } else if (c in "ABCDEFGHIJKLMNOPQRSTUVWXYZ") {
            value = c.toInt() - 55
        } else {
            throw IllegalArgumentException("Unexpected character: $c at position $idx")
        }
        i += 1
        result += value * facteurs[i % 3]
        idx += 1
    }
    return result % 10
}

This implies doing math operations on the characters.

Is there an elegant way to do this in Swift 3 and 4?

I tried some cumbersome constructs like this:

value = Int(c.unicodeScalars) - Int("0".first!.unicodeScalars)

But it does not even compile. I'm currently using Swift 4 with Xcode 9, but a Swift 3 answer is welcome too.

Xvolks asked Sep 13 '25

2 Answers

You can enumerate the unicodeScalars view of a string together with the running index, use switch/case pattern matching, and access the numeric .value of the unicode scalar:

func modulo(_ s: String) -> Int? {
    let facteurs = [7, 3, 1]
    var result = 0
    for (idx, uc) in s.uppercased().unicodeScalars.enumerated() {
        let value: UInt32
        switch uc {
        case "<":
            value = 0
        case "0"..."9":
            value = uc.value - UnicodeScalar("0").value
        case "A"..."Z":
            value = uc.value - UnicodeScalar("A").value + 10
        default:
            return nil
        }
        result += Int(value) * facteurs[idx % facteurs.count]
    }
    return result % 10
}

This compiles with both Swift 3 and 4. Of course you could also throw an error instead of returning nil for invalid input.

Note that "<", "0", "9" etc. in the switch statement are inferred from the context as UnicodeScalar, not as String or Character, therefore "0"..."9" (in this context) is a ClosedRange<UnicodeScalar> and uc can be matched against that range.
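As a quick sanity check, the function above can be exercised with a few hand-computed inputs (this assumes the `modulo` function above is in scope; the expected results follow the 7-3-1 weighting from the question):

```swift
// Worked by hand with weights [7, 3, 1]:
// "AB<" → 10*7 + 11*3 + 0*1 = 103 → 103 % 10 = 3
// "120" → 1*7 + 2*3 + 0*1  = 13  → 13 % 10 = 3
print(modulo("AB<") as Any)  // Optional(3)
print(modulo("120") as Any)  // Optional(3)
print(modulo("1?") as Any)   // nil (unexpected character)
```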

Martin R answered Sep 16 '25


Something like this works for me:

"A".utf16.first! + 2 // comes out to 67

Be careful with the force-unwrap (!): it will crash on an empty string.

If you need the scalars value you can do

"A".unicodeScalars.first!.value + 2

More reading can be done in The Swift Programming Language book (the "Strings and Characters" chapter).

For the c Character type value you could do this:

String(c).unicodeScalars.first!.value + 2
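The forced unwraps above can be avoided with a small optional-returning helper (a sketch; the name `scalarValue` is my own, not a standard-library API):

```swift
// Returns the numeric value of a Character's first unicode scalar.
// (Hypothetical helper, not part of the standard library.)
func scalarValue(of c: Character) -> UInt32? {
    return String(c).unicodeScalars.first?.value
}

scalarValue(of: "A")  // Optional(65)
scalarValue(of: "0")  // Optional(48)
```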

Here is an attempt to mod the function:

import Foundation // String.contains(_:) comes from Foundation in Swift 3

func modulo(s: String) -> Int? {
    let factors = [7, 3, 1]
    var result = 0
    for (i, c) in s.uppercased().characters.enumerated() {
        let char = String(c)
        let val: Int
        if char == "<" {
            val = 0
        } else if "0123456789".contains(char) {
            val = Int(char.unicodeScalars.first!.value - "0".unicodeScalars.first!.value)
        } else if "ABCDEFGHIJKLMNOPQRSTUVWXYZ".contains(char) {
            val = Int(char.unicodeScalars.first!.value - 55)
        } else {
            return nil
        }
        result += val * factors[i % 3]
    }
    return result % 10
}

This is in Swift 3. In Swift 4 you can iterate over the string directly, without going through the characters view, because String is itself a Collection of Characters.
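For reference, here is a sketch of the same function in Swift 4, iterating over the String directly (String is a Collection of Characters in Swift 4, so no `.characters` view is needed):

```swift
func modulo(_ s: String) -> Int? {
    let factors = [7, 3, 1]
    var result = 0
    // In Swift 4, a String can be enumerated directly as Characters.
    for (i, c) in s.uppercased().enumerated() {
        let val: Int
        if c == "<" {
            val = 0
        } else if let d = Int(String(c)) {
            val = d                       // "0"..."9" → 0...9
        } else if ("A"..."Z").contains(String(c)) {
            val = Int(String(c).unicodeScalars.first!.value) - 55  // "A" → 10, ... "Z" → 35
        } else {
            return nil
        }
        result += val * factors[i % factors.count]
    }
    return result % 10
}
```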

Donovan King answered Sep 16 '25