How to get a Double with two decimal places without rounding?

I have a Double number like:

let num = 6.1699999999999999


I want this number with two decimal places, like 6.16.


But the usual tools and functions don't do this; they round the number to 6.17.


I tried:


ceil(num*100)/100


and


String(format: "%.2f", num)



In both cases the result is 6.17.

So how can I get the number with two decimal places without rounding?

I tried to use a NumberFormatter with no luck.


Maybe this is not the perfect approach, but it is still a solution.



let number = 16.1699999999999999

let stringFromNumber = String(number)

if let dotIndex = stringFromNumber.range(of: ".")?.upperBound {

    let charactersCount = stringFromNumber.count
    //print(charactersCount) // 18

    //print(stringFromNumber[..<dotIndex]) // 16.

    let distanceToDot = stringFromNumber.distance(from: stringFromNumber.startIndex, to: dotIndex)
    //print(distanceToDot) // 3

    if charactersCount > (distanceToDot + 1) {
        // At least two digits after the dot: keep exactly two.
        let endIndex = stringFromNumber.index(dotIndex, offsetBy: 2)
        print(stringFromNumber[..<endIndex])
    } else if charactersCount > distanceToDot {
        // Only one digit after the dot: keep it.
        let endIndex = stringFromNumber.index(dotIndex, offsetBy: 1)
        print(stringFromNumber[..<endIndex])
    }

} else {
    // No decimal point at all: print as-is.
    print(stringFromNumber)
}


let number = 16.1699999999999999 // prints 16.16


let number = 16.16 // prints 16.16


let number = 16.1 // prints 16.1


let number = 16 // prints 16

Not as good, but shorter.


The problem is that there are so many digits that the value gets rounded automatically.


So, get rid of a few trailing digits:

let number  = 16.1699999999999999

let stringFromNumber = String(String(number).dropLast())
let newNumber = Double(Int(100*Double(stringFromNumber)!))/100


You could also repeat the drop, in case you have even more trailing digits.
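As a rough sketch of what that could look like (the intermediate string shown in the comment is an assumption about the current Swift runtime; only the final 16.16 matters):

let number = 16.1699999999999999
var s = String(number)        // something like "16.169999999999998"
s = String(s.dropLast())      // first drop
s = String(s.dropLast())      // repeat the drop for extra safety
let result = Double(Int(100 * Double(s)!)) / 100
print(result)                 // 16.16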

First of all, you had better not use `Double` when you want full control over rounding behavior.

Calculations on Double, or converting a Double to its decimal representation, can cause binary rounding at any time, which may not match what you expect from a decimal point of view.


let num = 6.1699999999999999
print(num == 6.17) //-> true


In Double, 6.1699999999999999 is exactly equivalent to 6.17.
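You can check this yourself; a minimal sketch (the exact digits printed by "%.20f" are an assumption and may vary slightly by platform):

import Foundation

let a = 6.1699999999999999
let b = 6.17
print(a == b)                        // true
print(a.bitPattern == b.bitPattern)  // true: the very same stored value
print(String(format: "%.20f", a))    // roughly 6.16999999999999992895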


Second, people tend to test their own code only with examples that work as expected, even when the code does not work for other inputs.


func testManuelMB(_ number: Double) {
    let stringFromNumber = String(number)

    if let dotIndex = stringFromNumber.range(of: ".")?.upperBound {

        let charactersCount = stringFromNumber.count

        let distanceToDot = stringFromNumber.distance(from: stringFromNumber.startIndex, to: dotIndex)

        if charactersCount > (distanceToDot + 1) {
            let endIndex = stringFromNumber.index(dotIndex, offsetBy: 2)
            print(stringFromNumber[..<endIndex])
        } else if charactersCount > distanceToDot {
            let endIndex = stringFromNumber.index(dotIndex, offsetBy: 1)
            print(stringFromNumber[..<endIndex])
        }

    } else {
        print(stringFromNumber)
    }
}
testManuelMB(6.1699999999999999) //-> 6.17 (Wrong!!!)
testManuelMB(16.1699999999999999) //-> 16.16

func testClaude31(_ number: Double) {
    let stringFromNumber = String(String(number).dropLast())
    let newNumber = Double(Int(100*Double(stringFromNumber)!))/100
    print(newNumber)
}
testClaude31(6.1699999999999999) //-> 6.1 (Wrong!!!, shows only one digit...)
testClaude31(16.1699999999999999) //-> 16.16


Conclusion: use Decimal instead of Double when you want to control rounding.

func testDecimal(_ number: Decimal) {
    var decNum = number
    var roundedNum = Decimal()
    NSDecimalRound(&roundedNum, &decNum, 2, .down)
    print(roundedNum)
}
testDecimal(Decimal(string: "6.1699999999999999")!) //-> 6.16
testDecimal(Decimal(string: "16.1699999999999999")!) //-> 16.16

You are right, I tried this approach first, but it did not work.


Like your answer, it uses the .decimal style and roundingMode .down, but it prints 16.17:


import Foundation

let num = 16.1699999999999999
let numberFormatter = NumberFormatter()
numberFormatter.numberStyle = .decimal
numberFormatter.usesSignificantDigits = false

// Rounding down should drop the extra digits without rounding up.
numberFormatter.roundingMode = .down
numberFormatter.minimumFractionDigits = 2
numberFormatter.maximumFractionDigits = 2
let number = NSNumber(value: num)

if let stringFromNumber = numberFormatter.string(from: number) {
    print(stringFromNumber)
}


Thanks for sharing your answer.

I also found this other way:


The scale argument is the number of decimals you want.


func testNumberAsString(_ numberAsString: String) -> NSDecimalNumber {
    let num = NSDecimalNumber(string: numberAsString)
    let behaviour = NSDecimalNumberHandler(roundingMode: .down, scale: 2,
                                           raiseOnExactness: false, raiseOnOverflow: false,
                                           raiseOnUnderflow: false, raiseOnDivideByZero: false)
    let numRounded = num.rounding(accordingToBehavior: behaviour)

    return numRounded
}

print(testNumberAsString("6.1699999999999999")) //-> 6.16
print(testNumberAsString("16.1699999999999999")) //-> 16.16

I looked again at it and have another simple solution (tested).


// Problem is that the number 6.1699999999999999 IS in fact 6.17

// You can see it in a playground: type let val = 6.1699999999999999 and you get 6.17

// So you have to work on the String itself, not on the number


let val = 6.1699999999999999    // just to check that this val is 6.17
var s = "6.1699999999999999"
let resultOriginal = Double(Int(100*Double(s)!))/100    // result without dropping, for comparison

// Too many digits after the dot, beyond Double precision:
// drop the last character until s == String(Double(s)!)
while s != String(Double(s)!) {
    s = String(s.dropLast())
}
let goodResult = Double(Int(100*Double(s)!))/100

print(resultOriginal, goodResult)

Log:

6.17 6.16


You can test by adding extra 9s at the end.

Just for completeness (even though I prefer your solution).


You're right, the version in my first post was wrong (I apologize for not testing it properly); that's why I posted the second one, which works (notably with the example you provided):

var s = "6.1699999999999999"
let resultOriginal = Double(Int(100*Double(s)!))/100    // To see the effect of dropLast

while s != String(Double(s)!) { // too many digits after dot, beyond Double.ulpOfOne precision dropLast until s == String(Double(s))
    s = String(String(s).dropLast())
}
let goodResult = Double(Int(100*Double(s)!))/100

print(resultOriginal, goodResult)

In any case, the number is entered as a String, not a value.
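If the value really does arrive as a String, one possible approach (just a sketch; the helper name truncateDecimalString is made up here, not from any answer above) is to truncate the text itself, so Double's binary rounding never gets involved:

func truncateDecimalString(_ s: String, decimals: Int) -> String {
    // Keep everything up to the dot plus at most `decimals` digits after it.
    guard let dot = s.firstIndex(of: ".") else { return s }
    let end = s.index(dot, offsetBy: decimals + 1, limitedBy: s.endIndex) ?? s.endIndex
    return String(s[..<end])
}

print(truncateDecimalString("6.1699999999999999", decimals: 2))  // 6.16
print(truncateDecimalString("16.1699999999999999", decimals: 2)) // 16.16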

I just created a function that can help... 3 years later, but I hope it helps someone looking for the answer:



func convert(_ a: Double, maxDecimals max: Int) -> Double {
    // Split the textual representation into integer and decimal parts.
    let stringArr = String(a).split(separator: ".")
    let decimals = Array(stringArr[1])
    var string = "\(stringArr[0])."

    // Copy at most `max` decimal digits, without rounding.
    var count = 0
    for n in decimals {
        if count == max { break }
        string += "\(n)"
        count += 1
    }

    let double = Double(string)!
    return double
}
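Illustrative usage (note two assumptions: the value's String form contains a dot rather than scientific notation, and the input is not something like 6.1699999999999999, which is already 6.17 as a Double, so no value-based approach can recover 6.16):

print(convert(16.1699999999999999, maxDecimals: 2)) // 16.16
print(convert(3.14159, maxDecimals: 2))             // 3.14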

Fast forward: when you mix C++ with Swift, there's no precision issue on the C++ side.
