That result happens for the same reason that 1 / 3 * 3 != 1 if you work in decimal with limited precision: 1 / 3 rounds to .333, and .333 * 3 = .999, which is not 1.00.
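Here is a minimal Python sketch of that decimal analogy, using the standard-library `decimal` module (my choice for illustration, not anything from the original question) to pin precision at three significant digits:

```python
from decimal import Decimal, getcontext

getcontext().prec = 3            # keep only three significant digits, like .333

third = Decimal(1) / Decimal(3)  # rounds to Decimal('0.333')
print(third * 3)                 # Decimal('0.999'), not 1
```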
0.1 is the same as 1 / 10, which does not have a finite representation in binary notation, just as 1 / 3 does not have a finite representation in binary or decimal notation.
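To see the binary side of this, a small Python sketch (assuming CPython's usual IEEE 754 double-precision floats) prints the exact value the literal 0.1 actually stores, which is the nearest representable double rather than one tenth:

```python
from decimal import Decimal

# Converting the float 0.1 to Decimal exposes its exact stored value,
# which is slightly more than 1/10:
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625

# The hex form shows the repeating binary pattern ...9999... cut off
# at the 53-bit precision of a double:
print((0.1).hex())
# 0x1.999999999999ap-4
```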