Many fields of study invent their own terminology by co-opting words from general use. Using these terms differently does not make them "wrong"; it just makes them technical jargon specific to the field.
Within the field of computer science, one kilobyte = 1024 bytes. This isn't wrong; in fact, the other view (1 KB = 1000 bytes) IS wrong, and on several levels.
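To make the arithmetic concrete, here is a minimal sketch of a size formatter in the 1024-based convention described above. The function name and unit labels are my own illustration, not from any particular library:

```python
def format_size(num_bytes: int) -> str:
    """Format a byte count using binary (1024-based) units,
    per the computer-science convention that 1 KB = 1024 bytes."""
    units = ["bytes", "KB", "MB", "GB", "TB"]
    size = float(num_bytes)
    for unit in units:
        # Stop dividing once the value fits in the current unit
        # (or we have run out of larger units).
        if size < 1024 or unit == units[-1]:
            return f"{size:g} {unit}"
        size /= 1024

print(format_size(1024))     # → 1 KB
print(format_size(1536))     # → 1.5 KB
print(format_size(1048576))  # → 1 MB
```

Under this convention a "1 KB" file is exactly 1024 bytes, and each successive unit is a factor of 1024 larger, not 1000.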
First, it attempts to use the wrong meaning of an overloaded word, rather than the one that is correct in context. You don't complain when a physicist talks about the color or spin of a fundamental particle, even though you know the particle has no "color" and isn't literally "spinning". The physicist isn't wrong; he's just using terms correctly in a physics context, where their meaning differs from other contexts. Or is the assertion here that physicists can co-opt words for their own meaning, but computer scientists are "wrong" for doing the same?
Second, it mistakes "byte" for an SI unit. According to SI, a kilometer is 1000 meters. That much is true. However, it's absolutely false that, according to SI, a kilobyte is 1000 bytes. SI has no more to say on how many bytes are in a kilobyte than it has to say on how many feet are in a mile, since neither miles nor bytes are SI units. Incidentally, kilobyte has traditionally been abbreviated KB, with a capital K. The SI "kilo" prefix is abbreviated with a small "k", but since the "kilo" in kilobyte is NOT the SI prefix, this is irrelevant.