Historically, strings in programming languages were just collections of bytes, that is, binary numbers. In these collections of bytes, each byte corresponded neatly to one character. A standard system, known as ASCII (the American Standard Code for Information Interchange), was created; it mapped these raw byte values to human-readable characters. So it was decided that, for example, 65 would be an uppercase A, 119 would be a lowercase w, and so on. The string abc123, then, would be composed of the following bytes:
97, 98, 99, 49, 50, 51
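As a quick check, here is a minimal sketch (using Python purely for illustration; the byte values are what matter, not the language) that encodes abc123 as ASCII and prints the resulting bytes:

```python
# Encode the string as ASCII and inspect the raw byte values.
text = "abc123"
data = text.encode("ascii")  # a bytes object: b"abc123"

# Iterating over a bytes object yields integers, one per byte.
print(list(data))  # [97, 98, 99, 49, 50, 51]
```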
This simple one-to-one mapping from bytes to characters was convenient. Operations that, from a programmer’s perspective, worked with characters could, under the hood, work with bytes: “Give me the first character of this string” could simply mean “give me the first byte.”
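To make that concrete (again a sketch in Python, chosen only for illustration): with a one-byte-per-character encoding, indexing the raw bytes directly yields the first character’s code, and the byte length equals the character length.

```python
data = b"abc123"

# "Give me the first character" is just "give me the first byte".
first_byte = data[0]          # 97
first_char = chr(first_byte)  # "a"

# Likewise, the length in characters equals the length in bytes.
print(first_byte, first_char, len(data))  # 97 a 6
```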