How many bits must be “flipped” (i.e., changed from 0 to 1 or from 1 to 0) in order to capitalize a lowercase ‘a’ that’s represented in ASCII?
Do you know the decimal value of each of the characters?
Take the decimal equivalent of each ASCII character, convert it to binary, and compare the results.
It's more useful to start from the hex values:

A <=> 41 <=> 0100 0001    a <=> 61 <=> 0110 0001
B <=> 42 <=> 0100 0010    b <=> 62 <=> 0110 0010
Z <=> 5A <=> 0101 1010    z <=> 7A <=> 0111 1010

Just toggle bit 5 (the 0x20 bit) from 0 to 1 to transform uppercase to lowercase, or toggle bit 5 from 1 to 0 to transform lowercase to uppercase. That's all: exactly one bit changes.
Going from lowercase to uppercase in ASCII is equivalent to subtracting 32 from the decimal value of the lowercase letter: 'a' = 97, and 97 - 32 = 65 = 'A'. In general, subtracting 00100000 in binary could borrow from a higher bit whenever the bit in the 32's place is 0, and a borrow could in turn flip further bits, so at most the 3 most significant bits could change. However, the lowercase alphabet in ASCII occupies only 97-122 in decimal, and every value in that range begins with 011 in binary, so the 32's-place bit is always 1 and the subtraction only clears that single bit (the same result as XOR with 00100000). So not only for 'a', but for all 26 letters of the English alphabet, exactly 1 bit must be flipped to capitalize it in ASCII.