In computer programming, computer code refers to the set of instructions, or a system of rules, written in a particular programming language. The term is also used for the form source code takes after it has been processed by a compiler and made ready to run on the computer.
Binary Coded Decimal (BCD) code is one of the early computer codes. In BCD, each digit of a decimal number is represented by its binary equivalent instead of converting the entire decimal value to a binary number. It is a form of binary encoding in which each decimal digit is represented by its own group of bits. The encoding can use either 4 bits or 8 bits per digit (4-bit encoding is usually preferred). BCD is important because each digit can be manipulated as a separate sub-circuit. For example, 0.2 in binary is 0.001100... (a repeating fraction), whereas in BCD it is exactly 0.0010. BCD therefore avoids such fractional errors and is widely used in large financial calculations.
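The digit-by-digit encoding described above can be sketched in a few lines of Python. The function name `to_bcd` is illustrative, not part of any standard library:

```python
def to_bcd(number_str):
    """Encode each decimal digit of a numeric string as a 4-bit BCD nibble."""
    return " ".join(format(int(d), "04b") for d in number_str if d.isdigit())

# Each decimal digit maps to its own 4-bit binary equivalent;
# the decimal point itself is not encoded, only the digits around it.
print(to_bcd("0.2"))   # -> 0000 0010
print(to_bcd("59"))    # -> 0101 1001
```

Note how 0.2 encodes exactly as the digits 0 and 2, with no repeating binary fraction involved.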
Another widely used computer code is the American Standard Code for Information Interchange (ASCII). The American Standards Association (which later became the American National Standards Institute, ANSI) published the ASCII standard in 1963. Today, ASCII is one of the most popular and widely supported character encoding standards.
ASCII comes in two variants: ASCII-7 and ASCII-8.
ASCII-7 is a 7-bit code that can represent 128 different characters. Computers that use 8-bit bytes (a group of 8 bits per byte) with 7-bit ASCII either set the 8th (leftmost) bit of each byte to zero or use it as a parity bit.
ASCII-8 is an extended version of ASCII-7. It is an 8-bit code that can represent 256 different characters; the additional bit is added to the left of the 7 bits of an ASCII-7 code.
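The parity-bit use of the 8th bit can be illustrated with a short sketch. The function name `ascii7_with_parity` and the choice of even parity are assumptions for illustration:

```python
def ascii7_with_parity(ch):
    """Pack a 7-bit ASCII code into a byte, using bit 7 as an even-parity bit."""
    code = ord(ch)
    assert code < 128, "not a 7-bit ASCII character"
    parity = bin(code).count("1") % 2    # 1 if the 7-bit code has an odd number of set bits
    return (parity << 7) | code

# 'A' is 0b1000001 (two set bits, already even) -> parity bit 0
print(format(ascii7_with_parity("A"), "08b"))  # -> 01000001
# 'C' is 0b1000011 (three set bits, odd) -> parity bit 1
print(format(ascii7_with_parity("C"), "08b"))  # -> 11000011
```

With even parity, every transmitted byte has an even number of 1-bits, so a single flipped bit can be detected by the receiver.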
ASCII also pairs naturally with hexadecimal, which serves as a four-bits-to-one-digit shorthand notation when displaying memory dumps.
Unicode is a universal character-encoding standard used to represent text for computer processing. It is an encoding system that "provides a unique number for every character, no matter what the platform, no matter what the program, no matter what the language".
Unicode supports more than a million code points, which are written as "U" followed by a plus sign and the number in hexadecimal.
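The U+ notation can be produced directly in Python, since `ord` returns a character's code point:

```python
# Print the Unicode code point of each character in U+ notation.
for ch in ["A", "€", "अ"]:
    print(f"{ch} -> U+{ord(ch):04X}")
# A -> U+0041, € -> U+20AC, अ -> U+0905
```

The same number identifies the character on every platform and in every program; only the byte-level encoding (discussed below as UTF-8/16/32) differs.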
- It provides a consistent way of encoding multilingual plain text.
- It defines codes for characters as well as special characters.
- It has the capacity to encode as many as a million characters.
- It affords the simplicity and consistency of ASCII.
- It specifies an algorithm for presentation of text with bi-directional behavior.
Unicode Encoding Forms:
The Unicode standard defines the following three encoding forms for each character:
- UTF-8 (Unicode Transformation Format-8).
- UTF-16 (Unicode Transformation Format-16).
- UTF-32 (Unicode Transformation Format-32).
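The three encoding forms listed above represent the same code point with different byte sequences, which a short sketch makes concrete (the big-endian variants are chosen here to avoid byte-order marks):

```python
ch = "€"  # code point U+20AC
for form in ("utf-8", "utf-16-be", "utf-32-be"):
    encoded = ch.encode(form)
    print(f"{form}: {encoded.hex(' ')} ({len(encoded)} bytes)")
# utf-8:     e2 82 ac      (3 bytes, variable length)
# utf-16-be: 20 ac         (2 bytes)
# utf-32-be: 00 00 20 ac   (4 bytes, fixed length)
```

UTF-8 uses one to four bytes per character (ASCII characters stay one byte), UTF-16 uses two or four, and UTF-32 always uses four.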