Module review
- Computers understand and operate only on 0s and 1s (binary).
- Humans cannot efficiently write instructions for a computer directly in 0s and 1s, so they use programming languages instead.
- Programming languages use English-like constructs, special signs, and symbols in place of 0s and 1s.
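The points above can be illustrated with a minimal sketch (Python is used here as an assumed example language): the statement we write is English-like, but each character of the data is ultimately stored as a pattern of 0s and 1s.

```python
# Human-readable code: an English-like construct, not raw binary.
message = "Hi"

# Show the underlying binary (byte) representation of each character.
bits = [format(ord(ch), "08b") for ch in message]
print(bits)  # ['01001000', '01101001']
```

Running this shows that even a two-letter string is, to the computer, just sequences of 0s and 1s.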