Module review
- Computers understand and operate on only 0's and 1's.
- Humans cannot efficiently codify instructions for a computer in 0's and 1's, so they use computer programming languages.
- Computer programming languages use English-like constructs, special signs, and symbols instead of 0's and 1's.
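To make the contrast concrete, here is a minimal sketch (Python is an assumption; the module does not name a specific language). The first line is the kind of English-like instruction a programmer writes; the loop then prints the binary patterns that the computer actually stores for each character.

```python
# An English-like, human-readable instruction written by the programmer.
message = "Hi"

# Underneath, each character is kept as a number, which the machine
# represents in 0's and 1's. Print that binary form for each character.
for ch in message:
    print(ch, format(ord(ch), "08b"))   # e.g. H -> 01001000, i -> 01101001
```

The point of the sketch is only the contrast: the programmer never types `01001000`; the language and its tools translate the readable text into the 0's and 1's the computer works with.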