Module review

  1. Computers understand and operate on only 0s and 1s.

  2. Humans cannot efficiently write instructions for a computer in 0s and 1s, so they use computer programming languages instead.

  3. Computer programming languages use English-like constructs and special signs and symbols in place of 0s and 1s (see the short example below).
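To make point 3 concrete, here is a minimal Python sketch (the choice of Python is an assumption; the module does not name a language). It shows English-like keywords such as "if" and "print", plus symbols such as "=", "+", and ">", standing in for the raw binary the processor ultimately executes:

    # English-like keywords ('if', 'print') and symbols ('=', '+', '>')
    # replace the 0s and 1s the computer actually runs.
    total = 2 + 3
    if total > 4:
        print("total is greater than 4")

A translator (a compiler or an interpreter) then converts these constructs into the 0s and 1s of point 1.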
