Computers Use Resources
Each operation of a computer (each step in the Fetch/Execute Cycle) takes a small amount of time
- As a rough approximation, assume 1 instruction per clock tick
- A 500 MHz computer executes roughly 500,000,000 instructions per second
Generally, the number of instructions needed to solve a problem determines how long it will take
- A 10-billion-instruction computation takes 20 seconds on that 500 MHz machine (10,000,000,000 ÷ 500,000,000 = 20), as the sketch below checks
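A minimal Python sketch of that arithmetic, assuming the 1-instruction-per-clock-tick approximation and the 500 MHz clock rate above:

```python
# Time estimate: instructions divided by instructions-per-second,
# assuming 1 instruction per clock tick
clock_rate_hz = 500_000_000    # 500 MHz clock
instructions = 10_000_000_000  # a 10-billion-instruction computation

seconds = instructions / clock_rate_hz
print(f"{seconds:.0f} seconds")  # prints: 20 seconds
```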
Networks have bandwidth limits, too: a 100 Mb/sec link moves at most 100 million bits per second
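The same style of estimate works for network transfers; a sketch, assuming the 100 Mb/sec figure above and a hypothetical 50 MB file (note the bits-versus-bytes conversion):

```python
# Transfer-time estimate over a bandwidth-limited link
bandwidth_bps = 100_000_000  # 100 Mb/sec, in bits per second
file_bytes = 50_000_000      # hypothetical 50 MB file

seconds = file_bytes * 8 / bandwidth_bps  # 8 bits per byte
print(f"{seconds:.0f} seconds")           # prints: 4 seconds
```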
Every part of a computation takes memory, too (see the sketch after this list):
- Every letter or digit takes 1-2 bytes (1 in ASCII, 2 in Unicode)
- Every instruction takes 4 bytes
- Every integer takes 2 or 4 bytes
- Every decimal number takes 4 or 8 bytes
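Those per-item sizes turn directly into memory estimates; a sketch, where the one-million-item count is a hypothetical example and the sizes are taken from the list above:

```python
# Memory estimates from per-item sizes
n = 1_000_000  # hypothetical: one million items of each kind

text_bytes = n * 1     # ASCII letters/digits, 1 byte each
int_bytes = n * 4      # 4-byte integers
decimal_bytes = n * 8  # 8-byte decimal numbers

print(f"text:     {text_bytes // 1_000_000} MB")     # prints: text:     1 MB
print(f"integers: {int_bytes // 1_000_000} MB")      # prints: integers: 4 MB
print(f"decimals: {decimal_bytes // 1_000_000} MB")  # prints: decimals: 8 MB
```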