Computers Use Resources
• Each operation of a computer (each step in the Fetch/Execute Cycle) takes a small amount of time
  – As a rough approximation, assume 1 instruction per clock tick
  – A 500 MHz computer executes 500,000,000 instructions per second
• Generally, the number of instructions needed to solve a problem determines how much time it will take …
  – A 10-billion-instruction computation at 500 MHz takes 20 seconds (sketch below)
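
To make the arithmetic concrete, here is a minimal Python sketch of the estimate, assuming the 1-instruction-per-tick approximation above (the function name is illustrative):

    # Estimate run time from instruction count and clock rate,
    # assuming roughly 1 instruction per clock tick.
    def run_time_seconds(instructions, clock_hz):
        return instructions / clock_hz

    # The example above: 10 billion instructions at 500 MHz.
    print(run_time_seconds(10_000_000_000, 500_000_000))  # 20.0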
• Networks have bandwidth limits, too: a 100 Mb/sec link carries at most 100,000,000 bits per second
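
The same style of estimate works for network transfers. This rough sketch ignores protocol overhead and latency (both real costs), so it gives a lower bound on transfer time; the file size is a made-up example:

    # Estimate transfer time: bytes are converted to bits, then
    # divided by the link's bandwidth in bits per second.
    def transfer_time_seconds(size_bytes, bits_per_sec):
        return size_bytes * 8 / bits_per_sec

    # A hypothetical 50,000,000-byte file over a 100 Mb/sec link.
    print(transfer_time_seconds(50_000_000, 100_000_000))  # 4.0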
• Every part of a computation takes memory, too (estimated in the sketch below):
  – Every letter or digit takes 1-2 bytes
  – Every instruction takes 4 bytes
  – Every integer takes 2 or 4 bytes
  – Every decimal number takes 4 or 8 bytes
• … And everything is limited by the speed of light
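
Putting the last two bullets to work, this sketch estimates the memory for a hypothetical mix of data using the rough sizes above, then the light-speed lower bound on one-way signal travel; the workload counts and the 4,000 km distance are illustrative assumptions:

    # Rough per-item sizes from the list above (taking the larger choices).
    BYTES_PER_CHAR  = 2   # each letter or digit: 1-2 bytes
    BYTES_PER_INT   = 4   # each integer: 2 or 4 bytes
    BYTES_PER_FLOAT = 8   # each decimal number: 4 or 8 bytes

    # A hypothetical workload: a million characters plus some numbers.
    chars, ints, floats = 1_000_000, 250_000, 125_000
    total = chars * BYTES_PER_CHAR + ints * BYTES_PER_INT + floats * BYTES_PER_FLOAT
    print(total)            # 4,000,000 bytes, about 4 MB

    # No signal travels faster than light (~300,000 km/sec), so a
    # 4,000 km one-way trip takes at least this many seconds:
    print(4_000 / 300_000)  # ~0.0133, about 13 milliseconds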