Avoiding timing issues in a processor

I'm writing a simulation of a (very slow and primitive) processor.

For example: let's say the clock speed is a constant 1 Hz. I assume this means that one instruction can/will be processed every second. Some instructions take longer than others: adding 1 + 0 takes less time than 1 + 7. (The latter causes a ripple of carry bits, which takes a non-zero amount of time.)

I need to be able to start an instruction only after all previous instructions have finished.

Do I need to:

  1. time how long the longest instruction takes and set the clock period to something greater than that?
  2. create a stateful watcher that won't allow a future instruction to be executed until the previous one is complete?
  3. Or am I misunderstanding the problem completely?

With #1, it seems like I'm still risking a race condition where an instruction is incomplete when the next one begins. With #2, it seems like I'm risking an unpredictable/variable clock speed, which could cause me issues later.

How can I resolve this? Are there any hints from how real processors handle this issue?

asked Dec 06 '25 by Dinah


2 Answers

Firstly, processors execute a single set of microinstructions per clock cycle. These usually involve things like switching a bus onto a register or the ALU (arithmetic logic unit); the next microinstruction might clock the register or ALU to do something with the data on a bus. Most assembly-level instructions are built up from a series of microinstructions. An addition instruction takes only a few microinstructions, but division may take many more. Most microcontroller datasheets document how many cycles each assembly-level instruction takes.
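
In practice this means you can sidestep wall-clock timing entirely and just count cycles. Here's a minimal sketch of that idea in Python; the instruction set and cycle costs are invented for illustration, but the pattern mirrors what a datasheet's cycle table gives you. Because the simulated clock advances by each instruction's full cost before the next instruction is fetched, instructions can never overlap:

```python
# Hypothetical cycle costs, as a datasheet would document them.
CYCLE_COST = {
    "ADD": 1,
    "MUL": 4,
    "DIV": 12,
}

def run(program):
    cycles = 0
    for op, *operands in program:
        # Execute the instruction's effect here (omitted for brevity),
        # then advance the simulated clock by its full documented cost
        # before the next instruction is fetched.
        cycles += CYCLE_COST[op]
    return cycles

print(run([("ADD", 1, 7), ("DIV", 8, 2)]))  # -> 13 simulated cycles
```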

With more sophisticated microcontrollers there is also an instruction pipeline (as Cat mentions), which means the processor can start executing part of the next instruction before the previous one has completed. This can become very complicated with concepts like branch prediction.
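
To make the overlap concrete, here's a toy sketch of a three-stage pipeline (the stage names and instruction labels are made up for illustration). Each cycle, every instruction advances one stage, so instruction I2's fetch overlaps with instruction I1's decode:

```python
instructions = ["I1", "I2", "I3", "I4"]
stages = ["FETCH", "DECODE", "EXEC"]

# One line per cycle, showing which instruction occupies each stage.
for cycle in range(len(instructions) + len(stages) - 1):
    row = []
    for s, stage in enumerate(stages):
        i = cycle - s  # instruction index occupying this stage this cycle
        row.append(f"{stage}:{instructions[i] if 0 <= i < len(instructions) else '--'}")
    print(f"cycle {cycle}:  " + "  ".join(row))
```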

Typically when you simulate digital electronics you use an event-based model, as electronic systems are concurrent but also have propagation delays that need to be modelled. I remember using tools like PSpice and MicroSim at uni that did this very well.
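
A discrete-event simulator in this spirit can be sketched in a few lines of Python (the component names and delay values here are assumptions for illustration, not anything a real tool mandates). Events carry timestamps and a priority queue guarantees they are processed in time order, so a result simply isn't visible until its propagation delay has elapsed in simulated time; there's no real-time race to worry about:

```python
import heapq

events = []  # min-heap of (time_ns, description) pairs

def schedule(time_ns, description):
    heapq.heappush(events, (time_ns, description))

# Assumed delays for illustration: the adder output settles 5 ns after
# its inputs change, and the register latches once the output is valid.
schedule(0, "adder inputs change")
schedule(5, "adder output valid (5 ns propagation delay)")
schedule(5, "register latches result")

# Pop events in timestamp order, advancing simulated time as we go.
while events:
    now, what = heapq.heappop(events)
    print(f"t={now} ns: {what}")
```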

answered Dec 09 '25 by tarn


Are you familiar with the instruction pipeline?

answered Dec 09 '25 by Cat Zimmermann


