Hi all,
I have a query regarding state machines that I hope someone can clear up.
Basically, I have been writing state machines for some time by following a method a workmate showed me, and I am now starting to think this method may be incorrect, as I noticed some strange behaviour in timing simulation.
I have been studying how they should be written, mainly from a document by Altera on recommended coding styles (http://www.altera.com/literature/hb/qts/qts_qii51007.pdf?GSA_pos=1&WT.oss_r=1&WT.oss=coding style). The state machine example can be found on page 6-66, but to avoid cluttering the post, I will summarize my doubt...
There are two processes: one assigns the next state to the current state on each clock edge, and the other reads the inputs, drives the outputs, and computes the next state. The sensitivity list of the process that calculates the next state contains the inputs, which could change at any point in time. Now suppose they change more than once during a clock period and directly affect what the next state becomes. What happens if one of the inputs changes very close to the clock edge, thus changing next_state? When the clock edge arrives and the first process goes to assign the next state to the current state, if next_state is still in the middle of changing, is there a possibility that the signal holding next_state could be sampled while metastable and transferred into the current state, causing a failure?
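To make the structure concrete, here is a minimal sketch of the two-process style I am describing (the entity, signal, and state names are my own for illustration, not taken from the Altera example):

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity fsm_2p is
  port (clk, reset, start : in  std_logic;
        done              : out std_logic);
end entity;

architecture rtl of fsm_2p is
  type state_t is (IDLE, RUN, FINISH);
  signal current_state, next_state : state_t;
begin
  -- Process 1: registered state update on the clock edge
  reg : process (clk, reset)
  begin
    if reset = '1' then
      current_state <= IDLE;
    elsif rising_edge(clk) then
      current_state <= next_state;
    end if;
  end process;

  -- Process 2: combinational next-state and output logic;
  -- the inputs (here 'start') appear in the sensitivity list,
  -- so next_state can change at any time the inputs change
  comb : process (current_state, start)
  begin
    done       <= '0';
    next_state <= current_state;
    case current_state is
      when IDLE =>
        if start = '1' then
          next_state <= RUN;
        end if;
      when RUN =>
        next_state <= FINISH;
      when FINISH =>
        done       <= '1';
        next_state <= IDLE;
    end case;
  end process;
end architecture;
```

My worry is about what happens in the `reg` process above when `next_state` is still settling at the moment `rising_edge(clk)` occurs.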
Previously I had written state machines using just one process, reading the inputs, assigning the outputs, and updating the state all on one clock edge. I think maybe this was bad practice, but I don't understand why there would be a problem with it.
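For comparison, the single-process style I was using looks roughly like this (again, the names are my own for illustration):

```vhdl
-- Everything clocked in a single process: state update,
-- output assignment, and input sampling all happen on the edge
reg_comb : process (clk, reset)
begin
  if reset = '1' then
    state <= IDLE;
    done  <= '0';
  elsif rising_edge(clk) then
    done <= '0';
    case state is
      when IDLE =>
        if start = '1' then
          state <= RUN;
        end if;
      when RUN =>
        state <= FINISH;
      when FINISH =>
        done  <= '1';
        state <= IDLE;
    end case;
  end if;
end process;
```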
I'd be grateful if someone could clear this up for me.
Many Thanks