In order to enable this testing scheme in ACT, this simulation library contains the sub-namespace ''sim::scoreboard''. It (currently) provides three different scoreboards with different assumptions about the output.
  
When setting up a testing environment, the scoreboard is the end point of the data pipeline. It consumes the outputs from either an oracle source (providing known good outputs from a file or a sequence) or a model implemented in CHP or external C, as well as from the design under test (DUT). This setup is similar to what is often seen in UVM/SystemVerilog; however, since there is no border between the asynchronous or message passing domain and a synchronous register transfer level domain, we do not require an additional driver/monitor combination to connect our DUT to the testing harness. Since ACT supports seamless interconnection from high-level descriptions all the way down to analog simulation, the same testing harness can be reused for high-level evaluation all the way down to post-layout verification.
  
It is important to know about the timing of the processes provided here, as it could influence how your design behaves. There are no input buffers on any of these scoreboards, which means that a scoreboard will only proceed to the next test once **all inputs have received a token**. This is by design: by adding a buffer to an input, your testing harness can reflect whatever timing requirements your design has, or has to expect when used in real life. We do, however, provide an [[sim:inf_buffer | infinite capacity buffer]] to decouple your scoreboard completely.
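
For illustration, a single stage of decoupling could look like the sketch below. This is only a stand-in written for this example; the buffer process actually provided by the library, and its signature, are documented on its own page.

<code act>
/* Illustrative stand-in only: a one-token slack stage that decouples a
 * producer (e.g. a DUT output) from a scoreboard input. The library's
 * infinite capacity buffer is a separate process with its own interface. */
template <pint W>
defproc slack_stage(chan?(int<W>) I; chan!(int<W>) O)
{
  int<W> x;
  chp {
    *[ I?x; O!x ]   /* receive a token, then pass it on */
  }
}
</code>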
  
Here's a simple example of a testing harness for a basic adder, using a ''lockstep'' scoreboard and a sequence source as an oracle. We will go into depth about this type of scoreboard in a second:
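
A minimal sketch of such a harness is shown below. Only the ''lockstep'' scoreboard and its parameters are taken from this library; the ''stimulus'' and ''adder'' processes, the data values and the ''import'' line are illustrative stand-ins.

<code act>
import sim;   /* assumed; use whatever import this library actually requires */

/* Stand-in for a multi-ended sequence source: every input value is emitted
 * once towards the DUT and once towards the scoreboard, and the expected
 * sums serve as the oracle. */
defproc stimulus(chan!(int<8>) A_DUT, B_DUT, A_SB, B_SB, SUM_SB)
{
  chp {
    A_DUT!1,  B_DUT!2, A_SB!1,  B_SB!2, SUM_SB!3;
    A_DUT!7,  B_DUT!5, A_SB!7,  B_SB!5, SUM_SB!12;
    A_DUT!40, B_DUT!2, A_SB!40, B_SB!2, SUM_SB!42
  }
}

/* The design under test: a basic adder with one stage of slack. */
defproc adder(chan?(int<8>) A, B; chan!(int<8>) S)
{
  int<8> a, b;
  chp {
    *[ A?a, B?b; S!(a + b) ]
  }
}

defproc harness()
{
  stimulus src;
  adder dut;
  /* D_WIDTH = 8, IN_CHANNELS = 2, OUT_CHANNELS = 1, SB_ID = 0, VERBOSE_TESTING = true */
  sim::scoreboard::lockstep<8, 2, 1, 0, true> sb;

  dut.A = src.A_DUT;
  dut.B = src.B_DUT;

  sb.IN[0]    = src.A_SB;    /* logged copies of the DUT inputs */
  sb.IN[1]    = src.B_SB;
  sb.OUT_M[0] = src.SUM_SB;  /* expected result from the oracle */
  sb.OUT_D[0] = dut.S;       /* actual result from the DUT */
}
</code>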
  
We connect multi-ended sources to both the DUT and the scoreboard. If we had a model instead of an oracle, we would connect the sources to it as well. Due to the simplicity of the design (it adds one stage of slack) and the use of an oracle, we have omitted buffers on the scoreboard inputs.
  
We (currently) provide three different scoreboards based on different assumptions about the inputs and outputs of a DUT.
  
   * If your design has a group of inputs and outputs that always see the same number of tokens in the same order, use a ''lockstep'' scoreboard. An example would be an adder which produces as many results as it receives inputs.
   * If your design has a group of outputs that always see the same number of tokens in the same order, use a ''deterministic'' scoreboard. An example would be a deterministic merge, where two input channels result in one output channel.
   * If your design requires a more intricate testing condition, you can implement your own checks and use a ''generic'' scoreboard to get standard output formatting for automated and distributed testing.
  
If you need to log your inputs, you can additionally use an ''input_logger'' for any group of inputs that always sees the same number of tokens.

===== Shared parameters =====
  
The scoreboards in this namespace use the same parameters:
   * ''D_WIDTH'': Bit width of the data on the input and output channels
   * ''IN_CHANNELS'': Number of input channels (''lockstep'' and ''input_logger'' only)
   * ''OUT_CHANNELS'': Number of output channels per side (''OUT_M'' and ''OUT_D''; not used by ''input_logger'')
   * ''SB_ID'': ID of the scoreboard to be used in log output
   * ''VERBOSE_TESTING'': If false, only failed tests will emit a message to the log

===== Interface =====

All scoreboards share a common interface. Some ports might be missing depending on the capabilities of a given scoreboard; if a port is not present on every scoreboard type, this is noted below.

   * ''IN[IN_CHANNELS]'': Input token channels (only used on ''lockstep'' and ''input_logger'')
   * ''OUT_M[OUT_CHANNELS]'': Tokens outputted from the model or oracle (not present on ''input_logger'')
   * ''OUT_D[OUT_CHANNELS]'': Tokens outputted from the design under test (not present on ''input_logger'')
   * ''SUCCESS'': Whether or not the incoming output data resulted in a successful test (only present on ''generic'')

The exported processes are:

<code act>
export template <pint D_WIDTH, OUT_CHANNELS, SB_ID; pbool VERBOSE_TESTING>
defproc generic (chan?(int<D_WIDTH>) OUT_M[OUT_CHANNELS], OUT_D[OUT_CHANNELS]; chan?(bool) SUCCESS);

export template <pint D_WIDTH, IN_CHANNELS, OUT_CHANNELS, SB_ID; pbool VERBOSE_TESTING>
defproc lockstep (chan?(int<D_WIDTH>) IN[IN_CHANNELS], OUT_M[OUT_CHANNELS], OUT_D[OUT_CHANNELS]);

export template <pint D_WIDTH, OUT_CHANNELS, SB_ID; pbool VERBOSE_TESTING>
defproc deterministic (chan?(int<D_WIDTH>) OUT_M[OUT_CHANNELS], OUT_D[OUT_CHANNELS]);

export template <pint D_WIDTH, IN_CHANNELS, SB_ID; pbool VERBOSE_TESTING>
defproc input_logger (chan?(int<D_WIDTH>) IN[IN_CHANNELS]);
</code>
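
As a usage sketch for the ''generic'' scoreboard, one possible arrangement is a user-written checker that consumes the model and DUT tokens, applies a custom pass condition, forwards both tokens to the scoreboard for logging, and reports the verdict on ''SUCCESS''. The checker, its condition and the wiring below are illustrative only, not part of this library:

<code act>
/* Illustrative checker: accepts the DUT result if it matches the model
 * exactly or saturates at the maximum 8-bit value. */
defproc saturating_checker(chan?(int<8>) M, D;
                           chan!(int<8>) M_FWD, D_FWD;
                           chan!(bool) OK)
{
  int<8> m, d;
  chp {
    *[ M?m, D?d;
       M_FWD!m, D_FWD!d, OK!((d = m) | (d = 255))
    ]
  }
}

defproc generic_harness()
{
  saturating_checker chk;
  /* D_WIDTH = 8, OUT_CHANNELS = 1, SB_ID = 1, VERBOSE_TESTING = true */
  sim::scoreboard::generic<8, 1, 1, true> sb;

  sb.OUT_M[0] = chk.M_FWD;   /* logged expected tokens */
  sb.OUT_D[0] = chk.D_FWD;   /* logged DUT tokens */
  sb.SUCCESS  = chk.OK;      /* per-test verdict */
}
</code>

The checker's ''M'' and ''D'' inputs would be fed by the oracle (or model) and the DUT, respectively, in the same way as in the ''lockstep'' sketch above.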