
WIP: Add test for CSRs and fix in simulation. #131

Closed · wants to merge 2 commits

Conversation

@mithro (Collaborator) commented Dec 11, 2018

No description provided.

@mithro (Collaborator, Author) commented Dec 12, 2018

@whitequark + @sbourdeauducq - Could you take a look at this and see if it makes sense to you?

Review comment on:

    yield self.re.eq(1)
    for sc in reversed(self.simple_csrs):
        yield from sc.write(value)
    yield
@mithro (Collaborator, Author) commented:

This should probably be something like:

    for sc in reversed(self.simple_csrs):
        yield from sc.write(value)
    yield

I'm not sure I tested whether that version worked.

Review comment on:

    yield
    yield self.re.eq(0)
    yield  # FIXME: Why two yields!?
@mithro (Collaborator, Author) commented:

Couldn't figure out why two yields here...

Collaborator commented:

It seems to me that having any yield statement in simulation code could be a problem. For example, if you wanted to update two different CSRs in the same clock cycle and they both did this, your simulation would advance several cycles.

Or is this the reason why you oversample the clock by 10x?
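
A minimal runnable sketch of that failure mode (module and helper names here are made up, not from this PR): each helper contains its own bare yield, so two back-to-back calls put their strobes in different clock cycles.

    from migen import Module, Signal, run_simulation

    class Dut(Module):
        def __init__(self):
            self.re_a = Signal()
            self.re_b = Signal()

    def write_a(dut):
        yield dut.re_a.eq(1)
        yield  # one clock cycle elapses inside the helper
        yield dut.re_a.eq(0)

    def write_b(dut):
        yield dut.re_b.eq(1)
        yield  # and another cycle elapses here
        yield dut.re_b.eq(0)

    def tb(dut):
        # Intended: strobe re_a and re_b in the same cycle.
        # Actual: the strobes land one cycle apart and are never
        # asserted together.
        yield from write_a(dut)
        yield from write_b(dut)
        yield

    dut = Dut()
    run_simulation(dut, tb(dut), vcd_name="csr_yield.vcd")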

@xobs (Collaborator) commented Feb 19, 2019

I applied this patch, and was unable to get e.g. the valentyusb code to work.

However, I've managed to get CSRStorage(write_from_dev=True) to work with the following patch applied immediately before running simulation:

        # Emulate write_from_dev: whenever the device side asserts we,
        # latch dat_w into the CSR's storage and pulse re for one cycle.
        for csr in self.dut.get_csrs():
            # print("csr: {}".format(csr))
            if isinstance(csr, CSRStorage) and hasattr(csr, "dat_w"):
                print("Adding CSRStorage patch for {}".format(csr))
                self.dut.sync += [
                    If(csr.we,
                        csr.storage.eq(csr.dat_w),
                        csr.re.eq(1),
                    ).Else(
                        csr.re.eq(0),
                    )
                ]
        run_simulation(
            self.dut,
            padfront(),
            vcd_name=self.make_vcd_name(),
            # Earlier clock settings, kept for reference:
            # clocks={
            #     "sys": 12,
            #     "usb_48": 48,
            #     "usb_12": 192,
            # },
            clocks={
                "sys": 2,
                "usb_48": 8,
                "usb_12": 32,
            },
        )

@enjoy-digital (Owner) commented:

@mithro: I understand what you wanted to do, but I don't think that's the right way to do it. The current write/read methods for CSR (and wishbone, etc.) simulate the behaviour seen by the internal logic of the module during a CSR access, so the current write behaviour is correct for that goal.

In your case, you also want to integrate the CSR logic in your simulation, so this is a bit different. To achieve that, I would collect all CSRs in the top module of the simulation, define them as submodules, and create access methods based on these ones: https://github.com/enjoy-digital/litex/blob/master/litex/soc/interconnect/csr.py#L93
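
A rough, untested sketch of that suggestion (SimTop and csr_write are hypothetical names; it assumes the DUT uses AutoCSR so that get_csrs() is available):

    from migen import Module
    from litex.soc.interconnect.csr import CSRStorage

    class SimTop(Module):
        def __init__(self, dut):
            self.submodules.dut = dut
            self.csrs = dut.get_csrs()
            for csr in self.csrs:
                # CSRStorage is a Module; registering it as a submodule
                # makes its storage logic part of the simulated design.
                if isinstance(csr, CSRStorage):
                    setattr(self.submodules, csr.name + "_csr", csr)

    def csr_write(csr, value):
        # Thin wrapper over the stock simulation helper linked above.
        yield from csr.write(value)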

@enjoy-digital (Owner) commented:

@mithro @xobs: I just added a basic test_csr with 6a4c133 to show you how to do it in your design with the current simulation methods and without the write_from_dev hack. The test can still be improved, but I think it will give you a better idea of what I meant.
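
For illustration, a minimal sketch of such a test (this is not the contents of 6a4c133; the Dut module is made up, and it relies on the CSR write() simulation helper discussed above):

    from migen import Module, Signal, If, run_simulation
    from litex.soc.interconnect.csr import CSR

    class Dut(Module):
        def __init__(self):
            self.csr = CSR(8)
            self.value = Signal(8)
            # Internal logic reacting to the CSR write, as it would in a SoC.
            self.sync += If(self.csr.re, self.value.eq(self.csr.r))

    def tb(dut):
        # Perform the CSR access through the stock simulation helper,
        # then check that the module's logic observed it.
        yield from dut.csr.write(0x5a)
        yield
        assert (yield dut.value) == 0x5a

    dut = Dut()
    run_simulation(dut, tb(dut))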
