
Compile Order Issues #577

Open
nfrancque opened this issue Oct 26, 2019 · 9 comments

@nfrancque

I am working with a codebase that splits entities and architectures into separate files to support multiple architectures. So, for many design units there is file.vhd for the entity declaration, plus file-rtl.vhd and file-fpga.vhd, for instance. The "-x" architecture files contain component declarations for their dependencies and do not instantiate using the "entity lib_name.entity_name" form.

VUnit seems to be having a difficult time working out the compile order. For the most part the global flow is correct (libraries with basic blocks are compiled before higher-level libraries), but it will always decide to compile the testbenches before their corresponding design files. And intra-library dependencies don't seem to be working at all.

If I turn off BindAtCompile in QuestaSim everything ends up working out okay, so this is a valid workaround. But I would like to keep supporting BindAtCompile to catch issues sooner.
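
A minimal sketch of how such a flag could be passed through VUnit (assuming Questa uses VUnit's modelsim interface, that "modelsim.vcom_flags" is the relevant compile option, and that -bindAtCompile is the vcom flag in question; paths and library names are made up):

# Sketch only: pass -bindAtCompile to vcom through VUnit's compile options.
# The option name, flag, and paths are assumptions, not confirmed in this thread.
from vunit import VUnit

vu = VUnit.from_argv()
lib = vu.add_library("lib")
lib.add_source_files("src/*.vhd")  # hypothetical source location

# Ask vcom to resolve default component bindings at compile time.
vu.set_compile_option("modelsim.vcom_flags", ["-bindAtCompile"])

vu.main()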

My assumption is that the parsers are either not set up to deal with the split entity/architecture files, or that component declarations are less well supported than the newer entity instantiation method. I'd be happy to contribute a fix if someone could point me in the right direction.

@eine
Collaborator

eine commented Oct 26, 2019

> My assumption is that the parsers are either not set up to deal with the split entity/architecture files,

I don't think this is the case, since the coding style suggests doing so when multiple architectures exist: http://vunit.github.io/contributing.html#coding-style.

> or that component declarations are less well supported than the newer entity instantiation method.

This might be the issue, although I didn't really understand the setup. An MWE would help.

> I'd be happy to contribute a fix if someone could point me in the right direction.

The point is that VUnit's parser, which is meant to be just the internal implementation of a dependency scanner, is known to be limited but to work well enough. There have been other recent discussions that you might want to have a look at: #529, #532. Related sources are https://github.com/VUnit/vunit/blob/master/vunit/vhdl_parser.py and https://github.com/VUnit/vunit/blob/master/vunit/dependency_graph.py, but it'd be good to discuss solutions here before working on a PR.

@nfrancque
Author

Interesting, thanks for the links.

The setup is as simple as you might expect: separate entity/architecture files and explicit component declarations in the architectures. I don't think there is anything special, but I will look. Unfortunately I am not able to supply any of the code. I have been looking through the VUnit logs and it seems to pick up on quite a few dependencies between entities/architectures as well as testbenches, so I'm at a bit of a loss as to what is causing the issue. I could probably provide an obfuscated log of the dependencies it finds? The codebase is several hundred files, so an MWE may be relatively large.

Your idea about the custom parsers sounds promising, but in my opinion dependency tracking is one of the biggest draws of VUnit. In fact, in this case I am using it exclusively to compile a design where the test infrastructure is managed externally. There is also a large use case for simply generating a compile script by parsing the --export-json results, but the export leaves off some information (vhdl_standard mostly) that would make this feasible.

So I think it would be important to prioritize reliable parsing and dependency tracking out of the box going forward, rather than requiring separate tools to generate a good file list.

@eine
Collaborator

eine commented Oct 26, 2019

> Unfortunately I am not able to supply any of the code. I have been looking through the VUnit logs and it seems to pick up on quite a few dependencies between entities/architectures as well as testbenches, so I'm at a bit of a loss as to what is causing the issue. I could probably provide an obfuscated log of the dependencies it finds? The codebase is several hundred files, so an MWE may be relatively large.

Note that we don't need a reproducer based on your exact code, just a few lines that reproduce the problem. If the issue is with having explicit component declarations in the architectures, 2-4 files with dummy entities and architecture bodies will suffice. You can make up the names too. I think this is easier to achieve than obfuscating the logs.

> Your idea about the custom parsers sounds promising, but in my opinion dependency tracking is one of the biggest draws of VUnit. In fact, in this case I am using it exclusively to compile a design where the test infrastructure is managed externally.

I partially agree, but note that VUnit is neither a dependency scanner nor a build tool. It is a test runner and a set of libraries for logging and verification. While I like to use it for the same purpose you describe, we need to be aware of what the focus is. In this specific case, I understand why you want to use bindAtCompile, but you can use VUnit without it, can't you?

> There is also a large use case for simply generating a compile script by parsing the --export-json results, but the export leaves off some information (vhdl_standard mostly) that would make this feasible.

On the one hand, you might want to extract the compile order from the Python interface instead of using the --export-json feature. Writing a dictionary to a JSON file is then just a matter of calling a function.
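
A minimal sketch of that approach from a run.py script (assuming the public get_compile_order() method and the name/library attributes of source file objects; exact names may differ between VUnit versions, and the paths are made up):

# Sketch only: write VUnit's computed compile order to a JSON file.
import json
from vunit import VUnit

vu = VUnit.from_argv()
lib = vu.add_library("lib")
lib.add_source_files("src/*.vhd")  # hypothetical source location

compile_order = [
    {"file_name": source_file.name, "library_name": source_file.library.name}
    for source_file in vu.get_compile_order()
]

with open("compile_order.json", "w") as fptr:
    json.dump(compile_order, fptr, indent=2)

vu.main()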

On the other hand, there is an ongoing discussion in #574 about alternatives to get information through Results or post_run.

Lastly, if you think that adding the version of the standard to the JSON export is useful, please open an issue about it.

> So I think it would be important to prioritize reliable parsing and dependency tracking out of the box going forward, rather than requiring separate tools to generate a good file list.

While I completely agree, the main point is that parsing VHDL is very hard. In fact, there was a very interesting discussion about it recently (October 16, 2019 11:00 PM). You will see that there is currently no suitable Python library that we can use. And, as @kraigher commented, we'd rather integrate rust_hdl than put much effort into VUnit's Python parser.

@eine added the Parsing label Oct 26, 2019
@kraigher
Collaborator

Split entities/architectures are no problem. However, component instantiation with bindAtCompile is a problem, since VUnit assumes bind-at-elaboration. With bind at elaboration there is no compile order dependency between the instantiating entity and the entity that ends up getting mapped to the component instantiation at elaboration.

Thus bindAtCompile is not supported currently. You can always add a manual dependency as a workaround.
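
For example, a rough sketch of that workaround from a run.py script (file and library names are made up; add_dependency_on is the relevant call in the Python interface, though details may vary per VUnit version):

# Sketch only: force a compile-order dependency that the scanner misses.
from vunit import VUnit

vu = VUnit.from_argv()
lib = vu.add_library("design_lib")
lib.add_source_files("src/*.vhd")  # hypothetical source location

# The file instantiating the component must be compiled after the file
# providing the entity that the component eventually binds to.
top = lib.get_source_file("src/top.vhd")
leaf = lib.get_source_file("src/leaf.vhd")
top.add_dependency_on(leaf)

vu.main()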

We can use this thread to investigate if and how it could reasonably be supported. It would require a global option to trigger the behavior. It would also require VUnit to resolve the default component binding. Resolving the default component binding is not really a parsing problem but a semantic analysis problem.

Could you also explain why one would use bindAtCompile instead of just doing entity instantiation directly? It will result in the same behavior, and you lose the flexibility of component instantiation anyway with bindAtCompile. Are you relying on default component binding?

Note that I am on a trip for a few days and might be unresponsive.

@nfrancque
Author

> Resolving the default component binding is not really a parsing problem but a semantic analysis problem.

What makes you say that? Couldn't we simply add all entities matching the component declaration as dependencies? It would then be up to the user to ensure that the ports actually match. I don't think that's too big of a requirement, since they should be doing that anyway.

> Could you also explain why one would use bindAtCompile instead of just doing entity instantiation directly? It will result in the same behavior, and you lose the flexibility of component instantiation anyway with bindAtCompile. Are you relying on default component binding?

It's a legacy codebase where component declarations were part of the style guide. Entity instantiations are just now catching on for us; the vast majority of existing code uses components.
I agree that it ends up the same, but unfortunately I am not able to make those kinds of changes to our existing code, which is why we use bindAtCompile. Yes, it all relies on default component binding.

Essentially, you hit the nail on the head. We are trying to arrive at the same behavior as entity instantiation without having to modify all the code.

@nfrancque
Author

nfrancque commented Oct 26, 2019

I see this line, which is most likely what we're looking at; please correct me if not.

_component_re = re.compile(
r"[a-zA-Z]\w*\s*\:\s*(?:component)?\s*(?:(?:[a-zA-Z]\w*)\.)?([a-zA-Z]\w*)\s*"
r"(?:generic|port) map\s*\([\s\w\=\>\,\.\)\(\+\-\'\"]*\);",
re.IGNORECASE,
)

It looks like it's searching for component instantiations? I don't quite understand the goal; the word "component" is just in the declaration, which wouldn't contain a port map.
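
For illustration (the test strings below are made up), running the quoted regex over a few snippets shows what it currently matches:

# The pattern is the one quoted above; it matches instantiations that
# include a generic/port map, with or without the optional "component"
# keyword, but not a bare "label : name;" form.
import re

_component_re = re.compile(
    r"[a-zA-Z]\w*\s*\:\s*(?:component)?\s*(?:(?:[a-zA-Z]\w*)\.)?([a-zA-Z]\w*)\s*"
    r"(?:generic|port) map\s*\([\s\w\=\>\,\.\)\(\+\-\'\"]*\);",
    re.IGNORECASE,
)

print(_component_re.findall("U_BAR : bar port map (clock => clock, output => open);"))  # ['bar']
print(_component_re.findall("U_BAR : component bar port map (clock => clock);"))        # ['bar']
print(_component_re.findall("U_NAME : some_component;"))                                # []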

I would think it would be enough to just search for the declaration, and that might be an easier parse.
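
As a purely hypothetical sketch (not VUnit code), matching the declaration could look roughly like this; a real implementation would also need to strip comments first:

# Hypothetical sketch: pick up component declarations instead of
# instantiations. Everything here is illustrative, not VUnit internals.
import re

_component_decl_re = re.compile(
    r"\bcomponent\s+([a-zA-Z]\w*)\s*(?:is)?\b.*?\bend\s+component\b",
    re.IGNORECASE | re.DOTALL,
)

architecture_body = """
  component bar is
    port (
      clock  : in  std_logic;
      output : out std_logic
    );
  end component bar;
"""

print(_component_decl_re.findall(architecture_body))  # ['bar']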

Can you help me understand the current state of this feature?

@nfrancque
Author

Did some more digging. It looks like project.py only searches the design file's own library for component dependencies. This explains why I was seeing testbench libraries compiled before their corresponding design libraries. (Does anyone know if that is correct? I wasn't sure whether I was misinterpreting the code.)
Using the following example:

library ieee;
use ieee.std_logic_1164.all;
use ieee.std_logic_unsigned.all;        -- for addition & counting
use ieee.numeric_std.all;               -- for type conversions

library foo;

entity TEST is
  port (
    clock  : in  std_logic;
    output : out std_logic
  );
end TEST;

architecture rtl of TEST is

  component bar is
    port (
      clock  : in  std_logic;
      output : out std_logic
    );
  end component bar;

begin

  U_BAR : bar
    port map (
      clock  => clock,
      output => open
    );

end rtl;

I would like vhdl_parser to return a list of referenced libraries, whether or not a corresponding package is used from each, and then have the project search through each of those libraries for a corresponding entity. (In this case, search library foo and find entity bar.)

Obviously, searching through all libraries would also work if adding this is too big a change, but I'd assume that would take a lot longer.

I would agree that this could be enabled as an optional feature, since you could see a decent performance hit from doing it. It looks like implementation_dependencies already controls the searching of components and is disabled by default, so we should be able to leave it this way. A rough sketch of the idea is below.
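
A rough sketch of the proposed lookup, with entirely made-up names (none of these are actual VUnit internals):

# Hypothetical sketch: for each component instantiated in a design file,
# search the file's own library plus every referenced library for an
# entity of the same name and treat its source file as a compile-order
# dependency. All attribute and helper names are illustrative.
def component_dependencies(design_file, libraries):
    dependencies = []
    candidate_libs = [design_file.library_name] + list(design_file.referenced_libraries)
    for component_name in design_file.instantiated_components:
        for lib_name in candidate_libs:
            library = libraries.get(lib_name)
            if library is None:
                continue
            entity_file = library.entity_files.get(component_name)
            if entity_file is not None:
                dependencies.append(entity_file)
                break  # first matching library wins
    return dependencies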

Thoughts?

@nfrancque
Author

Turns out parsing VHDL is indeed very, very hard. I have most of this working locally: search all of a file's referenced libraries for entities matching its component declarations. It improved things significantly; almost all of the project compiles with -bindAtCompile enabled.

I can't cover the case of declaring a component and then in the instantiation simply writing

U_NAME : component_name;

and letting the configuration handle the rest, with any real confidence that the regex would only pick up on that construct. The easiest solution would be to look at the declaration rather than the instantiation and assume that it is instantiated.

I would like to continue this discussion on either improving the parser or migrating to another one.

@kraigher et al.

@javValverde

I'm also struggling with this issue.

Our code base has tons of component instantiations, and we cannot change them.
We have a self-developed dependency scanner that works for our code base. Is there a way to use our own dependency scanner?
