FPGA project structure definition


saar drimer

I've written up an (informal) draft proposal for an FPGA project
structure that could be easily extended as the project grows and is
version control friendly. I'd be grateful for any type of feedback...

http://www.saardrimer.com/fpgaproj/

cheers,
saar.
 

Jonathan Ross

I've written up an (informal) draft proposal for an FPGA project
structure that could be easily extended as the project grows and is
version control friendly. I'd be grateful for any type of feedback...

 http://www.saardrimer.com/fpgaproj/

cheers,
saar.

First, some minutiae. The figures aren't labeled, so it's hard to
target comments at a particular one. The first figure, "The 'flow'",
doesn't include "Build Scripts" as a source file type. I realize you
had planned to make a distinction between scripts and source, but our
build scripts are checked into our repository under the philosophy
that any checkout should be buildable as is, and we consider them
part of our source. Also, the testbenches section should mention unit
testing in hardware - it's automated and should be part of a mature
build cycle. Scoping/tapping, on the other hand, is part of the
development process, so it doesn't necessarily need to be mentioned
here.

I was in the study phase of implementing an SCons Builder/Scanner for
XST/VHDL for my build cycles. Requiring that VHDL files share the name
of the entity they contain would make scanning dramatically easier;
since I control our internal standards, I'll make this a requirement,
and apply it to configurations and components as well.
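
Roughly, the scanner I have in mind would be something like the
SConstruct fragment below. It's only a sketch under that naming
convention: the regular expressions, the VHDL_SRC_DIRS variable and
the directory names are illustrative, not part of any standard SCons
VHDL support.

import os
import re

# Match direct instantiations such as "entity work.fifo" and use
# clauses such as "use work.fifo_pkg.all;".
INST_RE = re.compile(r'\bentity\s+\w+\.(\w+)', re.IGNORECASE)
USE_RE = re.compile(r'\buse\s+\w+\.(\w+)\.', re.IGNORECASE)

def vhdl_scan(node, env, path):
    """Map referenced design-unit names back to <name>.vhd files."""
    text = node.get_text_contents()
    names = set(INST_RE.findall(text)) | set(USE_RE.findall(text))
    deps = []
    for name in names:
        for src_dir in env.get('VHDL_SRC_DIRS', []):
            candidate = os.path.join(src_dir, name.lower() + '.vhd')
            if os.path.isfile(candidate):
                deps.append(env.File(candidate))
                break
    return deps

# Scanner() and Environment() are provided by SCons in an SConstruct.
vhdl_scanner = Scanner(function=vhdl_scan, skeys=['.vhd'])
env = Environment(VHDL_SRC_DIRS=['sources/hdl/work', 'sources/hdl/play'])
env.Append(SCANNERS=vhdl_scanner)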

You might consider requiring that source files fall under a directory
with the same name as the library they're in. For example, if the
entity Foo were part of library work, and Bar part of library play,
the directory structure would look like this:

.../Project/sources/hdl/work/Foo.vhd
.../Project/sources/hdl/play/Bar.vhd

We don't do any Verilog development so I'm not sure how the concept of
a library is handled there.

Another issue we have is that a lot of our sub-components should be
accessible to some engineers, but not all. By writing SCons
SConstruct and SConscript files and using BuildBot, the idea was that
an engineer could check in their sub-project and the server would
initiate a build over the whole project while keeping components
isolated to their respective developers. That is, our organizational
issue is as follows: each component should have two levels of access -
one public for declaration, one private for specification. If a team
were designing an Ethernet controller, we'd want the entity, component,
configurations, behavioral simulations, and packages that define the
types needed to interface with it to be available to everyone, while
keeping the implementation structure hidden. Accomplishing this would
require a public/private fork of the directory structure under
Mercurial, i.e.

.../Project/public/source/hdl/work/Foo.vhd
.../Project/private/source/hdl/work/Bar.vhd

Then there would be a Mercurial repository at Project, and public and
private would be sub-repositories.

Further, since Mercurial allows pre- and post-hooks for all commands,
the plan was to gate any push to a stable repository on the server
with a run of unit tests across the entire project (sub- and
super-modules...) that must pass before the push can succeed. The
issue is that some modules will be used in multiple projects, and we
have yet to figure out the optimal way of handling that, e.g.:

Project1/submodules/Project3/...
Project2/submodules/Project3/...
Project3/...

Pushes to Project3 should automatically initiate unit testing for
Project1 and Project2. I have no clue how to do this effectively and
efficiently.
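
For the per-repository testing itself, the rough shape I had in mind
is a server-side Python hook. The sketch below is only illustrative:
the DEPENDENT_CHECKOUTS paths and the run_tests.py driver are made-up
names, and the cross-project part (Project3 triggering Project1 and
Project2) is exactly the bit that isn't solved yet.

import subprocess

# Enabled on the stable server repository with, e.g.:
#   [hooks]
#   pretxnchangegroup.unittests = python:/path/to/ci_hooks.py:run_unit_tests

# Server-side checkouts that must be re-tested when this repository
# changes (e.g. for Project3, list Project1 and Project2 here).
# These paths and the run_tests.py driver are placeholders.
DEPENDENT_CHECKOUTS = ['/srv/builds/Project1', '/srv/builds/Project2']

def run_unit_tests(ui, repo, **kwargs):
    """pretxnchangegroup hook: a true return value rejects the push."""
    # Test this repository first, then every checkout that depends on
    # it.  (In practice the working copies would need updating to the
    # incoming revision before running the tests.)
    for path in [repo.root] + DEPENDENT_CHECKOUTS:
        if subprocess.call(['python', 'run_tests.py'], cwd=path) != 0:
            return True   # tests failed somewhere; abort the transaction
    return False          # everything passed; allow the push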

Overall though, I like the proposal. If I can make it fit my need for
public/private portions of submodules I'll definitely use it. Thanks.

~Jonathan Ross
 

Pontus

Great, thanks for sharing.

I have set up an FPGA project organization which ended up quite
similar to yours.
The build dir introduces (to me) a new level, which seems to be a
good idea; we used sim, synt and par dirs in parallel with src, doc,
etc.
We don't have the products dir; instead, products (of specific
revisions) are manually archived on a separate archive server.

Also, our constraint files have been lying in the synt and par dirs,
but I like the idea of just removing the entire build dir to do a
clean.
However, I also kind of like to do "cd synt; make" (or "make -C synt")
to get the synthesis done, so I need to keep at least the makefile
when doing a clean.

Often the project has several "products" built from the same sources,
e.g. a "board_test" bit file for production tests, and possibly some
bit file to develop or debug a specific part or function of the
board. Further down the road you may end up having to support
different FPGAs (i.e. different speed grades/sizes).
Another aspect is that when using the project as a submodule, you may
want to be able to publish several variants (16/32, master/slave,
etc.).
All this made us come up with the concept of a "component" of a module
(for lack of a better word; "component" is probably the most
overloaded word in HW design).
In our system a module may have several components, e.g. "board_test",
"ddr_debug", "small_fpga", etc.

During build:
Usually there are many warnings in the log files, of quite different
severity.
An approach I have taken is to view the log file as the primary goal
in the makefile (for sim, synt and par alike) - or, actually, a
filtered log file obtained by running a (bash) script using "grep -v"
to remove "known and accepted" warnings.
Nothing should be left after the filter has worked on the file. The
filter also produces some simple statistics, such as the number of
warnings filtered out. The filter is set up with a component-specific
control file that lists all acceptable warnings. This control file is
also an excellent place to document why specific warnings are
acceptable.
So, e.g., the (synthesis result) .edi file is obtained as a side
effect of wanting a filtered synthesis log file.
The filter can make the build fail if desired (we currently don't do
that).
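
For what it's worth, a rough Python rendering of what the filter does
is shown below. The real script is a few lines of bash around grep -v;
the control-file format here (one accepted-warning regex per line,
'#' for comments) and the restriction to WARNING/ERROR lines are only
illustrations of the idea, not the exact implementation.

import re
import sys

def filter_log(log_path, control_path, out_path):
    # Accepted-warning patterns from the per-component control file.
    accepted = []
    with open(control_path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith('#'):
                accepted.append(re.compile(line))

    remaining = []
    filtered = 0
    with open(log_path) as f:
        for line in f:
            if 'WARNING' not in line and 'ERROR' not in line:
                continue                  # only warnings/errors matter here
            if any(p.search(line) for p in accepted):
                filtered += 1             # known and accepted
            else:
                remaining.append(line)    # needs a human to look at it

    with open(out_path, 'w') as f:
        f.writelines(remaining)

    print('%d accepted warnings filtered out, %d lines remain'
          % (filtered, len(remaining)))
    # Return non-zero here instead if make should fail on leftovers.
    return 0

if __name__ == '__main__':
    sys.exit(filter_log(sys.argv[1], sys.argv[2], sys.argv[3]))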

So the directories we have at the module level are src/ synt/ sim/
par/ doc/ submodule_1/ and design/.
"make -C src" will compile (for simulation) all source code, including
any generated "macros" from the FPGA vendor.
"make -C sim" will run a set of batch-mode simulations, generating
filtered log files which can be inspected.
"make -C synt my_comp" will synthesize the component my_comp (or,
actually, aim to generate my_comp's filtered log file).
"make -C par my_comp" will {build; map; par; trace} my_comp by [again]
aiming for a filtered log file.
"make all" in the module top will do all four steps above.

Finally, in design/ we keep all build scripts and makefiles, plus a
file to set up the environment, i.e. the paths to all tools used.
That way we also check in which tools, and which versions of them,
were used for a given project at a given time.

One thing I have noted is that implementing "make help" in each
directory has made the system much more user-friendly.

-- Pontus
 

Alessandro Basili

I've written up an (informal) draft proposal for an FPGA project
structure that could be easily extended as the project grows and is
version control friendly. I'd be grateful for any type of feedback...

http://www.saardrimer.com/fpgaproj/

cheers,
saar.

First of all I'd like to point out that your proposal looks consistent
and in some aspects very elegant. I like the idea of having a common
structured approach everyone understands and follows rigorously and
methodically, in order to promote reuse and modular design.

On the other hand, I believe the effort becomes huge when you need to
deal with FPGA vendors, since each one pushes its own product and,
through its own integrated development environment, ties designers
down to its own structure.

As already posted, I also like the idea of cleaning a build in one
go, but I think a constraint file is quite different from an HDL
file, and collecting them together under /source will cause a lot of
confusion, especially when you are interested in developing RTL while
other people are interested in adding constraints for the
implementation. Dividing the directory tree by process has the big
advantage that, even if you are the only designer, your project flows
through the processes over several iterations. Once you are happy
with the simulation and the RTL you move on to synthesis and so on
(with a good chance you'll need to come back to your RTL).

In the end I believe that having a structure, regardless of its type,
is the most important thing, since nothing can be more confusing than
having no organization. I sometimes stare for minutes at the list of
directory names my group is creating and believe that, no matter what
structure you propose, their capability to screw it up is beyond any
imagination (I still remember my very first VHDL project, which soon
moved from directory /test to /final, then /reallyfinal, then
/reallyfinal2...).
 

saar drimer

Thank you Jonathan, Pontus and Alessandro for your comments and
suggestions -- I will consider them for the next revision of the
document.

In the end I believe that having a structure, regardless of its type,
is the most important thing, since nothing can be more confusing than
having no organization.

That's the main point, yes. The primary motivation for writing this
specification was to provide a starting point for engineers, together
with the reasoning behind the choices that I have made. Obviously,
there's no way for a single structure definition to fit everyone's
needs. I do think, though, that within the rigid rules of this
proposal, there's enough flexibility for customization without loss of
most benefits.

Further comments and suggestions are welcome!

cheers,
saar.
 

Benjamin Couillard

Thank you Jonathan, Pontus and Alessandro for your comments and
suggestions -- I will consider them for the next revision of the
document.



That's the main point, yes. The primary motivation for writing this
specification was to provide a starting point for engineers, together
with the reasoning behind the choices that I have made. Obviously,
there's no way for a single structure definition to fit everyone's
needs. I do think, though, that within the rigid rules of this
proposal, there's enough flexibility for customization without loss of
most benefits.

Further comments and suggestions are welcome!

cheers,
saar.

I was just wondering: with your project structure, where would
vendor-specific cores fit in? Let's say I use Core Generator to
generate a FIR filter. Coregen generates a .xco file along with .ngc,
.mif, .vhd (for simulation), etc. Where would you put them in your
structure, and which files would you add to your version control?
Technically, you only need the .xco file, and maybe the .coe file, to
regenerate your core. However, if you have a lot of cores, it might
take a lot of time to regenerate the missing core files.
 

saar drimer

I was just wondering: with your project structure, where would
vendor-specific cores fit in? Let's say I use Core Generator to
generate a FIR filter. Coregen generates a .xco file along with .ngc,
.mif, .vhd (for simulation), etc. Where would you put them in your
structure, and which files would you add to your version control?
Technically, you only need the .xco file, and maybe the .coe file, to
regenerate your core. However, if you have a lot of cores, it might
take a lot of time to regenerate the missing core files.

Some of this is already covered in the document; see the "build
environment" section. This is the Makefile bit (I've noted that I'm
inconsistent with "/source" and "/sources", and will fix it in the
next revision):

CGP = fifo

# Generate black boxes
# (Unfortunately, CoreGen will generate the core where the .cgp and
# .xco files are, NOT from where it is invoked. There is no output
# path directive either. Hack is to copy the source files first)
%.v %.cgp:
	cp ../sources/blackbox/$(CGP).cgp $(SYN)/$(CGP).cgp
	cp ../sources/blackbox/$(CGP).xco $(SYN)/$(CGP).xco
	cd $(SYN) && coregen -p $(CGP).cgp -b $(CGP).xco

bbox: $(SYN)/$(CGP).v $(SYN)/$(CGP).cgp

I'm suggesting that the required source files (in this case, .xco
and .cgp) be put in version control at "/sources/blackbox" (for a
single core). The FIR filter and FIFO source files could sit in
"/sources/blackbox/fir_filter" and "/sources/blackbox/fifo",
respectively. "blackbox" isn't special -- they could similarly be
placed in "/sources/fir_filter" and "/sources/fifo", and the Makefile
changed accordingly. Another valid option is to consider these cores
as submodules with a build of their own, though I decided not to do
that because these modules typically aren't significant enough to
justify it.

Generated files (.vhd/.v, .ngc, etc.) are placed in
"/build/synthesis/<core name>/", for example. The Makefile should know
when to regenerate the required files if they are missing. Indeed,
generating the .ngc files takes a long time, so
"/build/synthesis/<core name>/" shouldn't be deleted too often... this
can also be set up in the Makefile (i.e., selective cleaning of the
various generated directories, as I have it in the example Makefile in
the document for "syn" and "imp").

cheers,
saar.
 

Pontus

Considering that you may want to use a module as a submodule,
parent modules will need to be able to find your submodule's
generated macros (black-boxes) for their build (sim/synt/par).

If you start copying files within a submodule you will not
succeed in cleaning it (from a parent module's view), or you need
to clean the copies as well. And which one should the parent use:
the original, or the copy?

I found that, as long as I know where the submodule's generated
items are stored, I could, for simulation, use a configuration
to override paths using generics [VHDL]. For par (ngdbuild),
use -sd to point to the macro. Synthesis treats it as a black box,
but I guess timing to/from the black box could/should be possible?

In the cleaning process I excluded removing any macros (since they
are quite time-consuming to regenerate); however, the build process
still requires the macro control file (e.g. .xco) to be older than
the macro result files (e.g. .mif, .ngc, etc.).

-- Pontus
 
