Trivia: Where are you on the HDL Map?

"cfelton" <cfelton@n_o_s_p_a_m.n_o_s_p_a_m.ieee.org> writes:

Duration  HDL         Target Tech
---------------------------------
9         Verilog     FPGA
6         Verilog     FPGA
6         Verilog     ASIC
2         Verilog/SV  FPGA
3         Verilog     ASIC
6         VHDL        FPGA
3         VHDL        FPGA
5         VHDL        FPGA
3         Verilog     FPGA
3         MyHDL       FPGA
3         Verilog     FPGA
3         Verilog     ASIC
Thanks for that list Chris - would you be able to comment any more on
the MyHDL project? What's it (MyHDL) like as a design environment?

Cheers,
Martin

--
martin.j.thompson@trw.com
TRW Conekt - Consultancy in Engineering, Knowledge and Technology
http://www.conekt.co.uk/capabilities/39-electronic-hardware
 
<snip>
Thanks for that list Chris - would you be able to comment any more on
the MyHDL project? What's it (MyHDL) like as a design environment?

Cheers,
Martin
MyHDL is a module/library for Python. It adds RTL to the Python
language. You can simulate your RTL design in Python, then convert to
Verilog/VHDL for synthesis. You can co-simulate MyHDL (Python) with the
Verilog/VHDL.

You would use the same tools (editors, debugger, etc.) that you use with any
Python development.

I find MyHDL very useful in developing IP. You might have many flavors of
a particular IP, and Python can be useful for managing it all. Things like
managing larger numbers of registers: in Python you can easily dump all the
registers to CSV, HTML, etc., without needing an external tool to
parse them.
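The register-dump idea Chris mentions is easy to sketch in plain Python. The register map below is entirely hypothetical (the names, addresses, and widths are mine, not from any real MyHDL project), but it shows how little machinery a CSV dump needs:

```python
import csv
import io

# Hypothetical register map: name -> (address, width, reset value).
# In a real MyHDL project these might be gathered from Signal objects.
REGISTERS = {
    "ctrl":   (0x00, 8, 0x00),
    "status": (0x04, 8, 0x01),
    "baud":   (0x08, 16, 0x0364),
}

def dump_registers_csv(regs):
    """Render the register map as CSV text -- no external parsing tool."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["name", "address", "width", "reset"])
    # Emit rows in address order so the dump reads like a memory map.
    for name, (addr, width, reset) in sorted(regs.items(),
                                             key=lambda kv: kv[1][0]):
        writer.writerow([name, hex(addr), width, hex(reset)])
    return buf.getvalue()

print(dump_registers_csv(REGISTERS))
```

The same loop could just as easily emit an HTML table, which is the point: the register data lives in ordinary Python structures, so any output format is one small function away.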

For more information see the Why MyHDL page: http://www.myhdl.org/doku.php/why.

Hope that helps,

.chris

---------------------------------------
Posted through http://www.FPGARelated.com
 
"cfelton" <cfelton@n_o_s_p_a_m.n_o_s_p_a_m.ieee.org> writes:

snip

Thanks for that list Chris - would you be able to comment any more on
the MyHDL project? What's it (MyHDL) like as a design environment?

Cheers,
Martin



Hope that helps,
Thanks Chris - I should've been clearer in my question, sorry!

I have some familiarity with MyHDL, and a lot with Python.

FWIW, last time I tried it (for fun, not work - and probably version
0.5), it seemed to me that MyHDL was pretty low-level (on a par with
Verilog) at the "synthesisable" end - bags of bits, arithmetic, but
not attempting to turn complex data structures into HDL. But you *do*
have the benefit of a hugely powerful language at the testbench end.

Reading your other post (which I'd missed when I asked the question),
it sounds like MyHDL is your preferred design environment - is it the
value of Python in the testing phase you see as being paramount? Are
there also compelling "RTL-level" benefits (compared to VHDL)?

Thanks,
Martin

--
martin.j.thompson@trw.com
TRW Conekt - Consultancy in Engineering, Knowledge and Technology
http://www.conekt.co.uk/capabilities/39-electronic-hardware
 
<snip>
I have some familiarity with MyHDL, and a lot with Python.

FWIW, last time I tried it (for fun, not work - and probably version
0.5), it seemed to me that MyHDL was pretty low-level (on a par with
Verilog) at the "synthesisable" end - bags of bits, arithmetic, but
not attempting to turn complex data structures into HDL. But you *do*
have the benefit of a hugely powerful language at the testbench end.

Reading your other post (which I'd missed when I asked the question),
it sounds like MyHDL is your preferred design environment - is it the
value of Python in the testing phase you see as being paramount? Are
there also compelling "RTL-level" benefits (compared to VHDL)?
Yes, mainly having Python as an environment and the power it provides for
testbenching. As you eluded, MyHDL is an RTL. It is not a higher-level
abstraction, it is at the same level as Verilog/VHDL (no intention of being
a higher abstraction level).

I believe it adds a lot of power (features that are being added in
SystemVerilog and VHDL 2008). I also believe that MyHDL, in many cases,
combines the best of both worlds, Verilog and VHDL.

In my case, I can directly generate a complex stimulus, send it to my UUT,
collect the results, and analyze it all in the same environment (script).
Previously, I would generate this in Matlab, export to a file or try and
use the clumsy (and expensive) HDL/sim interfaces, write to a file, import
back ...

Now I can let my designs simulate and I have all my signal processing
analysis waiting for me at the end. Or I can interactively (in a shell)
run the simulation. If I have an FFT core, I can call it (run x cycles),
get the output, plot the output, etc.
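The stimulus/run/analyze loop Chris describes can be sketched in plain Python. The "UUT" here is a toy moving-average filter standing in for a real design (a real flow would drive a MyHDL or co-simulated HDL model instead), but the shape of the workflow -- generate, run, and analyze in one script, no file round-trips -- is the point:

```python
import math

def make_stimulus(n, freq=0.05):
    # Quantised sine stimulus, standing in for whatever the UUT needs.
    return [round(127 * math.sin(2 * math.pi * freq * i)) for i in range(n)]

def uut_moving_average(samples, taps=4):
    # Toy stand-in for the UUT: a simple moving-average filter,
    # the kind of thing a MyHDL block would compute cycle by cycle.
    out = []
    window = [0] * taps
    for s in samples:
        window = window[1:] + [s]
        out.append(sum(window) // taps)
    return out

# Generate, simulate, and analyse all in the same script.
stim = make_stimulus(200)
resp = uut_moving_average(stim)
print("peak response:", max(abs(v) for v in resp))
```

From an interactive shell the same functions can be called piecemeal -- run a few hundred cycles, plot, adjust, rerun -- which is the workflow being contrasted with the Matlab-export-import loop.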

Hope that helps,
Chris

 
<snip>
Yes, mainly having Python as an environment and the power it provides for
testbenching. As you eluded, MyHDL is an RTL. It is not a higher-level
abstraction, it is at the same level as Verilog/VHDL (no intention of
being a higher abstraction level).
Alluded to, not "eluded" - sorry.

.chris

 
On 2/7/2011 7:49 PM, cfelton wrote:
snip

Thanks for that list Chris - would you be able to comment any more on
the MyHDL project? What's it (MyHDL) like as a design environment?

Cheers,
Martin



MyHDL is a module/library for Python. It adds RTL to the Python
language. You can simulate your RTL design in Python, then convert to
Verilog/VHDL for synthesis. You can co-simulate MyHDL (python) with the
Verilog/VHDL.

You would use the same tools (editors, debugger, etc) that you use with any
Python development.

I find MyHDL very useful in developing IP. You might have many flavors of
a particular IP, and Python can be useful for managing it all. Things like
managing larger numbers of registers: in Python you can easily dump all the
registers to CSV, HTML, etc., without needing an external tool to
parse them.

For more information see the Why MyHDL page: http://www.myhdl.org/doku.php/why.
I looked at MyHDL several years ago and gave up when I could
not prove that it worked using my standard ModelSim/Quartus VHDL flow.
Thanks to your review, I looked again, and it seems this has been fixed:
http://www.myhdl.org/doc/0.7/whatsnew/0.6.html#toVHDL
Maybe I will try to torture it again ;)

-- Mike Treseler
 
In my travels I have found that -- in the US at least -- HDL choice is
very strongly correlated to location: designers on the west coast tend
to use Verilog instead of VHDL; the converse is true of folks on the
east coast.

Designers on each coast like to adopt a cosmopolitan air and claim to be
_completely agnostic_ about which language that they'd prefer to use --
at least during the job interview. Then when it comes time to actually
write lines of code, most of them will kick and scream (or at least
quietly hyperventilate) if they don't get to use the language that
they're accustomed to.

So -- where are you from, and what HDL do you use? Have you seen
patterns of language use in your area change in the last decade or so?

I'm particularly interested in hearing from folks outside the US, and
from folks in the US but not on the coasts. Noting whether you're from
a military hardware background or purely civilian is of interest, too.
Location: England

After a number of years doing ASIC/CPLD/FPGA/other design with proprietary
netlists and schematics, my first HDL project was in 1999. Because it was
military, it had to be VHDL. Since then I haven't done any significant
level of Verilog. I have done civilian, military and "dual-use" projects
since then.

I like VHDL - the strong typing, the function/process overloading, the
verbose syntax, the long_and_descriptive signal/process/etc. names.

Because of the available frameworks, it looks like SystemVerilog would be
good for testbenches - perhaps I'll get a chance to use it in the next 10
years.

HTH!


 
Speaking of alternative ways... I should tell you about my latest project, a
bi-"linear" image scaler (up+down) with very wide scaling range (and
tweakable "linear" function for improved image on large scaling factors).

This is mainly developed in OPENOFFICE CALC! :) And I do the simulation in
real-time as I correct formulas. I make sure to write formulas in a logic
style, so they can be translated to VHDL.

I can now put pixels into a cell area to simulate the input image, and I get
an area with the output image (hex values, because the spreadsheet can't turn
RGB values into cell colors (yet)).
I'm about to translate the formulas into VHDL and simulate a bit to verify
that the result matches the spreadsheet (which means I've translated the
spreadsheet correctly into VHDL). It has turned out to work very well so far.
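For illustration, here is roughly what one bilinear "cell" formula looks like when written integer-only, the way a spreadsheet cell (or later the VHDL) has to hold it. The fixed-point width and the formula itself are my assumptions for the sketch, not Morten's actual design:

```python
def bilerp(p00, p01, p10, p11, fx, fy, frac_bits=8):
    """One bilinear output pixel from its four input neighbours.

    fx and fy are fixed-point fractions in [0, 2**frac_bits); keeping
    everything in integers mirrors what a spreadsheet cell formula (or a
    VHDL expression) can express directly.
    """
    one = 1 << frac_bits
    top = p00 * (one - fx) + p01 * fx      # interpolate along x, top row
    bot = p10 * (one - fx) + p11 * fx      # interpolate along x, bottom row
    return (top * (one - fy) + bot * fy) >> (2 * frac_bits)

# fx = fy = 0: output is exactly the top-left pixel.
print(bilerp(10, 20, 30, 40, 0, 0))      # 10
# fx = fy = half: output is the average of all four.
print(bilerp(10, 20, 30, 40, 128, 128))  # 25
```

Because the formula is pure integer arithmetic with known bit widths, the spreadsheet result and the eventual VHDL can be compared bit-for-bit, which is what the verification step above relies on.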

What would make this a lot simpler is if the spreadsheet could do these simple
things:
-Take formula parameters with labels from the row above. Now it can only
work with labels from the current row, unless you want to write some complex
offset functions in there, but that will make the formula unreadable.
-Color a cell from RGB values (would give REAL images as test data)

To improve my current situation I could probably make a separate sheet named
'prev' for the previous variables and access the previous states with a
prev.<label>, but that would require me to pay extra attention when
inserting or deleting columns. I guess I could easily make some conditional
formatting that would help me detect if the columns were out of sync, so I
may try that to make it even simpler. Maybe some scripting of OpenOffice
would help translate the sheet directly to some other HDL format that
FPGA tools can work on.

Now, what irritates me is that there is no FPGA-adapted tool that works
this way. In my head I can visualize how it should work; it would be
very similar to a spreadsheet, and you could even get real-time information
about fan-outs (non-optimized of course) and signal type (combinatorial or FF
or other black boxes). If the days had more hours, I would start coding it
myself :)

[In the end I also plan to do two pixels per clk for high bandwidth support
:) ]
 
"rickman" <gnuarm@gmail.com> wrote in message
news:81397eb3-d6c4-40f9-8201-ddeafa666456@y12g2000prf.googlegroups.com...
What exactly does the spreadsheet do that you can't do in an HDL
simulation? The array calculations would be very easy to do in a loop
so that you only need to write the formula for the cell once and it
can be iterated over as large an array as you wish. You could even
write the data out to a file to be read into another program that
displays the results as an image.
Here are some.
-openoffice is free. Compare that to any other simulator package. Of course
at this stage I need to verify my stuff with a proper simulator.
-Its much faster to change something and see the result instantly.
-I find the syntax is easier. I don't have to worry about pleasing the vhdl
at design time, and can focus on the function rather than spelling and
formatting. I can worry about that at implementation phase.
-Moving data between apps and files are slowing me down (at design time).
 
On Feb 10, 4:51 am, "Morten Leikvoll" <mleik...@yahoo.nospam> wrote:
Speaking of alternative ways... I should tell you about my latest project, a
bi-"linear" image scaler (up+down) with very wide scaling range (and
tweakable "linear" function for improved image on large scaling factors).

<snip>
What exactly does the spreadsheet do that you can't do in an HDL
simulation? The array calculations would be very easy to do in a loop
so that you only need to write the formula for the cell once and it
can be iterated over as large an array as you wish. You could even
write the data out to a file to be read into another program that
displays the results as an image.
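Rick's point about writing the per-cell formula once and iterating it can be sketched like this; the row-scaling formula is a placeholder of my own, not anyone's actual design:

```python
def scale_row(row, out_len, frac_bits=8):
    # One interpolation formula, written once and iterated in a loop --
    # the loop replaces the spreadsheet's grid of copied cell formulas.
    one = 1 << frac_bits
    step = ((len(row) - 1) * one) // max(out_len - 1, 1)
    out = []
    for i in range(out_len):
        pos = i * step                         # fixed-point source position
        idx, frac = pos >> frac_bits, pos & (one - 1)
        nxt = min(idx + 1, len(row) - 1)       # clamp at the row's edge
        out.append((row[idx] * (one - frac) + row[nxt] * frac) >> frac_bits)
    return out

print(scale_row([0, 100, 200], 5))
```

Written this way the formula exists in exactly one place, so a change propagates everywhere at once, and the output could be dumped to a file for an image viewer -- the workflow Rick is describing.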

Rick
 
On Feb 10, 9:06 am, "Morten Leikvoll" <mleik...@yahoo.nospam> wrote:
"rickman" <gnu...@gmail.com> wrote in message

news:81397eb3-d6c4-40f9-8201-ddeafa666456@y12g2000prf.googlegroups.com...

What exactly does the spreadsheet do that you can't do in an HDL
simulation?  The array calculations would be very easy to do in a loop
so that you only need to write the formula for the cell once and it
can be iterated over as large an array as you wish.  You could even
write the data out to a file to be read into another program that
displays the results as an image.

Here are some.
-openoffice is free. Compare that to any other simulator package. Of course
at this stage I need to verify my stuff with a proper simulator.
FreeHDL -http://freehdl.seul.org/
GHDL - http://ghdl.free.fr/
GPL Cver - http://sourceforge.net/projects/gplcver/
Icarus Verilog - http://en.wikipedia.org/wiki/Icarus_Verilog

There are others...

-It's much faster to change something and see the result instantly.
I guess that depends on what you mean by "instantly". I'm not clear
on what it takes to edit your spreadsheet. I have built large arrays
in spread sheets and found the editing to be the PITA part. As you
mention in your earlier post there are things you have left out
because typing them in would be very tedious. There are times that I
have done parts of the calculations on another worksheet and let the
user interface sheet link to the results. Still, it can get very
messy to edit.


-I find the syntax is easier. I don't have to worry about pleasing the vhdl
at design time, and can focus on the function rather than spelling and
formatting. I can worry about that at implementation phase.
To each his own. I find the spreadsheet language to be very verbose
for anything beyond the fairly simple calcs.


-Moving data between apps and files are slowing me down (at design time).
I'm not sure I understand this. In an HDL you are working 100% in the
final language.

I'm not trying to criticize your method. I just don't understand it
exactly. But I do appreciate the degree of interactivity you can get
using a spread sheet. I often use it for evaluating what if type
things. But I give up when the work gets too complex. Recently I
even entered a cyclical calculation that the spreadsheet complained
about, then went on to recalculate repeatedly in order to find
convergence. I hadn't known you could use it that way. But when the
convergence blows up, I get no info on why... just like any other
simulation.
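The repeated-recalculation trick Rick describes is ordinary fixed-point iteration. A small sketch (the equation x = cos(x) is my example, not his) shows how keeping the trajectory visible provides exactly the diagnostic information the spreadsheet hides when convergence blows up:

```python
import math

def iterate(f, x0, tol=1e-12, max_iter=1000):
    # Recalculate x = f(x) until it stops moving, like a spreadsheet with
    # iterative recalculation enabled -- but keep the step count and the
    # last value, so a divergence can actually be diagnosed.
    x = x0
    for n in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt, n + 1
        x = nxt
    raise RuntimeError(f"no convergence after {max_iter} steps; last x = {x}")

# x = cos(x) converges to ~0.739 from almost any starting point.
root, steps = iterate(math.cos, 1.0)
print(root, steps)
```

When the iteration diverges, the exception carries the last value reached, which is the "why did it blow up" information missing from the spreadsheet's silent recalculation.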

Rick
 
On 02/04/2011 12:10 PM, Nico Coesel wrote:

IMHO schematics are bad. Most designs are becoming pretty complicated
these days. If you still think in logic then you'll lose oversight at
some point. With VHDL I have learned not to think in logic but think
in functions. Parallel functions to be exact.

Yes, I have come around to this view. I started out with CPLDs like the
Xilinx 9500 series, and a schematic could express what I was doing there
quite succinctly. I was happy with these, but hated the Ghastly Aldec
schematic editor that Xilinx Foundation used at the time. I worked up a
way to make structural VHDL from my preferred schematic/PCB editor
(Protel99 SE). This required hand editing for some small things and
mostly worked, but the library parts (4-bit counter and the like)
provided by Protel were a GHASTLY hack-job and full of errors of the
sort where a needed junction (wire dot) was left out, so only one FF of
the 4 was clocked. So, I had to find and fix all those goofs.
But, as the stuff I did in FPGAs grew in complexity, schematics got more
and more unwieldy. Migration away from Aldec to several generations of
incompatible schematic packages also made things more difficult.

I have a number of legacy designs that I continue to make improvements
on and migrate to newer FPGA families. I have converted many of the
sub-sheets to behavioral VHDL, and now do anything new in VHDL. I am in
St. Louis, MO, so midway between the coasts. I pretty much do all this
work in isolation, so there's no corporate cultural bias here. VHDL was
more "open" back when, meaning more books were available at the library
giving design strategy help, so that's what I learned to use. I can see
that Verilog is a LOT better if you are doing a lot of arithmetic
operations mixed with logical stuff, but the syntax still seems a bit
strange to me. I have done some stuff that had arithmetic and
addition/subtraction of values in VHDL, and once I got the basic syntax
for type conversion, it wasn't bad at all. I kind of worry when
languages make a lot of assumptions about what I'm trying to do;
I don't mind having to be a bit specific.

Jon
 
