EDK : FSL macros defined by Xilinx are wrong

philipnchill@gmail.com wrote:
> VHDL leads to verbose designs which are slow to write and hard to visualise.

I used to write structural verilog, but for a recent project,
structural VHDL. I do find it wordier, but not all that much
harder to read or write.

For people who started working on logic in the 7400 TTL days, it should
be possible to visualize VHDL in much the same way as TTL gates
(especially the MSI parts, such as counters and encoders).

I did some AHDL some years ago, but don't remember much about it now.

-- glen
 
On Wednesday, May 20, 2015 at 9:01:41 AM UTC-3, thomas....@gmail.com wrote:
I think you answered a question from the previous century...

Thomas

Previous millennium!!!

As for the matter itself: if you have been in the world of programming for a long time, you get a sense of how much it takes to make a code/software transition. Programming/coding really is not what it is portrayed to be. Everyone assumes transitions from older things to newer things are smooth and better, because newer things have compatibility modes and are better written, but this is still not the case (and it is difficult to say if it ever will be). Think about how much you can achieve with C89. Newer languages have more capabilities, but they don't have the same kind of support, and since C is a low-level language, with enough effort (sometimes TOO much) you can achieve the same functionality as high-level languages.

As for VHDL versus AHDL specifically, AHDL has no SERIOUS advantage over VHDL, only easier coding, and sometimes ease of coding translates into more errors (real errors in the design, not compiler errors; nobody cares about those, since with enough experience you get past all compiler errors and your design might still not work). This kind of ease of coding can only be an advantage to someone with very little experience (hence not a relevant enough reason for everyone to make a transition to that language).

I also have my own opinion on VHDL versus Verilog: I find VHDL a far more capable language than Verilog in all but a few respects (until recently in VHDL we had to pre-declare arrays of SLVs with a fixed element width), it just takes more time to get used to. (SystemVerilog is coming close to VHDL in some respects, though, and in others surpasses it by far, even compared to VHDL-2008.)
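
That arrays-of-SLVs point can be sketched in a few lines (the package and type names here are made up for illustration): before VHDL-2008, an array type had to fix the width of its std_logic_vector element when the type was declared, while VHDL-2008 allows the element itself to be unconstrained.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

package slv_array_pkg is
  -- Pre-VHDL-2008: the element width is baked into the type,
  -- so every SLV width used in the design needs its own array type.
  type slv8_array is array (natural range <>) of std_logic_vector(7 downto 0);

  -- VHDL-2008: the element may itself be unconstrained; one type
  -- covers all widths, constrained per object where it is used.
  type slv_array is array (natural range <>) of std_logic_vector;
end package slv_array_pkg;

-- Usage (VHDL-2008): both bounds are supplied at the signal declaration:
--   signal regs : work.slv_array_pkg.slv_array(0 to 3)(15 downto 0);
```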

Cheers, and choose the language that gives you fewer design bugs.
 
On 5/19/2015 3:24 PM, philipnchill@gmail.com wrote:
VHDL leads to verbose designs which are slow to write and hard to visualise.
AHDL is elegant and specifically designed for FPGA-type architectures.

Religious wars never have a winner. I pick Kirk over Picard.

JJS
 
On Wed, 20 May 2015 05:01:37 -0700 (PDT)
thomas.entner99@gmail.com wrote:

I think you answered a question from the
previous century...

Thomas

I wasn't going to say that, but perhaps Philip
should look in the opposite direction, to the
future.

MyHDL encourages test-driven development, has
a short edit-test cycle, using just one tool,
and is supported by the power of python.

It does not have the verbosity of VHDL, nor the
subtle problems of Verilog, but can export to either.

Jan Coombs.
 
John,

you might want to look at the 4DSP products (http://www.4dsp.com). They have both FPGA PCIe boards and DAC FMC daughter boards.

Regards,

Guy Eschemann
FPGA Consultant
http://noasic.com


On Tuesday, June 9, 2015 at 12:32:35 AM UTC+2, John Larkin wrote:
I got a call from a really nice guy who has a tiny company in the
Bahamas. Our gear is too expensive for his application, but it could
be done with a PCIe PC-plugin board that has an FPGA and a fast DAC.
It would need analog bandwidth in the 30 MHz range, maybe 100M
samples/sec or so. He would need help to program the FPGA, since the
signal set that he needs to generate is kind of weird, but not
actually super complex.

So, does anybody know of an existing board that would work? I'd expect
that lots of people make stuff like this and are used to helping
customers customize them. We're talking tens of systems here, not
enough for a custom board design.

I'll do a little googling myself, but I thought I'd ask.


--

John Larkin Highland Technology, Inc
picosecond timing precision measurement

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com
 
On 7/29/2015 8:57 PM, foxaudioresearch@gmail.com wrote:
This is a very nice testimonial to some Forth hacking.

http://hackaday.com/2015/07/28/open-source-fpga-toolchain-builds-cpu/

Youtube video is here https://www.youtube.com/watch?v=rdLgLCIDSk0

Not intending to rain on anyone's parade, just asking a question. Why
does the forth community care if the FPGA development tools are open
source? The commercial tools don't really restrict the design of a CPU
in any way, do they? The only real issue I've ever had with the
commercial FPGA tools is the license, which is a PITA to deal with once a
year when it has to be renewed or when moving to a new machine.

Is it more a conceptual thing than a practical thing?

BTW, there is a very small board with the smallest FPGA on it which fits
the pinout of an 8-pin DIP version of a CPU. Here is the post I made
about this in the FPGA related groups with a couple more links...

I am very impressed. I was reading about Antti's incredibly tiny FPGA
project board and saw a mention of a FOSS FPGA toolchain. Not just the
compiler, but the entire bitstream generation!

http://hackaday.com/2015/07/03/hackaday-prize-entry-they-make-fpgas-that-small/


Several people have built on each other's work to provide "a fully open
source Verilog to bitstream development tool chain for the Lattice
iCE40LP with support for more devices in the works."

http://hackaday.com/2015/05/29/an-open-source-toolchain-for-ice40-fpgas/

https://github.com/cseed/arachne-pnr

I haven't tried any of it yet, but I am very impressed that they are
reverse engineering the devices so that they can generate bit streams
and not rely on the vendor.

I found another link relating to the tools called "IceStorm".

http://www.clifford.at/icestorm/

Cross-posting to the FPGA group to get some cross-pollination.

--

Rick
 
On Wednesday, August 5, 2015 at 11:30:58 PM UTC+2, Philipp Klaus Krause wrote:
On 05.08.2015 01:46, rickman wrote:
On 8/4/2015 7:05 PM, Aleksandar Kuktin wrote:

Hackability. If you have an itch, you can scratch it yourself with FOSS
tools. If you discover a bug, you can fix it yourself. If you want to
repurpose, optimize or otherwise change the tool, you can do it with
FOSS.

That's great. But only important to a small few. I use tools to get
work done. I have zero interest in digging into the code of the tools
without a real need. I have not found any bugs in the vendor's tools
that would make me want to spend weeks learning how they work in the,
most likely, vain hope that I could fix them.

I think FOSS is great and I am very happy to see that finally happen in
an end to end toolchain for an FPGA. But it is statements like this
that I don't understand, "An open-source toolchain for the IGLOO parts
could be an unusually powerful tool in the hands of a creative
designer", or this "Because open source tools allow exploration of
techniques which are restricted using regular tools."

Not trying to give anyone grief. I'd just like to understand what
people expect to happen with FOSS that isn't happening with the vendor's
closed, but free tools.


Same thing that's happening with compilers all the time.

Just a personal example:
A long time ago I decided to make a few games for the ColecoVision
console. The ColecoVision uses a Z80, and at the time all the other
homebrew game developers used an old DOS eval version of IAR within
Windows. I used the free sdcc compiler. Not always being happy with the
generated code, I started improving it, and later became the maintainer of
the Z80 port.
A few years ago I joined the group for theory of computer science at the
university in Frankfurt as a PhD student. I found that I could apply
graph structure theory in compiler construction. This resulted in some
quite unusual optimizations in SDCC currently not found in any other
compiler.

Philipp

I think C compilers are the piece of software where open source works best, as there is a big user base, many of whom are skilled programmers. So there is both the skill and the motivation to improve the product.

For software not targeted to programmers, the user base must be very large to have sufficient contributors, IMHO.

For FPGA design, the user base is much smaller than for a C compiler. How many of them would really use the open source alternatives when there are very advanced free vendor tools? And how many of them are really skilled software gurus? And have enough spare time? Of course you would find some students who contribute (e.g. for their thesis), but I doubt that it will be enough to get a competitive product and to maintain it. New devices should be supported with short delay, otherwise the tool would not be very useful.

Of course it could be a good playground for students, to have a reference for future "real" jobs in the EDA field, but then the tool would not aim to be really used by the average FPGA designer...

BTW: Thanks for your contribution to SDCC, we ported it to our ERIC5 soft-core many years ago. We also found quite a few bugs at that time...

Thomas
 
thomas.entner99@gmail.com wrote:
> On Wednesday, August 5, 2015 at 11:30:58 PM UTC+2, Philipp Klaus Krause wrote:

(snip on open source hardware design tools)

Same thing that's happening with compilers all the time.

Just a personal example:
A long time ago I decided to make a few games for the ColecoVision
console. The ColecoVision uses a Z80, and at the time all the other
homebrew game developers used an old DOS eval version of IAR within
Windows. I used the free sdcc compiler. Not always being happy with the
generated code, I started improving it, and later became the maintainer of
the Z80 port.

(snip)
I think C compilers are the piece of software where open source
works best, as there is a big user base, many of them are
skilled programmers. So there is both the skill and motivation
to improve the product.

For software not targeted to programmers, the user base must be
very large to have sufficient contributors, IMHO.

I wonder what one would have said before gcc?

It used to be that unix always came with a C compiler, as one
was required to sysgen a kernel. At one point, Sun changed to
a bundled minimal C compiler, and charged for a better one.
That opened a door for gcc that might otherwise not have been there.

For FPGA design, the user base is much smaller than for a
C compiler. How many of them would really use the open source
alternatives when there are very advanced free vendor tools?
And how many of them are really skilled software gurus?
And have enough spare time? Of course you would find some
students who contribute (e.g. for their thesis),
but I doubt that it will be enough to get a competitive
product and to maintain it. New devices should be supported
with short delay, otherwise the tool would not be very useful.

Again, consider before gcc. I suspect that there are many times
more C programmers now than in the 1980s, yet there were enough
to cause gcc to exist.

Of course it could be a good playground for students,
to have a reference for future "real" jobs in the EDA field,
but then the tool would not aim to be really used by the
average FPGA designer...

People use gcc because it works well, and it works well because
people use it, and want it to work well.

But one reason we have free HDL tools (from Xilinx and Altera)
now is related to the competition between them. With only
one FPGA company, there would be no need for competition,
tools could be expensive, and there could be a significant
advantage to FOSS tools.

BTW: Thanks for your contribution to SDCC, we ported
it to our ERIC5 soft-core many years ago. We also found
quite a few bugs at that time...

-- glen
 
rickman <gnuarm@gmail.com> wrote:

(snip, I wrote)
But one reason we have free HDL tools (from Xilinx and Altera)
now is related to the competition between them. With only
one FPGA company, there would be no need for competition,
tools could be expensive, and there could be a significant
advantage to FOSS tools.

I'm not sure the price of the tools is so much related to the
competition between the companies. Hypothesizing only one FPGA company
is not very realistic and it is certainly far down my list of concerns.
I expect the price of tools is much more related to promoting the
"exploration" of the use of FPGAs. If you even have to spend $100, that
makes for a barrier to anyone wanting to start testing the tools. I ran
into this myself in jobs where I wanted to try something, but couldn't
get one dime spent.

OK, but as I understand it Altera started distributing free versions,
and Xilinx followed, presumably for competitive reasons.

As you note, the free versions allowed exploration.

If one hadn't done it first, the other might not have.

I can always find a little free time to spend on
ideas, but spending money almost always goes through a review of some
sort where they want you to show why and the "why" is what you want to
determine.

The way the free market is supposed to work.


-- glen
 
On Tuesday, August 4, 2015 at 4:46:49 PM UTC-7, rickman wrote:
Not trying to give anyone grief. I'd just like to understand what
people expect to happen with FOSS that isn't happening with the vendor's
closed, but free tools.

Here's one example: during development, I'm targeting an FPGA that's several times larger than it needs to be, and the design has plenty of timing margin. So why in the name of Woz do I have to cool my heels for 10 minutes every time I tweak a single line of Verilog?

If the tools were subject to community development, they probably wouldn't waste enormous amounts of time generating 99.9% of the same logic as last time. Incremental compilation and linking is ubiquitous in the software world, but as usual the FPGA tools are decades behind. That's the sort of improvement that could be expected with an open toolchain.

It's as if Intel had insisted on keeping the x86 ISA closed, and you couldn't get a C compiler or even an assembler from anyone else. How much farther behind would we be? Well, there's your answer.

-- john, KE5FX
 
On 8/5/2015 7:46 PM, glen herrmannsfeldt wrote:
But one reason we have free HDL tools (from Xilinx and Altera)
now is related to the competition between them. With only
one FPGA company, there would be no need for competition,
tools could be expensive, and there could be a significant
advantage to FOSS tools.

I'm not sure the price of the tools is so much related to the
competition between the companies. Hypothesizing only one FPGA company
is not very realistic and it is certainly far down my list of concerns.
I expect the price of tools is much more related to promoting the
"exploration" of the use of FPGAs. If you even have to spend $100, that
makes for a barrier to anyone wanting to start testing the tools. I ran
into this myself in jobs where I wanted to try something, but couldn't
get one dime spent. I can always find a little free time to spend on
ideas, but spending money almost always goes through a review of some
sort where they want you to show why and the "why" is what you want to
determine.

--

Rick
 
On 8/5/2015 9:13 PM, glen herrmannsfeldt wrote:
rickman <gnuarm@gmail.com> wrote:

(snip, I wrote)
But one reason we have free HDL tools (from Xilinx and Altera)
now is related to the competition between them. With only
one FPGA company, there would be no need for competition,
tools could be expensive, and there could be a significant
advantage to FOSS tools.

I'm not sure the price of the tools is so much related to the
competition between the companies. Hypothesizing only one FPGA company
is not very realistic and it is certainly far down my list of concerns.
I expect the price of tools is much more related to promoting the
"exploration" of the use of FPGAs. If you even have to spend $100, that
makes for a barrier to anyone wanting to start testing the tools. I ran
into this myself in jobs where I wanted to try something, but couldn't
get one dime spent.

OK, but as I understand it Altera started distributing free versions,
and Xilinx followed, presumably for competitive reasons.

As you note, the free versions allowed exploration.

If one hadn't done it first, the other might not have.

Perhaps, or it was just a matter of time. Clearly the business model
works and I think it was inevitable. MCU vendors understand the
importance and pay for tools to give away. Why not give away a $100 tool
or even a $1000 tool if it will get you many thousands of dollars in
sales? It's the tool vendors who I expect have the bigger problem with
this model.

For FPGAs the funny part is I was told a long time ago that Xilinx
spends more on the software than they do designing the hardware. The
guy said they were a software company making money selling the hardware
they support.


I can always find a little free time to spend on
ideas, but spending money almost always goes through a review of some
sort where they want you to show why and the "why" is what you want to
determine.

The way the free market is supposed to work.

Free market? I'm talking about company internal management. It is so
easy to track every penny, but hard to track your time to the same
degree. Often this is penny wise, pound foolish, but that's the way it
is. I'm clear of that now by working for myself, but I still am happier
to spend my time than my money, lol.

--

Rick
 
On 8/5/2015 9:48 PM, John Miles wrote:
On Tuesday, August 4, 2015 at 4:46:49 PM UTC-7, rickman wrote:
Not trying to give anyone grief. I'd just like to understand what
people expect to happen with FOSS that isn't happening with the
vendor's closed, but free tools.


Here's one example: during development, I'm targeting an FPGA that's
several times larger than it needs to be, and the design has plenty
of timing margin. So why in the name of Woz do I have to cool my
heels for 10 minutes every time I tweak a single line of Verilog?

If the tools were subject to community development, they probably
wouldn't waste enormous amounts of time generating 99.9% of the same
logic as last time. Incremental compilation and linking is
ubiquitous in the software world, but as usual the FPGA tools are
decades behind. That's the sort of improvement that could be
expected with an open toolchain.

It's as if Intel had insisted on keeping the x86 ISA closed, and you
couldn't get a C compiler or even an assembler from anyone else. How
much farther behind would we be? Well, there's your answer.

Don't know about Intel, but I seem to recall that Xilinx tools have
incremental compilation. Maybe they have dropped that. They dropped a
number of things over the years such as modular compilation which at one
point a Xilinx representative swore to me was in the works for the lower
cost Spartan chips and would be out by year end. I think that was over
a decade ago.

Even so, there are already FOSS HDL compilers available. Do any of them
offer incremental compilation?

I believe the P&R tools can work incrementally, but again, maybe that is
not available anymore. You used to be able to retain a portion of the
routing and keep working on the rest over and over. I think the idea
was to let you have a lot of control over a small part of the design and
then let the tool handle the rest on autopilot.

--

Rick
 
On Wednesday, August 5, 2015 at 8:37:41 PM UTC-7, rickman wrote:
I believe the P&R tools can work incrementally, but again, maybe that is
not available anymore. You used to be able to retain a portion of the
routing and keep working on the rest over and over. I think the idea
was to let you have a lot of control over a small part of the design and
then let the tool handle the rest on autopilot.

If there's a way to do it in the general case I haven't found it. :( I wouldn't be surprised if they could leverage *some* previous output files, but there are obviously numerous phases of the synthesis process that each take a long time, and they would all have to play ball.

Mostly what I want is an option to allocate extra logic resources beyond what's needed for a given build and use it to implement incremental changes to the design. No P&R time should be necessary in about 4 out of 5 builds, given the way my edit-compile-test cycles tend to work. I'm pretty sure there's no way to tell it to do that. It would be nice to be wrong.

-- john, KE5FX
 
On 8/6/2015 1:54 AM, John Miles wrote:
On Wednesday, August 5, 2015 at 8:37:41 PM UTC-7, rickman wrote:
I believe the P&R tools can work incrementally, but again, maybe
that is not available anymore. You used to be able to retain a
portion of the routing and keep working on the rest over and over.
I think the idea was to let you have a lot of control over a small
part of the design and then let the tool handle the rest on
autopilot.


If there's a way to do it in the general case I haven't found it. :(
I wouldn't be surprised if they could leverage *some* previous output
files, but there are obviously numerous phases of the synthesis
process that each take a long time, and they would all have to play
ball.

Mostly what I want is an option to allocate extra logic resources
beyond what's needed for a given build and use it to implement
incremental changes to the design. No P&R time should be necessary
in about 4 out of 5 builds, given the way my edit-compile-test cycles
tend to work. I'm pretty sure there's no way to tell it to do that.
It would be nice to be wrong.

I'm not sure what that means, "allocate extra logic resources" and use
them with no P&R time...? Are you using the Xilinx tools?

--

Rick
 
On 06/08/15 01:46, glen herrmannsfeldt wrote:

People use gcc because it works well, and it works well because
people use it, and want it to work well.

One key difference here is that gcc is written in C (and now some C++),
and it's main users program in C and C++. Although compiler
design/coding is a different sort of programming than most of gcc's
users do, there is still a certain overlap and familiarity - the barrier
for going from user to contributor is smaller with gcc than it would be
for a graphics artist using GIMP or a writer using LibreOffice, or an
FPGA designer using these new tools.

The key challenge for open source projects like this is to develop a
community of people who understand the use of the tools, and understand
(and can contribute to) the coding. Very often these are made by one or
two people - university theses are common - and the project dies away
when the original developers move on. To be serious contenders for real
use, you need a bigger base of active developers and enthusiastic users
who help with the non-development work (documentation, examples,
testing, support on mailing lists) - MyHDL is an example of this in the
programmable logic world.
 
On 06.08.2015 00:52, thomas.entner99@gmail.com wrote:

For FPGA design, the user base is much smaller than for a C compiler.
How many of them would really use the open source alternatives when
there are very advanced free vendor tools? And how many of them are
really skilled software gurus? And have enough spare time? Of course
you would find some students who contribute (e.g. for their
thesis), but I doubt that it will be enough to get a competitive
product and to maintain it. New devices should be supported with
short delay, otherwise the tool would not be very useful.

I don't see the big difference from compilers targeting microcontrollers here.
There are plenty of older FPGA types, such as the Xilinx XC9500, still in
use. A free toolchain for them would be useful, and having advanced
optimizations would be beneficial there as well.
On the microcontroller side, SDCC also mostly targets older
architectures, plus a few newer ones, such as the Freescale S08 and
STMicroelectronics STM8.
You don't need every user to become a developer. A few are enough.

Philipp
 
On Thursday, August 6, 2015 at 9:48:11 AM UTC+8, John Miles wrote:
On Tuesday, August 4, 2015 at 4:46:49 PM UTC-7, rickman wrote:
Not trying to give anyone grief. I'd just like to understand what
people expect to happen with FOSS that isn't happening with the vendor's
closed, but free tools.


Here's one example: during development, I'm targeting an FPGA that's several times larger than it needs to be, and the design has plenty of timing margin. So why in the name of Woz do I have to cool my heels for 10 minutes every time I tweak a single line of Verilog?

If the tools were subject to community development, they probably wouldn't waste enormous amounts of time generating 99.9% of the same logic as last time. Incremental compilation and linking is ubiquitous in the software world, but as usual the FPGA tools are decades behind. That's the sort of improvement that could be expected with an open toolchain.

It's as if Intel had insisted on keeping the x86 ISA closed, and you couldn't get a C compiler or even an assembler from anyone else. How much farther behind would we be? Well, there's your answer.

-- john, KE5FX

Incremental synthesis/compilation is supported by both the Xilinx (ISE and Vivado) and Altera (Quartus) tools, even in the latest versions. One needs to use the appropriate switches/options. Of course, their definition of incremental compile/synthesis may not match yours exactly. They tend to support it more at the block level, using partitions etc.
 
