designing an FPGA

kristoff

Guest
Hi all,

A couple of weeks ago, I was watching the talk by Clifford Wolf on his
open-source FPGA flow at CCC.
(https://www.youtube.com/watch?v=SOn0g3k0FlE)


At the end, he mentions designing an open-source FPGA and the replies he
got when he mentioned the idea to hardware companies. Apart from the
question about the usefulness or economic viability of the idea itself
(1), it did get me thinking.


Question: can I conclude from his remark that -if a hardware company
were to start out with designing an FPGA- the problem is more the
"software" side of things than the actual hardware design of the chip?


Or is this conclusion a bit too easy?



Cheerio! Kr. Bonne.
 
On 2/24/2017 2:32 AM, kristoff wrote:
Hi all,

A couple of weeks ago, I was watching the talk by Clifford Wolf on his
open-source FPGA flow at CCC.
(https://www.youtube.com/watch?v=SOn0g3k0FlE)


At the end, he mentions designing an open-source FPGA and the replies he
got when he mentioned the idea to hardware companies. Apart from the
question about the usefulness or economic viability of the idea itself
(1), it did get me thinking.


Question: can I conclude from his remark that -if a hardware company
were to start out with designing an FPGA- the problem is more the
"software" side of things than the actual hardware design of the chip?


Or is this conclusion a bit too easy?

Nearly two decades ago (well, 15 years anyway) folks from Xilinx said
they spent more on software development than developing the chips.

--

Rick C
 
In 2004 ST started the most serious Open FPGA project I am aware of:

http://web.archive.org/web/20041208022906/http://www.gospl.org/fpl/static/aboutgospl.jsp

Sadly they gave up in early 2005.

-- Jecel
 
On 02/24/2017 08:32 AM, kristoff wrote:
Hi all,

A couple of weeks ago, I was watching the talk by Clifford Wolf on his open-source FPGA flow at CCC.
(https://www.youtube.com/watch?v=SOn0g3k0FlE)


At the end, he mentions designing an open-source FPGA and the replies he got when he mentioned the idea to hardware companies. Apart from the question about the usefulness or economic viability of the idea itself (1), it did get me thinking.


Question: can I conclude from his remark that -if a hardware company were to start out with designing an FPGA- the problem is more the "software" side of things than the actual hardware design of the chip?


Or is this conclusion a bit too easy?



Cheerio! Kr. Bonne.

Hodgin posted this link a while ago:
https://www.youtube.com/watch?v=1oG-3XWLgog
 
rickman wrote:

On 2/24/2017 2:32 AM, kristoff wrote:
Hi all,

A couple of weeks ago, I was watching the talk by Clifford Wolf on his
open-source FPGA flow at CCC.
(https://www.youtube.com/watch?v=SOn0g3k0FlE)


At the end, he mentions designing an open-source FPGA and the replies he
got when he mentioned the idea to hardware companies. Apart from the
question about the usefulness or economic viability of the idea itself
(1), it did get me thinking.


Question: can I conclude from his remark that -if a hardware company
were to start out with designing an FPGA- the problem is more the
"software" side of things than the actual hardware design of the chip?


Or is this conclusion a bit too easy?

Nearly two decades ago (well, 15 years anyway) folks from Xilinx said
they spent more on software development than developing the chips.
It is easy to believe this is true. The FPGA needs a fair bit of careful
design and testing, but the structure is quite simple. You need to make
sure the LUTs are glitchless and that you know the timings. That is about
it.

THEN, you get to the software side. You need to be able to take in all the
suboptimal and blatantly bad HDL that users will throw at it, and at least
give coherent error messages and not crash. There are probably so MANY ways
to write bad HDL that has unintended side effects, races and logic hazards.
While the FPGA is a massively repeated set of very simple identical cells,
the software has to treat it as thousands of different components once the
LUT patterns have been loaded. I'm GLAD somebody else is doing all this
work!

To make your own FPGA, probably the BIGGEST minefield is the patent arena.
There must be thousands of current patents on FPGAs and FPGA-like devices
and tons of old prior art that could make patenting anything you design
problematic. Even if you had no intention of filing for a patent, you'd
have to design very carefully so as not to step on one of the "big boys'"
patents. There are also lots of protections filed on software IP that you'd
have to avoid.

Jon
 
On 2/24/2017 11:35 AM, Jon Elson wrote:
rickman wrote:

On 2/24/2017 2:32 AM, kristoff wrote:
Hi all,

A couple of weeks ago, I was watching the talk by Clifford Wolf on his
open-source FPGA flow at CCC.
(https://www.youtube.com/watch?v=SOn0g3k0FlE)


At the end, he mentions designing an open-source FPGA and the replies he
got when he mentioned the idea to hardware companies. Apart from the
question about the usefulness or economic viability of the idea itself
(1), it did get me thinking.


Question: can I conclude from his remark that -if a hardware company
were to start out with designing an FPGA- the problem is more the
"software" side of things than the actual hardware design of the chip?


Or is this conclusion a bit too easy?

Nearly two decades ago (well, 15 years anyway) folks from Xilinx said
they spent more on software development than developing the chips.

It is easy to believe this is true. The FPGA needs a fair bit of careful
design and testing, but the structure is quite simple. You need to make
sure the LUTs are glitchless and that you know the timings. That is about
it.

I think you are oversimplifying the design of an FPGA by quite a large
margin. I believe the most important part of FPGAs is the routing and
overall architecture. I am sure they put tons of effort into optimizing
every aspect of the logic as well as the routing for timing and power
consumption, the two most important parameters of FPGAs.

The design of all the various special functions takes no small amount of
effort too; the clock blocks are a good example. Then there are
multipliers and memory, all of which must be optimized for the process.
In fact, my understanding is that the FPGA vendors are a large
contributor to the development of the processes used at the foundries.


THEN, you get to the software side. You need to be able to take in all the
suboptimal and blatantly bad HDL that users will throw at it, and at least
give coherent error messages and not crash. There are probably so MANY ways
to write bad HDL that has unintended side effects, races and logic hazards.
While the FPGA is a massively repeated set of very simple identical cells,
the software has to treat it as thousands of different components once the
LUT patterns have been loaded. I'm GLAD somebody else is doing all this
work!

I don't think writing code to read text and not crash is actually all
that hard. The tool vendors don't care about logic hazards, that is the
domain of the designer.


To make your own FPGA, probably the BIGGEST minefield is the patent arena.
There must be thousands of current patents on FPGAs and FPGA-like devices
and tons of old prior art that could make patenting anything you design
problematic. Even if you had no intention of filing for a patent, you'd
have to design very carefully so as not to step on one of the "big boys'"
patents. There are also lots of protections filed on software IP that you'd
have to avoid.

Certainly if you wish to make a state-of-the-art FPGA it would involve
dodging a great many patents. But to design *an* FPGA would not be so
hard. In fact, expired patents would be your pool of resources to draw
from. The basic LUT used as logic and memory is out of patent along
with everything used in devices like the XC4000 series. If I were
designing a chip and wanted to include FPGA fabric, I could do worse
than to duplicate the functionality of that device.

--

Rick C
 
rickman wrote:


I think you are oversimplifying the design of an FPGA by quite a large
margin. I believe the most important part of FPGAs is the routing and
overall architecture. I am sure they put tons of effort into optimizing
every aspect of the logic as well as the routing for timing and power
consumption, the two most important parameters of FPGAs.
OK, yes, I WAS oversimplifying. My point (badly stated) was that the FPGA
designers hold all the cards, they fully specify the LUTs, the routing
matrix, the IOBs, etc. The SOFTWARE, however, has to deal with all the
pathological and just totally unexpected things people will try to do with
an FPGA. How about designing your own ring oscillator?
The design of all the various special functions takes no small amount of
effort too; the clock blocks are a good example. Then there are
multipliers and memory, all of which must be optimized for the process.
In fact, my understanding is that the FPGA vendors are a large
contributor to the development of the processes used at the foundries.
Yes, I can believe that, too. They really push the boundaries of what can
be done in a process.
I don't think writing code to read text and not crash is actually all
that hard. The tool vendors don't care about logic hazards, that is the
domain of the designer.
OK, maybe not "crash", but produce unintelligible error messages, or just
totally screwy results, with NO error messages. Yes, now I must admit, some
of my legacy designs that have been dragged along from 5V Spartan to Spartan
2E to Spartan 3A still have a bunch of crud left over from their old history
and mediocre hacking. But, I've had a few situations where ISE didn't like
what I'd given it, and had to just recompose some portion of the logic. I
never understood what was wrong with my VHDL, but changing the way I'd
written the equations just slightly made it work. Fortunately, I have had
very few of these situations, and for the most part ISE works amazingly
well, and I'm NOT complaining. I'm just aware that there are so many ways
to structure HDLs, and so many things one can do with it, that it seems very
complicated to make it all work.

Jon
 
On 2/24/2017 11:30 PM, Jon Elson wrote:
rickman wrote:



I think you are oversimplifying the design of an FPGA by quite a large
margin. I believe the most important part of FPGAs is the routing and
overall architecture. I am sure they put tons of effort into optimizing
every aspect of the logic as well as the routing for timing and power
consumption, the two most important parameters of FPGAs.

OK, yes, I WAS oversimplifying. My point (badly stated) was that the FPGA
designers hold all the cards, they fully specify the LUTs, the routing
matrix, the IOBs, etc. The SOFTWARE, however, has to deal with all the
pathological and just totally unexpected things people will try to do with
an FPGA. How about designing your own ring oscillator?

I'm not sure how to even do that in an HDL. I suppose you can have a
second input to each inverter and set bits in a register to enable it.
But it would still have a combinatorial feedback path which would flag
an error in the timing analyzer unless you add an exception for that
path. Where do
you see the problem?

The tools don't cope with a lot of crazy stuff. If the inputs are too
wacky, they just give an error message.
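
A minimal VHDL sketch of that idea - an enable gating an odd-length
inverter chain. The "keep" attribute and the TIG/false-path exception are
assumptions about a Xilinx-style flow, not anything from a real design:

library ieee;
use ieee.std_logic_1164.all;

entity ring_osc is
  generic (STAGES : positive := 5);   -- must be odd for the loop to oscillate
  port (
    enable  : in  std_logic;
    osc_out : out std_logic
  );
end entity;

architecture rtl of ring_osc is
  signal chain : std_logic_vector(STAGES downto 0);
  -- Assumed vendor attribute to keep the optimizer from collapsing the loop.
  attribute keep : string;
  attribute keep of chain : signal is "true";
begin
  -- Gate the loop with the enable; the STAGES inverters provide the odd
  -- number of inversions needed for oscillation.
  chain(0) <= enable and chain(STAGES);

  gen_inv : for i in 1 to STAGES generate
    chain(i) <= not chain(i - 1);
  end generate;

  osc_out <= chain(STAGES);

  -- Plain simulation won't show a real frequency: with no assigned delays
  -- the loop either sits at 'U' or cycles in delta time, and static timing
  -- analysis will flag the combinatorial loop unless that path gets an
  -- exception (a TIG / false-path style constraint) in the constraints file.
end architecture;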


The design of all the various special functions takes no small amount of
effort too; the clock blocks are a good example. Then there are
multipliers and memory, all of which must be optimized for the process.
In fact, my understanding is that the FPGA vendors are a large
contributor to the development of the processes used at the foundries.

Yes, I can believe that, too. They really push the boundaries of what can
be done in a process.



I don't think writing code to read text and not crash is actually all
that hard. The tool vendors don't care about logic hazards, that is the
domain of the designer.


OK, maybe not "crash", but produce unintelligible error messages, or just
totally screwy results, with NO error messages. Yes, now I must admit, some
of my legacy designs that have been dragged along from 5V Spartan to Spartan
2E to Spartan 3A still have a bunch of crud left over from their old history
and mediocre hacking. But, I've had a few situations where ISE didn't like
what I'd given it, and had to just recompose some portion of the logic. I
never understood what was wrong with my VHDL, but changing the way I'd
written the equations just slightly made it work. Fortunately, I have had
very few of these situations, and for the most part ISE works amazingly
well, and I'm NOT complaining. I'm just aware that there are so many ways
to structure HDLs, and so many things one can do with it, that it seems very
complicated to make it all work.

It's all rule-based. As long as you follow the rules it will
synthesize. I remember my first VHDL design. I used some of the
features that seemed useful and that no one told me not to use (it was
an Orcad tool, believe it or not). It was so terrible we switched to
the Xilinx tool (I don't recall the origin of that tool). I was using
'-' for don't cares in comparisons. Then we had a tool update and Xilinx
switched the synthesis engine on us. The don't cares didn't work
anymore, along with many other issues. Back in those days I think the
vendors tried to ignore some aspects of the VHDL standard and let you
get away with some things and not others, and of course each vendor was
different. So I wrote my code three times for that project.
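
For the record, a minimal sketch (mine, using the standard
ieee.numeric_std.std_match rather than whatever that old engine accepted)
of why '-' don't cares misbehave with plain '=':

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity dontcare_demo is
  port (
    sel       : in  std_logic_vector(3 downto 0);
    hit_eq    : out std_logic;
    hit_match : out std_logic
  );
end entity;

architecture rtl of dontcare_demo is
begin
  -- Plain '=' compares element by element, so a '-' in the literal only
  -- matches a literal '-' on the signal; for sel = "1010" this is '0'.
  hit_eq    <= '1' when sel = "1--0" else '0';

  -- std_match treats '-' as a real don't care, so "1010" and "1100" both hit.
  hit_match <= '1' when std_match(sel, "1--0") else '0';
end architecture;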

--

Rick C
 
I've been considering designing my own fpga to go with my Logician
tool. I would call it Si-Block ("sigh-block," or SiB for short). It would
allow unlimited logic with a decreasing performance level the more
complex it got, running on a saturating clock that fires maximally
at frequency, but otherwise only when each cycle completes fully.
It would be inexpensive, with full debug abilities, and room to
expand.

Stack:

0: [SiB hardware]
1: [Logician, simulation + SiB compiler]
2: [HLL Compilers, able to also output to Logician]
3: [IDE / text editors, express in some language]
4: [Human ideas]

Thank you,
Rick C. Hodgin
 
Hi Jecel, Rickman, Jon, Johann, ... all




On 24-02-17 18:23, Jecel wrote:
In 2004 ST started the most serious Open FPGA project I am aware of:
http://web.archive.org/web/20041208022906/http://www.gospl.org/fpl/static/aboutgospl.jsp
Sadly they gave up in early 2005.

Interesting.


Now I do get the impression that open-source hardware (and related
issues) is something that has only now started to come of age.
(Perhaps OpenSPARC was also years too soon.)


Apart from the IceStorm tool, I also found this project: Trollstigen
https://www.youtube.com/watch?v=zMZqzXfjOko

Also here:
https://www.youtube.com/watch?v=04SWPT4d9Ls


I think it's interesting in two ways:

- it's really an open-source FPGA design, so it proves in principle that
this can be done.

- it seems to be based on a different flow: VTR (Verilog to Routing,
https://verilogtorouting.org/), to which they added their own
bitstream generator.

Now, I am far from being an FPGA expert (hence all my questions
here :) ), but from what I understand, this means there are two
options on the table:
- completely open source: open-source FPGA hardware + open-source tools
- closed-source FPGA hardware + open-source tools (except for the
bitstream generator, which would then be closed source).

If I am correct, the actual knowledge of the internals of the FPGA is in
the bitstream generator, so if a company does not want to expose that
information, it would be enough to use a closed-source tool for that.

Feel free to correct me if I am wrong.


> -- Jecel

Cheerio! Kr. Bonne.
 
On 2/25/2017 4:39 AM, kristoff wrote:
Hi Jecel, Rickman, Jon, Johann, ... all




On 24-02-17 18:23, Jecel wrote:
In 2004 ST started the most serious Open FPGA project I am aware of:
http://web.archive.org/web/20041208022906/http://www.gospl.org/fpl/static/aboutgospl.jsp

Sadly they gave up in early 2005.

Interesting.


Now I do get the impression that open-source hardware (and related
issues) is something that has only now started to come of age.
(Perhaps OpenSPARC was also years too soon.)


Apart from the IceStorm tool, I also found this project: Trollstigen
https://www.youtube.com/watch?v=zMZqzXfjOko

Also here:
https://www.youtube.com/watch?v=04SWPT4d9Ls


I think it's interesting in two ways:

- it's really an open-source FPGA design, so it proves in principle that
this can be done.

- it seems to be based on a different flow: VTR (Verilog to Routing,
https://verilogtorouting.org/), to which they added their own
bitstream generator.

Now, I am far from being an FPGA expert (hence all my questions
here :) ), but from what I understand, this means there are two
options on the table:
- completely open source: open-source FPGA hardware + open-source tools
- closed-source FPGA hardware + open-source tools (except for the
bitstream generator, which would then be closed source).

If I am correct, the actual knowledge of the internals of the FPGA is in
the bitstream generator, so if a company does not want to expose that
information, it would be enough to use a closed-source tool for that.

Feel free to correct me if I am wrong.

A company can *try* to keep their chip design info closed, but if a
bunch of amateurs can reverse engineer one company's devices, it
shouldn't be too hard to reverse engineer them all. It is a matter of
time and the need. To date no commercial effort has needed to reverse
engineer FPGA bitstreams. But a bunch of amateurs showed it could be
done.

The trouble with open source hardware is that having one made is not
inexpensive. For board-level items it is not terribly practical unless
it is a niche item you just can't find elsewhere. For chip-level
devices, personal manufacturing is prohibitively expensive. Even
decades-old technology is not cheap to have a minimum run made. Ask Green Arrays.

--

Rick C
 
Hi Jon,

On 24-02-17 17:35, Jon Elson wrote:
THEN, you get to the software side. You need to be able to take in all the
suboptimal and blatantly bad HDL that users will throw at it, and at least
give coherent error messages and not crash. There are probably so MANY ways
to write bad HDL that has unintended side effects, races and logic hazards.
While the FPGA is a massively repeated set of very simple identical cells,
the software has to treat it as thousands of different components once the
LUT patterns have been loaded. I'm GLAD somebody else is doing all this
work!

A couple of days ago, I found this video:
https://media.ccc.de/v/froscon2016-1817-how_to_design_your_own_chip

It's a very good overview of the tools that are out there for VLSI and
FPGA design, testing tools, ... and how hacker/maker/hobbyist-friendly
they are, including some ways to get designs to foundries.

So there seem to be quite a few tools out there.
Concerning FPGAs, there seem to be two tools that go all the way to
generating an FPGA bitstream. Clifford Wolf is now working on the
"timing verification" part of his tools.



To make your own FPGA, probably the BIGGEST minefield is the patent arena.
There must be thousands of current patents on FPGAs and FPGA-like devices
and tons of old prior art that could make patenting anything you design
problematic. Even if you had no intention of filing for a patent, you'd
have to design very carefully so as not to step on one of the "big boys'"
patents. There are also lots of protections filed on software IP that you'd
have to avoid.

I think Rickman made a valid remark that quite a bit of the technology
is based on patents that have expired.

This kind of reminds me of the "codec2" project
(http://www.rowetel.com/?page_id=452).
It was also thought that low-bitrate voice codecs were littered with
patents, but it turned out that it is possible to create a (now down to
700 bps) voice codec based just on public-domain knowledge.


Mind you, I am not saying that this is also the case here, just that it
is "to be verified".




Now, the question that interests me is this:

For an FPGA, you need two things: hardware and software.

If
- the software component is now becoming available as open-source tools
(and let's make the assumption that this will continue to be the case in
the near future),

and
- the hardware component seems to be the less difficult part of the
process (especially for companies that already design and make ICs, even
if not FPGAs),

could the conclusion then be that -in a couple of years- this will
result in more competition in the FPGA market
(probably starting at the low-end part of the market)?


Or is this conclusion a bit too easy?


Jon
Cheerio! Kr. Bonne
 
Question: can I conclude from his remark that -if a hardware company
were to start out with designing an FPGA- the problem is more the
"software" side of things than the actual hardware design of the chip?


Or is this conclusion a bit too easy?

I'd agree with this. I'd say the silicon these days is pretty great and the software is the limitation. It's not for lack of effort, though. I think most of the resources of an FPGA company go into the software side.

I don't quite understand the interest in an open-source FPGA. I don't believe it would be better than what we are getting commercially.

The areas that need the greatest improvement are open to development. There's nothing stopping somebody from making a better synthesizer. Convert HDL to structural code. You can still use this with commercial tools.
 
On Saturday, February 25, 2017 at 18:50:45 UTC+1, Kevin Neilson wrote:
Question: can I conclude from his remark that -if a hardware company
were to start out with designing an FPGA- the problem is more the
"software" side of things than the actual hardware design of the chip?


Or is this conclusion a bit too easy?

I'd agree with this. I'd say the silicon these days is pretty great and the software is the limitation. It's not for lack of effort, though. I think most of the resources of an FPGA company go into the software side.

I don't quite understand the interest in an open-source FPGA. I don't believe it would be better than what we are getting commercially.

The areas that need the greatest improvement are open to development. There's nothing stopping somebody from making a better synthesizer. Convert HDL to structural code. You can still use this with commercial tools.

I basically fully agree (except maybe that P&R might also be an interesting topic, which is pretty closed). But I want to add some further comments:

From a technical point of view, the software is the most difficult. While what Clifford does with Yosys, etc. sounds extremely impressive (I have not tested it yet - no time...), it will be next to impossible to find sufficiently skilled programmers (in both programming and FPGA design) who contribute for free, IMHO.

From a practical point of view, however, the hardware is much more difficult, simply because it is incredibly expensive. Of course you can use a big FPGA to simulate the "new" FPGA for a start. But at some point you will want to have real chips for the whole project to make sense. And then you have a multi-million Euro/dollar project...

You have to find a business case for the one who pays these millions (if you find someone at all...), and it will most likely not be open source?

The next difficult thing about hardware is (besides the patent topic others have already mentioned): Either you make a "me too" hardware, based on 4-input LUTs (the key patents have already expired, I guess) - or you invent some improved, more or less radically new architecture. This would be a big (and interesting) task on its own, of course... Maybe, at the end of Moore's law (?), a clever new architecture could bring a real benefit...

Business cases that come to my mind are things like:
- new FPGA vendor
- offer embedded FPGA technology for ASIC suppliers

You will need to be able to support the customers, etc. and compete against the existing players. There is a reason why so many FPGA start-ups failed. But this does not mean that it cannot be done, of course... Personally I would find it cool to have an Austrian FPGA ;-)

Regards,

Thomas

www.entner-electronics.com - Home of EEBlaster and JPEG Codec
 
Hello,



On 26-02-17 02:36, thomas.entner99@gmail.com wrote:
On Saturday, February 25, 2017 at 18:50:45 UTC+1, Kevin Neilson wrote:
Question: can I conclude from his remark that -if a hardware company
were to start out with designing an FPGA- the problem is more the
"software" side of things than the actual hardware design of the chip?
Or is this conclusion a bit too easy?

I'd agree with this. I'd say the silicon these days is pretty great and the software is the limitation. It's not for lack of effort, though. I think most of the resources of an FPGA company go into the software side.
I don't quite understand the interest in an open-source FPGA. I don't believe it would be better than what we are getting commercially.
The areas that need the greatest improvement are open to development. There's nothing stopping somebody from making a better synthesizer. Convert HDL to structural code. You can still use this with commercial tools.

I basically fully agree (except maybe that P&R might also be an interesting topic, which is pretty closed). But I want to add some further comments:
From a technical point of view, the software is the most difficult. While what Clifford does with Yosys, etc. sounds extremely impressive (I have not tested it yet - no time...), it will be next to impossible to find sufficiently skilled programmers (in both programming and FPGA design) who contribute for free, IMHO.

Well, my guess is that the key to this is "information", or -to put it
otherwise- demystifying the technology.

If you look at the codec2 project (the low-bitrate voice codec): it
started out as a one-person project of David Rowe, but he put a lot of
effort into really explaining the technology and its core concepts.
(There are quite a few videos of presentations about this on the web.)


It is not that this means you end up with hundreds of coders, but it
does really help to "demystify" a technology, provide a "starting point"
for whoever is interested, and get a few people around you.


Also keep in mind that more than 60% of the code in (e.g.) the
Linux kernel is actually written by people who are on the payroll of big
IT companies.




> From a practical point of view, however, the hardware is much more
difficult, simply because it is incredibly expensive. Of course you can
use a big FPGA to simulate the "new" FPGA for a start. But at some point
you will want to have real chips for the whole project to make sense.
And then you have a multi-million Euro/dollar project...
You have to find a business case for the one who pays these millions
(if you find someone at all...), and it will most likely not be open source?
The next difficult thing about hardware is (besides the patent topic
others have already mentioned): Either you make a "me too" hardware,
based on 4-input LUTs (the key patents have already expired, I guess) -
or you invent some improved, more or less radically new architecture.
This would be a big (and interesting) task on its own, of course...
Maybe, at the end of Moore's law (?), a clever new architecture could
bring a real benefit...
Business cases that come to my mind are things like:
- new FPGA vendor
- offer embedded FPGA technology for ASIC suppliers
You will need to be able to support the customers, etc. and compete
against the existing players. There is a reason why so many FPGA
start-ups failed. But this does not mean that it cannot be done, of
course... Personally I would find it cool to have an Austrian FPGA ;-)


Perhaps the most logical community for this are the people who are now
involved in the RISC-V CPU.


One of the ideas I find interesting is what is done by a company called
"SiFive": they provide help to companies that want to integrate the
RISC-V CPU into their own design. (1)


That's also why their business model works very well with the
open-source hardware model. Their product is not the CPU itself, nor a
customised version of the RISC-V, but a service: the process of
integrating and customising the RISC-V for the customer.


In fact, the fact that actually making the hardware is so expensive is
-in this business model- not necessarily a bad thing. Relatively few
people are able to "steal" your effort and create CPUs to compete with
you. This makes donating your work back to the open-source community
less risky than in the software world.



(1) SiFive also offers the HiFive1 (an RV32-based dev kit), designed as
a way to let people get their hands on a RISC-V CPU.



Regards,
Thomas
Cheerio! Kr. Bonne.
 
On 2/25/2017 8:36 PM, thomas.entner99@gmail.com wrote:
On Saturday, February 25, 2017 at 18:50:45 UTC+1, Kevin Neilson wrote:
Question: can I conclude from his remark that -if a hardware
company were to start out with designing an FPGA- the problem
is more the "software" side of things than the actual hardware
design of the chip?


Or is this conclusion a bit too easy?

I'd agree with this. I'd say the silicon these days is pretty
great and the software is the limitation. It's not for lack of
effort, though. I think most of the resources of an FPGA company
go into the software side.

I don't quite understand the interest in an open-source FPGA. I
don't believe it would be better than what we are getting
commercially.

The areas that need the greatest improvement are open to
development. There's nothing stopping somebody from making a
better synthesizer. Convert HDL to structural code. You can still
use this with commercial tools.

I basically fully agree (except maybe that also P&R might be an
interesting topic which is pretty closed). But I want to add some
further comments:

From a technical point of view, the software is the most difficult.
While what Clifford does with Yosys, etc. sounds extremely impressive
(I have not tested it yet - no time...), it will be next to
impossible to find sufficiently skilled programmers (in both
programming and FPGA design) who contribute for free, IMHO.

From a practical point of view, however, the hardware is much more
difficult, simply because it is incredibly expensive. Of course you can
use a big FPGA to simulate the "new" FPGA for a start. But at some point
you will want to have real chips for the whole project to make sense.
And then you have a multi-million Euro/dollar project...

You have to find a business case for the one who pays these millions
(if you find someone at all...), and it will most likely not be open
source?

The next difficult thing about hardware is (besides the patent topic
others have already mentioned): Either you make a "me too" hardware,
based on 4-input LUTs (the key patents have already expired, I
guess) - or you invent some improved, more or less radically new
architecture. This would be a big (and interesting) task on its own,
of course... Maybe, at the end of Moore's law (?), a clever new
architecture could bring a real benefit...

Business cases that come to my mind are things like:
- new FPGA vendor
- offer embedded FPGA technology for ASIC suppliers

You will need to be able to support the customers, etc. and compete
against the existing players. There is a reason why so many FPGA
start-ups failed. But this does not mean that it cannot be done, of
course... Personally I would find it cool to have an Austrian FPGA
;-)

There are lots of great chips out there and as has been done for a few
Lattice parts, they can be reverse engineered to get past the bitstream
issue.

What I would like to see is a different approach to the design software.
In the early days, FPGA placement and routing was done with a large
amount of hand work. As technology progressed the tools did better and
better designs. As designs got larger and larger it became essential
that the software would take over from the designer and essentially
handle all aspects of place and route.

I tend to work on smaller projects where it would be practical for the
designer to be more intimately involved in the placement and routing...
well, the placement anyway. Routing is no fun! I'd like to see tools
that allow construction of logical blocks using the device-provided
primitives in a hierarchical fashion that supports easy placement
control. Once a good placement is found, auto-routing is made much
easier. Yes, this can be done in HDL in theory, but it is not so easy
and it is very verbose.
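
Just to show what I mean by verbose, a minimal sketch (assuming an
ISE-era Xilinx flow: unisim primitives plus the RLOC attribute, not any
particular real design) of hand-instantiated, relatively placed primitives:

library ieee;
use ieee.std_logic_1164.all;
library unisim;
use unisim.vcomponents.all;

entity placed_and_reg is
  port (
    clk  : in  std_logic;
    a, b : in  std_logic;
    q    : out std_logic
  );
end entity;

architecture structural of placed_and_reg is
  signal and_out : std_logic;
  attribute RLOC : string;
  -- Relative placement: keep the LUT and the flip-flop in the same slice.
  attribute RLOC of lut_i : label is "X0Y0";
  attribute RLOC of ff_i  : label is "X0Y0";
begin
  -- 4-input LUT configured as a 2-input AND of I0 and I1 (INIT bit set
  -- only where I1 = I0 = '1', with I2/I3 tied low).
  lut_i : LUT4
    generic map (INIT => X"8888")
    port map (I0 => a, I1 => b, I2 => '0', I3 => '0', O => and_out);

  ff_i : FDRE
    generic map (INIT => '0')
    port map (Q => q, C => clk, CE => '1', R => '0', D => and_out);
end architecture;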

--

Rick C
 
On Friday, February 24, 2017 at 2:43:45 AM UTC-5, kristoff wrote:
... Clifford Wolf ... open-source FPGA flow at CCC ...

Question: can I conclude from his remark that -if a hardware company
were to start out with designing an FPGA- the problem is more the
"software" side of things than the actual hardware design of the chip?

Or is this conclusion a bit too easy?

Kristoff, I am so happy that you started this thread. It has given me
much to think about.

Thank you,
Rick C. Hodgin
 
rickman wrote:

On 2/24/2017 11:30 PM, Jon Elson wrote:
How about designing your own ring oscillator?

I'm not sure how to even do that in an HDL.
Yes, I didn't do it, but we have a design that has a 33 MHz ring oscillator
that is some piece of IP. I never looked to see how it was coded. This is
on a Spartan 3AN part.


I suppose you can have a
second input to each inverter and set bits in a register to enable it.
But it would still have a combinatorial feedback path which would flag
an error in the timing analyzer unless you add an exception for that
path. Where do
you see the problem?

Just that I would expect it to give the simulator indigestion!

I started out with CPLDs, doing schematic entry. Then, I moved to FPGAs,
and schematic entry worked, but led to a lot of maintenance hassles. I
finally saw the light, and learned VHDL.

Jon
 
On 2/27/2017 2:49 PM, Jon Elson wrote:
rickman wrote:

On 2/24/2017 11:30 PM, Jon Elson wrote:
How about designing your own ring oscillator?

I'm not sure how to even do that in an HDL.
Yes, I didn't do it, but we have a design that has a 33 MHz ring oscillator
that is some piece of IP. I never looked to see how it was coded. This is
on a Spartan 3AN part.


I suppose you can have a
second input to each inverter and set bits in a register to enable it.
But it would still have a combinatorial feedback path which would flag
an error in the timing analyzer unless you add an exception for that
path. Where do
you see the problem?

Just that I would expect it to give the simulator indigestion!

Certainly the pre-layout simulation will not oscillate at 33 MHz.
Likely it will either not oscillate or will oscillate with delta delays
(zero time) unless they use specific features to assign delays in
simulation.

I still don't see why this is so hard to deal with in tools. The tools
either see correct inputs or not.


I started out with CPLDs, doing schematic entry. Then, I moved to FPGAs,
and schematic entry worked, but led to a lot of maintenance hassles. I
finally saw the light, and learned VHDL.

I know someone who adamantly insists Verilog is much more productive.
But every time I ask about a good reference book that will teach me how
to avoid the various pitfalls (learn from others' experience rather than
my own) of Verilog I'm told there isn't one. Go figure.

Why did you pick VHDL? Initially it is a PITA to learn. The strong
typing can really tie you up in knots.

--

Rick C
 
rickman wrote:


I know someone who adamantly insists Verilog is much more productive.
But every time I ask about a good reference book that will teach me how
to avoid the various pitfalls (learn from others' experience rather than
my own) of Verilog I'm told there isn't one. Go figure.

Why did you pick VHDL? Initially it is a PITA to learn. The strong
typing can really tie you up in knots.
My understanding is that if you are doing numerical algorithms like
cryptology, FFTs, image processing, and testing them in C, then it is MUCH
easier to convert them to Verilog.

If you are doing much more hardware-y type stuff, then VHDL may be more
direct.
I don't MIND strong typing, and automatic type conversions can really trip
you up. I rarely have to do explicit type conversions in VHDL; it does
allow a fair bit of automatic stuff. Like, you can assign an integer to a
bit vector without a type conversion.

I've never run into a type conversion that was not already provided by one
of the libraries.
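
A minimal sketch (mine, from memory of the usual idiom) of the kind of
automatic handling ieee.numeric_std gives you: unsigned signals mix with
integer literals in arithmetic and comparisons without explicit
conversion calls.

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity counter_demo is
  port (
    clk  : in  std_logic;
    rst  : in  std_logic;
    done : out std_logic
  );
end entity;

architecture rtl of counter_demo is
  signal count : unsigned(7 downto 0) := (others => '0');
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if rst = '1' then
        count <= (others => '0');
      elsif count = 199 then            -- unsigned compared with an integer
        count <= (others => '0');
      else
        count <= count + 1;             -- unsigned plus an integer
      end if;
    end if;
  end process;

  done <= '1' when count = 199 else '0';
end architecture;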

I did do a stupid, do-nothing-tron project when learning VHDL to find out
how to write up some of the tricky things, like instantiating rows and
columns of my own defined blocks. So, I had a FF with an output multiplexer
and an input decoder that enabled the clock, and then instantiated a row of
10 of them, then 10 rows of those. So, in about 20 lines of VHDL I had 100
FFs with input and output selectors. I thought that was a pretty neat
accomplishment at the time.
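
Something along these lines (a reconstruction from memory, not the
original code): a simple cell replicated with nested generate loops into
a 10 x 10 array, with an address-decoded clock enable per cell and an
output multiplexer over the whole array.

library ieee;
use ieee.std_logic_1164.all;

entity reg_array is
  port (
    clk   : in  std_logic;
    we    : in  std_logic;
    waddr : in  integer range 0 to 99;
    raddr : in  integer range 0 to 99;
    d     : in  std_logic;
    q     : out std_logic
  );
end entity;

architecture rtl of reg_array is
  type ff_matrix_t is array (0 to 9, 0 to 9) of std_logic;
  signal ff : ff_matrix_t := (others => (others => '0'));
begin
  gen_row : for r in 0 to 9 generate
    gen_col : for c in 0 to 9 generate
      -- Input decoder: each cell's clock enable fires only on its own address.
      process (clk)
      begin
        if rising_edge(clk) then
          if we = '1' and waddr = r * 10 + c then
            ff(r, c) <= d;
          end if;
        end if;
      end process;
    end generate;
  end generate;

  -- Output multiplexer over the whole array.
  q <= ff(raddr / 10, raddr mod 10);
end architecture;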


Jon
 
