Digital Replacing Analog...

On Monday, August 31, 2020 at 1:26:01 PM UTC-4, Tom Gardner wrote:
On 31/08/20 16:44, Ricketty C wrote:
On Monday, August 31, 2020 at 3:50:40 AM UTC-4, Tom Gardner wrote:
On 31/08/20 03:38, Ricketty C wrote:
On Sunday, August 30, 2020 at 10:18:24 PM UTC-4, Bill Sloman wrote:
On Monday, August 31, 2020 at 10:03:28 AM UTC+10, Ricketty C wrote:
On Sunday, August 30, 2020 at 6:07:04 PM UTC-4, Klaus Kragelund
wrote:
On Sunday, August 30, 2020 at 11:06:23 AM UTC+2, Ricketty C wrote:
On Sunday, August 30, 2020 at 4:45:53 AM UTC-4, Klaus Kragelund
wrote:
On Sunday, August 30, 2020 at 12:35:08 AM UTC+2, Ricketty C
wrote:
On Saturday, August 29, 2020 at 6:20:30 PM UTC-4, Klaus
Kragelund wrote:
On Saturday, August 29, 2020 at 3:32:52 AM UTC+2, Ricketty
C wrote:
On Friday, August 28, 2020 at 7:08:41 PM UTC-4, Klaus
Kragelund wrote:
On Friday, August 28, 2020 at 10:07:35 AM UTC+2,
Ricketty C wrote:
On Friday, August 28, 2020 at 2:23:21 AM UTC-4,
Klaus Kragelund wrote:
On Friday, August 28, 2020 at 4:50:09 AM UTC+2,
Tim Williams wrote:
\"Ricketty C\" <gnuarm.del...@gmail.com> wrote in
message
news:0bd36331-f550-4e8d...@googlegroups.com...

snip

https://www.linkedin.com/pulse/application-iec-62304-amendment-1-2015-europe-georg-heidenreich





Quote

- everything being executed on a PROCESSOR will be considered
SOFTWARE and therefore be under IEC 62304, including software to be
executed on FPGA-processors, signal processors and graphics
boards.

Exactly. There are no processors or software in the FPGA.

This is sophistry.

Only to those who know as little about FPGAs as you do.

Field Programmable Gate Arrays have to be programmed before they can
do anything, and the software that programs them is an essential part
of the design.

In spite of the name, FPGAs are not programmed, they are configured.
They require no software to configure them since they are capable of
loading their own configuration from on-chip flash, non-erasable
memory, or external flash.

For some MCUs exactly the same is true. When they are initialised they load
their configuration from on-chip flash, non-erasable memory, or external
flash.


That's the issue. There is no processor in an FPGA unless the user
designs one for it.

An FSM implemented in gates processes inputs according to a configuration
stored in an external memory (except mask programmed or one time
programmable devices).

An FSM implemented in an MCU processes inputs according to a configuration
stored in an external memory (except mask programmed or one time
programmable devices).

The languages used to specify the configuration are different, but both
require complex compilation to convert from input text to the binary bit
pattern. Compilers are not error free.
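
As a purely illustrative aside (Python, with invented states and inputs, nothing from this thread): the point that an FSM's behaviour is carried entirely by a configuration rather than by anything resembling instructions can be sketched as a transition table that is loaded as data and merely looked up:

# A finite state machine driven purely by a configuration table.
# The "logic" lives in the data, not in the control flow below.
TRANSITIONS = {
    # (state, input) -> (next_state, output)
    ("idle",    "start"): ("running", "motor_on"),
    ("running", "stop"):  ("idle",    "motor_off"),
    ("running", "fault"): ("halted",  "alarm"),
}

def step(state, event):
    """Return (next_state, output) for one input event."""
    return TRANSITIONS.get((state, event), (state, None))

state = "idle"
for event in ["start", "fault", "stop"]:
    state, output = step(state, event)
    print(event, "->", state, output)

The same table could just as well be realised as wired gates, a LUT bitstream, or an instruction sequence; only the implementation technology differs.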

An FSM implemented with 7402 logic is just as much a processor by your
definition, yet it is not included in this category by anyone.

Correct, it is indeed a processor.

That processor implementation technology has its function
immutably cast in copper and doped silicon (i.e. wires and
gates).

It is not a processor in the normal use of the term, which has to do with executing software. You have taken your own argument to the point of absurdity by calling anything digital a "processor".


FPGAs and MCUs (unless mask programmed) have their function cast
in bits in memories which are initialised at power up, and can
(in unusual circumstances) change during operation. In addition
the bits are derived, possibly imperfectly, from an abstract
specification in a sequence of ASCII characters. That derivation
is itself error prone.

While FPGAs have their functionality stored in memory, unless that functionality is a processor the FPGA is not a processor.


I don't know why a few people here want to stretch this point until it
breaks. It is patently absurd to think of general logic in an FPGA being a
processor executing software

They are both a sequence of bits automatically derived from
ASCII characters and stored in memory.

Are you talking about the circuit boards I design??? That clearly applies to the schematic capture, layout and PWB fabrication processes.


Those similarities are fundamental, and very relevant to
reliability and verification.

But not in the way you are saying because the FPGA is not a processor.


any more than logic configured in any other way.
If the logic were designed using CAD tools from an HDL but implemented in SSI
functions we would not be having this discussion.

There would be the question of whether the compilation was correct.

There would be the question of whether the implementation had mutated
during operation due to flipped bits.

SSI logic is designed using automatic tools from ASCII files and uses memory elements that can have bits flipped.


There is nothing unique about FPGAs that makes them "processors".

Correct.

But it is much easier to verify some implementation technologies
than others.

Whatever that means. If the requirements say "processors", that does not apply to FPGAs.

--

Rick C.

-+-- Get 1,000 miles of free Supercharging
-+-- Tesla referral code - https://ts.la/richard11209
 
On Monday, August 31, 2020 at 8:40:47 PM UTC+2, Ricketty C wrote:
snip

While FPGAs have their functionality stored in memory, unless that functionality is a processor the FPGA is not a processor.

It is loading a configuration from a separate memory location, so the procedure is very much like a processor at boot

snip

SSI logic is designed using automatic tools from ASCII files and uses memory elements that can have bits flipped.

Yes, but an SSI design can be reviewed just by looking at it and writing up the logic equations. The same goes for the PCB; you can inspect the design directly.

For an FPGA with 100k config registers, I am guessing you do not sit down and dissect it all.

If the design has flip-flops, bits can be changed, and you would probably need a voting mechanism and a way to do periodic resets.
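
To give a rough idea of what such a voting mechanism and periodic reset amount to, here is a minimal sketch (Python, invented values, not from any real design): triple modular redundancy on one stored bit, with a majority vote on every read and a periodic scrub that rewrites all three copies from the voted value so a single flipped bit cannot persist:

# Triple modular redundancy (TMR) on one stored bit, with periodic scrubbing.
def vote(a, b, c):
    """Majority vote of three bits."""
    return (a & b) | (a & c) | (b & c)

class TmrBit:
    def __init__(self, value=0):
        self.copies = [value, value, value]

    def read(self):
        return vote(*self.copies)

    def scrub(self):
        """Periodic refresh: rewrite all copies from the voted value."""
        v = self.read()
        self.copies = [v, v, v]

bit = TmrBit(1)
bit.copies[0] ^= 1   # a single upset flips one copy
print(bit.read())    # still 1, thanks to the majority vote
bit.scrub()          # scrubbing repairs the corrupted copy
print(bit.copies)    # [1, 1, 1]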

snip

About the tool chains, it is well known that they generate different code from version to version and from chain to chain (Keil, IAR, GCC)

Cheers

Klaus
 
On Monday, August 31, 2020 at 3:42:08 PM UTC-4, Klaus Kragelund wrote:
snip

While FPGAs have their functionality stored in memory, unless that functionality is a processor the FPGA is not a processor.


It is loading a configuration from a separate memory location, so the procedure is very much like a processor at boot

It can be like a processor in many ways and still not be a processor. Copying memory is not a requirement of being a processor.


snip

Yes, but an SSI design can be reviewed just by looking at it and writing up the logic equations. The same goes for the PCB; you can inspect the design directly.

You mean using software to view ASCII text files that describe the schematic? Or using software to view ASCII text files that describe the board layout? I don't know anyone who actually traces the copper on a PWB as the means of verifying the board is designed properly. They use software, lots of wonderful, unqualified software, to view and automatically verify the design. Every layout package I've ever seen provides a check for unrouted nets... which does not work 100% of the time. I've seen software that fails when checking for connections via power pours.
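
For illustration only, here is a rough sketch (Python; the net and pad names are invented) of what an unrouted-net check boils down to: merge pads that the routed copper actually joins, then flag any net whose pads do not all end up in one connected group:

# A toy "unrouted net" check: union-find over pads joined by copper.
def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def unrouted_nets(netlist, copper):
    """netlist: net -> list of pads; copper: (pad, pad) pairs joined by routing."""
    parent = {pad: pad for pads in netlist.values() for pad in pads}
    for a, b in copper:
        parent[find(parent, a)] = find(parent, b)
    bad = []
    for net, pads in netlist.items():
        roots = {find(parent, p) for p in pads}
        if len(roots) > 1:          # pads of this net are not all connected
            bad.append(net)
    return bad

nets = {"VCC": ["U1.8", "C1.1"], "GND": ["U1.4", "C1.2", "J1.2"]}
routed = [("U1.8", "C1.1"), ("U1.4", "C1.2")]   # J1.2 left unrouted
print(unrouted_nets(nets, routed))              # ['GND']

A power-pour connection that the tool fails to model is the kind of copper join that never makes it into the second argument, which is one way such a check can report the wrong answer.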


> For an FPGA with 100k config registers, I am guessing you do not sit down and dissect it all

I verify my designs in simulation using software. Yup, these days you can't even create an installation procedure without software. It was a faulty installation procedure that caused a head gasket to blow on one of the generators of the local nuclear plant when the plant was scrammed during an earthquake. It was only chance that the other generators did not fail.

Was the single point of failure the software used to produce the installation procedure? I don't know. They've never said what was wrong or why it was wrong. But the software did nothing to prevent the problem.

The issues you are talking about with FPGAs only need to be less likely than the human errors that are behind most accidents. But regardless, none of this makes an FPGA into a processor unless you design a processor into it.


> If the design has flip-flops, bits can be changed, and you would probably need a voting mechanism and a way to do periodic resets

Or not. Are you suggesting you need three FFs and voting logic for every FF in an SSI-level circuit?


snip


About the tool chains, it is well known that they generate different code from version to version and from chain to chain (Keil, IAR, GCC)

Yup. Even different revisions of the same tools produce different outputs. The same is true for FPGAs as well. Different revisions of KiCAD will produce different files for the same schematic.

None of this is relevant to the point of FPGAs not being processors... they aren't.

--

Rick C.

-+-+ Get 1,000 miles of free Supercharging
-+-+ Tesla referral code - https://ts.la/richard11209
 
On Monday, August 31, 2020 at 9:42:08 PM UTC+2, Klaus Kragelund wrote:
snip

If the design has flip-flops, bits can be changed, and you would probably need a voting mechanism and a way to do periodic resets.

I've seen the same guy in the article you posted, and others, say that if all possible outputs can be verified by brute-force testing of all combinations of inputs, it need not be treated as software.
 
On Monday, August 31, 2020 at 6:37:11 PM UTC-4, Lasse Langwadt Christensen wrote:
On Monday, August 31, 2020 at 9:42:08 PM UTC+2, Klaus Kragelund wrote:

If the design has flip-flops, bits can be changed, and you would probably need a voting mechanism and a way to do periodic resets.


I've seen the same guy in the article you posted, and others, say that if all possible outputs can be verified by brute-force testing of all combinations of inputs, it need not be treated as software.

If the design contains sequential logic, that gets a lot harder. I assume that "all possible" combinations don't need to be tested, only the ones that are connected, since "all possible" would also need to include every combination of FF states whether or not they are intended.

Back some years ago it was a hot research topic to figure out how to generate test vectors for designs and how to determine coverage and testability. For a complex design it can be a complex issue, but not the same thing as verifying a design does what you have stated in the requirements. Even harder is to verify the design does what you desire.
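
As a rough illustration of the difference (Python; the block under test and the counts are invented): a purely combinational block can be checked against a reference model over every input combination, but adding flip-flops turns the same brute-force idea into 2^(inputs + FFs) cases, and even that says nothing about which of those states are actually reachable:

# Exhaustive check of a combinational block against a reference model.
from itertools import product

def dut(a, b, c):                 # invented device under test
    return (a and b) or (not c)

def reference(a, b, c):           # the requirement as an executable model
    return (a and b) or (not c)

inputs = list(product([False, True], repeat=3))
assert all(dut(*v) == reference(*v) for v in inputs)
print(len(inputs), "input combinations checked")        # 8

# With n inputs and m flip-flops, the same brute-force idea needs
# 2**(n + m) (state, input) pairs to be driven and observed.
n, m = 8, 16
print(f"{2**(n + m):,} cases for n=8 inputs and m=16 FFs")   # 16,777,216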

--

Rick C.

-++- Get 1,000 miles of free Supercharging
-++- Tesla referral code - https://ts.la/richard11209
 
On 31/08/20 23:26, Ricketty C wrote:
On Monday, August 31, 2020 at 3:42:08 PM UTC-4, Klaus Kragelund wrote:
For an FPGA with 100k config registers, I am guessing you do not sit down
and dissect it all

I verify my designs in simulation using software.

And there's /another/ source of problems.

Does the simulator you use model inertial delays or transport delays?

I'm sensitive to that since almost 40 years ago, before FPGAs existed,
I created an implementation. It appeared to simulate correctly in an
inertial delay simulator (HiLo), but a transport delay simulator (Tegas)
correctly showed a problem.

Spotting that infelicity saved three months of turnaround time and a year's
salary (it was a semi-custom design).

Those details don't matter; what matters is that the correctness of
the simulator has to be questioned - and answered.
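
For readers unfamiliar with the distinction, here is a much-simplified behavioural sketch (Python; a toy model, not how HiLo or Tegas actually worked): a transport delay passes every input change shifted in time, while an inertial delay swallows pulses shorter than the delay, so the same glitch can be invisible under one model and visible under the other:

# Transport vs inertial delay on a sampled waveform (one time unit per sample).
def transport_delay(wave, d):
    """Every edge is passed through, just shifted by d samples."""
    return [0] * d + wave[:len(wave) - d]

def inertial_delay(wave, d):
    """Output follows the input only once it has held a new value for d samples,
    so pulses narrower than d are swallowed (a crude inertial-delay model)."""
    out, pending, count, current = [], wave[0], 0, wave[0]
    for v in wave:
        if v == pending:
            count += 1
        else:
            pending, count = v, 1
        if count >= d:
            current = pending
        out.append(current)
    return out

glitch = [0, 0, 1, 0, 0, 0, 0, 0]    # a one-unit glitch
print(transport_delay(glitch, 2))    # [0, 0, 0, 0, 1, 0, 0, 0]  glitch survives
print(inertial_delay(glitch, 2))     # [0, 0, 0, 0, 0, 0, 0, 0]  glitch filtered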
 
On 01/09/20 00:25, Ricketty C wrote:
On Monday, August 31, 2020 at 6:37:11 PM UTC-4, Lasse Langwadt Christensen
wrote:
On Monday, August 31, 2020 at 9:42:08 PM UTC+2, Klaus Kragelund wrote:

If the design has flip-flops, bits can be changed, and you would probably
need a voting mechanism and a way to do periodic resets


I've seen the same guy in the article you posted, and others, say that if
all possible outputs can be verified by brute-force testing of all combinations
of inputs, it need not be treated as software

If the design contains sequential logic, that gets a lot harder. I assume
that "all possible" combinations don't need to be tested, only the ones that
are connected, since "all possible" would also need to include every
combination of FF states whether or not they are intended.

Back some years ago it was a hot research topic to figure out how to generate
test vectors for designs and how to determine coverage and testability.

And also to determine which manufacturing defects should be tested.
It used to be nodes stuck-at-1 and stuck-at-0; I don't know what the
latest models are.

FPGAs are easier than hardwired logic, since you only need to test
the generic "uncommitted" LUT and interconnections, not the specific
ones in the customer's design.
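
As a rough sketch of the stuck-at idea (Python, with an invented two-gate circuit): a fault simulator forces one node to 0 or 1, and a test vector covers that fault if it makes the faulty output differ from the fault-free output:

# Single stuck-at fault simulation on a tiny invented circuit: y = (a AND b) OR c.
from itertools import product

def circuit(a, b, c, stuck=None):
    """stuck = (node, value) forces one node; None means fault-free."""
    nodes = {"a": a, "b": b, "c": c}
    if stuck and stuck[0] in nodes:
        nodes[stuck[0]] = stuck[1]
    n1 = nodes["a"] & nodes["b"]
    if stuck and stuck[0] == "n1":
        n1 = stuck[1]
    return n1 | nodes["c"]

vectors = list(product([0, 1], repeat=3))
for fault in [(n, v) for n in ("a", "b", "c", "n1") for v in (0, 1)]:
    covered = [v for v in vectors if circuit(*v) != circuit(*v, stuck=fault)]
    print(fault, "detected by", covered)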


For a complex design it can be a complex issue, but not the same thing as
verifying a design does what you have stated in the requirements.

It is entirely different, which beginners often don't appreciate.


Even harder is to verify the design does what you desire.

That's design validation, which is yet another thing.

\"Do the right thing\"
\"Do the thing right\"
\"Duplicate the thing exactly\"
 
On Monday, August 31, 2020 at 7:29:25 PM UTC-4, Tom Gardner wrote:
snip

Does the simulator you use model inertial delays or transport delays?

I'm sensitive to that since almost 40 years ago, before FPGAs existed, I created an implementation. It appeared to simulate correctly in an inertial delay simulator (HiLo), but a transport delay simulator (Tegas) correctly showed a problem.

What were you doing that the transport and inertial delays mattered? My understanding is that these are issues in logic simulators... where the standard assumption is that all logic is perfectly fast and only the logic is being verified before any synthesis is done. There are other tools to verify the actual timing of the implementation as a prelude to testing in the lab.

I did meet a designer giving a presentation at work who claimed that timing was best verified by timing simulations. I asked how he proposed designing the test bench to verify all paths through all logic and his reply was something equivalent to "very carefully".

--

Rick C.

-+++ Get 1,000 miles of free Supercharging
-+++ Tesla referral code - https://ts.la/richard11209
 
Ricketty C wrote:

> None of this is relevant to the point of FPGAs not being processors... they aren't.

Who exactly are you arguing with? With Klaus or the Approval Body? How
is this going to help with the latter? Will you be able to outrant them
and get a medical device approval? If so, best of British luck!

Best regards, Piotr
 
On 01/09/20 05:55, Ricketty C wrote:
snip

What were you doing that the transport and inertial delays mattered?

That would take too long to explain, and the precise details are
somewhat hazy.

The core point was that an internal dynamic hazard glitch was
theoretically possible, even in a nominally fully synchronous
design.

Once that possibility was recognised, I corrected my design fault.


My understanding is that these are issues in logic simulators... where the standard assumption is that all logic is perfectly fast and only the logic is being verified before any synthesis is done.

I doubt that, and I'm surprised someone with your experience
asks the question. If that was the case:
- how could the simulator indicate a pre-P&R or post-P&R max clock speed?
- how could the P&R choose which paths to optimise?


There are other tools to verify the actual timing of the implementation as a prelude to testing in the lab.

In FPGAs and similar I've only seen digital logic simulators that
include the timing. I don't see much use for a digital logic simulator
that doesn't.
 
On Tuesday, September 1, 2020 at 4:28:17 AM UTC-4, Tom Gardner wrote:
snip

That would take too long to explain, and the precise details are
somewhat hazy.

The core point was that an internal dynamic hazard glitch was
theoretically possible, even in a nominally fully synchronous
design.

Once that possibility was recognised, I corrected my design fault.

Without more details I can't evaluate your issue. But I don't know of any tools that are perfect. So what is the point of this line of thought exactly?


My understanding is that these are issues in logic simulators... where the standard assumption is that all logic is perfectly fast and only the logic is being verified before any synthesis is done.

I doubt that, and I'm surprised someone with your experience asks the question. If that was the case:
- how could the simulator indicate a pre-P&R or post-P&R max clock speed?
- how could the P&R choose which paths to optimise?

Asks what question??? A simulator is not used to measure clock speeds. Anything pre-P&R is just a wild guess. After P&R it is a worst case analysis... that's not always right. In particular I worked on a project where they knew the tools were broken. So we had to test the design on the bench with a heater to try to make it fail... sometimes when the timing wasn't quite bad enough to fail at room temperature. We generated multiple trials every night and tested the best one until we got one to work no matter what we did to it.


There are other tools to verify the actual timing of the implementation as a prelude to testing in the lab.

In FPGAs and similar I've only seen digital logic simulators that include
the timing. I don't see much use for a digital logic simulator that
doesn't.

I have no idea why you say that. If you are using a simulator to analyze your timing, I can understand why you think FPGA design is a PITA. Timing is analyzed by a... timing analysis tool. It's a matter of adding up all the many combinations of paths and comparing them to the timing requirements. It's an exhaustive search, and either all paths pass or you get errors that it is up to you to figure out how to fix.

It would be nearly impossible to cover all the paths in a simulation, and it would require a special tool to analyze how much of the design was covered by any given test.

Maybe by timing, you mean they tick off the time that is marked by the clocks in the system. Yeah, they do that, but there are no delays in the signal paths unless you go to great pains to add them. Before place and route there is no accurate timing information to work with.

Use the right tool for the right job. Use simulation to verify the logic is correct. Then synthesize the design and go to work on the timing verification.
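
As a rough sketch of what such a timing analysis tool does (Python; the netlist and delays are invented): treat the post-P&R netlist as a delay-annotated DAG, compute the worst-case arrival time at every node, and compare each register input against the clock period minus setup:

# Toy static timing analysis: longest-path arrival times on a delay-annotated DAG.
# Edges are (source, sink, delay_ns); every name and number here is invented.
edges = [
    ("FF1.Q", "LUT_A", 0.6), ("LUT_A", "LUT_B", 0.9),
    ("FF2.Q", "LUT_B", 0.7), ("LUT_B", "FF3.D", 1.1),
    ("LUT_A", "FF4.D", 0.8),
]
CLOCK_PERIOD_NS = 3.0
SETUP_NS = 0.2

nodes = {n for e in edges for n in e[:2]}
arrival = {n: 0.0 for n in nodes}
for _ in range(len(nodes)):          # relax edges repeatedly; fine for a small acyclic net
    for src, dst, d in edges:
        arrival[dst] = max(arrival[dst], arrival[src] + d)

for endpoint in sorted(n for n in nodes if n.endswith(".D")):
    slack = CLOCK_PERIOD_NS - SETUP_NS - arrival[endpoint]
    print(f"{endpoint}: arrival {arrival[endpoint]:.1f} ns, slack {slack:+.1f} ns",
          "OK" if slack >= 0 else "VIOLATION")

Either every endpoint has non-negative slack or the tool lists the failing paths; no test bench or stimulus is involved, which is the difference from trying to hit the worst-case path in simulation.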

--

Rick C.

+--- Get 1,000 miles of free Supercharging
+--- Tesla referral code - https://ts.la/richard11209
 
On 01/09/20 10:22, Ricketty C wrote:
snip

Without more details I can't evaluate your issue.

What makes you think your evaluating an issue that occurred
in ~1983 would be beneficial?


But I don't know of any tools that are perfect. So what is the point of this
line of thought exactly?

I refer you to the first line of my response dated
On Monday, August 31, 2020 at 7:29:25


snip

Asks what question???

I should have said "makes that statement/assertion".


A simulator is not used to measure clock speeds. Anything pre-P&R is just a wild guess. After P&R it is a worst case analysis... that's not always right.

So you recognise that "the standard assumption is that all
logic is perfectly fast" isn't actually the case.

Good.

It is difficult (and pointless) to assess what you might mean
by "wild".


In particular I worked on a project where they knew the tools were broken. So we had to test the design on the bench with a heater to try to make it fail... sometimes when the timing wasn't quite bad enough to fail at room temperature. We generated multiple trials every night and tested the best one until we got one to work no matter what we did to it.

And what could you /guarantee/ after that rather suboptimal process?



snip

I have no idea why you say that. If you are using a simulator to analyze your timing, I can understand why you think FPGA design is a PITA.

What makes you assert that I think FPGA design is a PITA?
 
On Tuesday, September 1, 2020 at 5:42:26 AM UTC-4, Tom Gardner wrote:
snip

Asks what question???

I should have said "makes that statement/assertion".

We seem to be making no headway. I don't know what experience you actually have with digital design, but if you don't understand anything I am talking about we have no common ground to continue this discussion.


snip

It is difficult (and pointless) to assess what you might mean by "wild".

Yes, I think if you don't know what is meant by a "wild guess" then we truly have no common ground for discussion.


snip

And what could you /guarantee/ after that rather suboptimal process?

There are NO guarantees in life. We shipped product. This was a company that provided test equipment to the majority of the telecom manufacturers in the world.


snip

What makes you assert that I think FPGA design is a PITA?

So you don't deny it?

I can't say for sure what you are doing, but if you are analyzing timing through simulation, I'm pretty sure you have lots of timing bugs, or you spend *way* too much time in simulation looking for them, or both.

I thought I'd explained the preferred process pretty clearly. If you have any questions, feel free to ask.

--

Rick C.

+--+ Get 1,000 miles of free Supercharging
+--+ Tesla referral code - https://ts.la/richard11209
 
On Tuesday, September 1, 2020 at 12:06:15 PM UTC+2, Ricketty C wrote:
snip

We seem to be making no headway. I don't know what experience you actually have with digital design, but if you don't understand anything I am talking about we have no common ground to continue this discussion.

Correct, no progress

I suggest you approach the Approval Body, as Piotr suggested.

You should do that anyway; there is no point in digging too deep before you know what you need to comply with. My guess is that you will have an eye-opening experience :)

Cheers

Klaus
 
On 01/09/20 11:29, Klaus Kragelund wrote:
snip

You should do that anyway; there is no point in digging too deep before you know what you need to comply with. My guess is that you will have an eye-opening experience :)

Just so, and I suspect the result will be a "stomach sinking"
or "bowel opening" experience :)
 
On 31/08/20 03:38, Ricketty C wrote:
snip


More importantly, what you are failing to acknowledge is that not every
aspect of the device has to be dealt with in the same way. Not every
aspect of the design is safety critical. A risk assessment is part of the
process.

And if you think that the string of text that programs the FPGA isn't
software, you don't seem to be in close enough touch with reality to be a
particularly reliable risk assessor.

The schematics I use to design my PCBs are just text as well. Yet no one
claims the board of analog circuitry is a processor.

That's the issue. There is no processor in an FPGA unless the user designs
one for it.

This isn't rocket science. A processor has a definition. Look it up. If
the FPGA is a processor, it's not very Turing complete. While an FPGA can be
configured to be a processor, very few of them are. If they aren't
configured to be a processor, they aren't processors.

Please stop being silly about this.

This is a neat and commercially important example of your limited
imagination and understanding: the HP 94332 Nanoprocessor.

Given its characteristics, do you think it is a processor or not?




16 8-bit registers
Vectored interrupts 700ns
Instruction cycle time 400ns
42 8-bit instructions
11 bit PC
but
no ALU
no arithmetic other than increment/decrement

\"The Nanoprocessor was more of a state-machine controller than a microprocessor.\"

\"This code is from the interrupt handler that increases the time
and date every second. The code below determines
....
the corresponding opcode, and my description of the instruction.
....
This code demonstrates that even though a processor without
addition sounds useless, the Nanoprocessor\'s bit operations and
increment/decrement allow more computation than you\'d expect.
It also shows that Nanoprocessor code is compact and efficient.
Many things can be done in a single byte (such as bit test and
skip) that would take multiple bytes on other processors.
The Nanoprocessor\'s large register file also avoids much of the
tedious shuffling of data back and forth often required in other
processors. Although some call the Nanoprocessor more of a
state machine controller than a microprocessor, that understates
the capabilities and role of the Nanoprocessor.\"

http://www.cpushack.com/2020/08/09/the-forgotten-ones-hp-nanoprocessor/
http://www.righto.com/2020/09/inside-hp-nanoprocessor-high-speed.html
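
As a small illustration of the point about computing without an adder (Python, not code from either article): addition, and a time-of-day counter like the one described, can be built from nothing more than increment, decrement and comparison:

# Addition built only from increment, decrement and a zero test,
# in the spirit of a processor with no ALU.
def inc(x): return x + 1
def dec(x): return x - 1

def add(a, b):
    """Move one count at a time from b to a until b reaches zero (b >= 0)."""
    while b != 0:
        a, b = inc(a), dec(b)
    return a

def tick_clock(h, m, s):
    """One-second tick of an HH:MM:SS clock using only inc and compare."""
    s = inc(s)
    if s == 60:
        s, m = 0, inc(m)
    if m == 60:
        m, h = 0, inc(h)
    if h == 24:
        h = 0
    return h, m, s

print(add(23, 19))             # 42
print(tick_clock(23, 59, 59))  # (0, 0, 0)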
 
On Thursday, September 3, 2020 at 4:48:48 AM UTC-4, Tom Gardner wrote:
On 31/08/20 03:38, Ricketty C wrote:
On Sunday, August 30, 2020 at 10:18:24 PM UTC-4, Bill Sloman wrote:
On Monday, August 31, 2020 at 10:03:28 AM UTC+10, Ricketty C wrote:
On Sunday, August 30, 2020 at 6:07:04 PM UTC-4, Klaus Kragelund wrote:
On Sunday, August 30, 2020 at 11:06:23 AM UTC+2, Ricketty C wrote:
On Sunday, August 30, 2020 at 4:45:53 AM UTC-4, Klaus Kragelund
wrote:
On Sunday, August 30, 2020 at 12:35:08 AM UTC+2, Ricketty C wrote:
On Saturday, August 29, 2020 at 6:20:30 PM UTC-4, Klaus Kragelund
wrote:
On Saturday, August 29, 2020 at 3:32:52 AM UTC+2, Ricketty C
wrote:
On Friday, August 28, 2020 at 7:08:41 PM UTC-4, Klaus
Kragelund wrote:
On Friday, August 28, 2020 at 10:07:35 AM UTC+2, Ricketty C
wrote:
On Friday, August 28, 2020 at 2:23:21 AM UTC-4, Klaus
Kragelund wrote:
On Friday, August 28, 2020 at 4:50:09 AM UTC+2, Tim
Williams wrote:
\"Ricketty C\" <gnuarm.del...@gmail.com> wrote in
message
news:0bd36331-f550-4e8d...@googlegroups.com...

snip

https://www.linkedin.com/pulse/application-iec-62304-amendment-1-2015-europe-georg-heidenreich



Quote

- everything being executed on a PROCESSOR will be considered SOFTWARE
and therefore be under IEC 62304, including software to be executed on
FPGA-processors, signal processors and graphics boards.

Exactly. There are no processors or software in the FPGA.

This is sophistry.

Only to those who know as little about FPGAs as you do.

Field Programmable Gate Arrays have to be programmed before they can do
anything, and the software that programs them is an essential part of the
design.

In spite of the name, FPGAs are not programmed, they are configured. They
require no software to configure them since they are capable of loading their
own configuration either from on chip Flash, non-erasable memory or external
flash.


More importantly, what you are failing to acknowledge is that not every
aspect of the device has to be dealt with in the same way. Not every
aspect of the design is safety critical. A risk assessment is part of the
process.

And if you think that the string of text that programs the FPGA isn\'t
software, you don\'t seem to be in close enough touch with reality to be a
particularly reliable risk assessor.

The schematic I use to design my PCBs are just text as well. Yet no one
claims the board of analog circuitry is a processor.

That\'s the issue. There is no processor in an FPGA unless the user designs
one for it.

This isn\'t rocket science. A processor has a definition. Look it up. If
the FPGA is a processor, it\'s not very Turing complete. While an FPGA can be
configured to be a processor, very few of them are. If they aren\'t
configured to be a processor, they aren\'t processors.

Please stop being silly about this.

This is a neat and commercially important example of your limited
imagination and understanding: the HP 94332 Nanoprocessor.

Given its characteristics, do you think it is a processor or not?

16 8-bit registers
Vectored interrupts 700ns
Instruction cycle time 400ns
42 8-bit instructions
11 bit PC
but
no ALU
no arithmetic other than increment/decrement

\"The Nanoprocessor was more of a state-machine controller than a microprocessor.\"

\"This code is from the interrupt handler that increases the time
and date every second. The code below determines
...
the corresponding opcode, and my description of the instruction.
...
This code demonstrates that even though a processor without
addition sounds useless, the Nanoprocessor\'s bit operations and
increment/decrement allow more computation than you\'d expect.
It also shows that Nanoprocessor code is compact and efficient.
Many things can be done in a single byte (such as bit test and
skip) that would take multiple bytes on other processors.
The Nanoprocessor\'s large register file also avoids much of the
tedious shuffling of data back and forth often required in other
processors. Although some call the Nanoprocessor more of a
state machine controller than a microprocessor, that understates
the capabilities and role of the Nanoprocessor.\"

http://www.cpushack.com/2020/08/09/the-forgotten-ones-hp-nanoprocessor/
http://www.righto.com/2020/09/inside-hp-nanoprocessor-high-speed.html

So we are attempting to define \"processor\" by example? Yes, I consider this to be a processor because it executes a program stored in memory, generated from code whose only purpose is to specify the instructions.

But this is not about what I consider to be a processor. The issue is what the standard considers to be a processor, no?

I suppose the standard is flexible enough to give you the rope to hang yourself. It sounds like much of adhering to the spec requires the developers to make their own definitions and to justify them.

A lot of people are criticizing this project as unrealistic given the difficulty of getting medical equipment approved. I don\'t disagree with that assessment. But that doesn\'t mean I should abandon the effort rather than putting time and interest into doing as good a job as I can. Six weeks ago I was the one trying to get them to write requirements for the hardware in addition to the software, in the face of severe \"we don\'t have time to do it right\" resistance.

At this point I am doing most of the electronic design and am writing a requirements document for the FPGA, which I won\'t start coding against until it is complete and the stakeholders have signed off on it. The project leader does not understand measuring twice so we can cut once. You would not believe the number and severity of issues with the first revs of the control board.

I was about to exit the project because I thought my role was about done. Now it looks like I\'m the lead hardware designer and my biggest problem will be debugging the thing without equipment. At least I have a good relationship with the software guys. If I get my hands on a machine I\'ll be installing a Forth interpreter so I can get it to do what I need.

Right now I\'m trying to figure out how to calibrate the sensors in the machine\'s use cases. I think it will need two calibration modes. One calibrates the offset of the various sensors, which can be done without special equipment. The other fully calibrates the O2 sensor at normal atmospheric O2 content and at 100% O2. Since that requires an O2 supply, it is a separate calibration. It\'s needed because the O2 sensor has a limited lifetime and fails over a 2 to 4 week period, so this calibration should be done at least every two weeks, ideally every week. The offset cal is easier and only takes a minute, so it should be done every day.
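
For the O2 cal, a back-of-the-envelope sketch of the math (names and numbers are hypothetical, and it assumes the sensor is reasonably linear over its range): the two-point cal takes one reading in room air at about 20.9% O2 and one at 100% O2, and the gain and offset fall straight out of those two points.

#include <stdio.h>

/* Two-point calibration sketch for a roughly linear O2 sensor.
 * All names and numbers are illustrative, not from the real design. */
typedef struct {
    double gain;    /* percent O2 per ADC count      */
    double offset;  /* percent O2 at zero ADC counts */
} o2_cal_t;

/* Derive gain and offset from readings taken in room air (~20.9% O2)
 * and in pure O2 (100%).                                             */
static o2_cal_t o2_calibrate(double counts_air, double counts_pure)
{
    o2_cal_t cal;
    cal.gain   = (100.0 - 20.9) / (counts_pure - counts_air);
    cal.offset = 20.9 - cal.gain * counts_air;
    return cal;
}

static double o2_percent(o2_cal_t cal, double counts)
{
    return cal.gain * counts + cal.offset;
}

int main(void)
{
    /* e.g. 2090 counts in room air, 10000 counts in pure O2 */
    o2_cal_t cal = o2_calibrate(2090.0, 10000.0);
    printf("%.1f %% O2\n", o2_percent(cal, 6000.0));    /* 60.0 */
    return 0;
}

The daily offset-only cal would then presumably just re-measure the zero/ambient point and update the offset while leaving the gain alone, which is why it can be done without the O2 supply.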

--

Rick C.

+-+- Get 1,000 miles of free Supercharging
+-+- Tesla referral code - https://ts.la/richard11209
 
On 03/09/20 16:22, Ricketty C wrote:

snip


You really should consider taking up Piotr\'s/Anthony\'s/my suggestion of
approaching the Approval Body.

But hey, knock yourself out!
 
