help -- binary to LCD display

dunda

Guest
I need to take an 8-bit binary input and display it on an LCD display
using VHDL. The problem is that the LCD only takes one ASCII character
at a time and displays them in sequence. How do I take the 8-bit binary
number and send individual characters to the LCD? I tried converting
the 8-bit number into its decimal digits and then sending the individual
characters as ASCII codes, but I can't get the conversion right. My
conversion code is below:

library ieee;
use ieee.std_logic_1164.all;
use ieee.std_logic_arith.all;
--use ieee.std_logic_unsigned.all; -- these libraries conflict with ieee.std_logic_arith.all !!!!!
--use ieee.numeric_std.all; -- these libraries conflict with ieee.std_logic_arith.all !!!!!

entity bintodec is
port(
  clock  : in  std_logic;
  dip1   : in  std_logic_vector(7 downto 0);
  output : out std_logic_vector(7 downto 0) -- using to test the outputs
);
end bintodec;

architecture arch of bintodec is

signal var2    : integer range 0 to 255 := 8;
signal var     : integer;
signal varout  : std_logic_vector(7 downto 0);
signal dip1_un : unsigned(7 downto 0);
signal sumof2  : integer;
signal unsnd   : unsigned(7 downto 0);

begin

process(clock)
begin
  for i in 0 to 7 loop
    if dip1(i) = '1' then
      sumof2 <= 1;
      for j in 0 to i loop
        sumof2 <= sumof2 * 2;
      end loop;
      var <= var + sumof2;
    end if;
  end loop;

  unsnd <= conv_unsigned(var2, 8); -- DOESNT WORK!!

  output <= std_logic_vector(to_unsigned(unsnd, 8)); -- COMPILES BUT NO OUTPUT !!
end process;

end arch;

What am I doing wrong?
Thanks
 
dunda wrote:

library ieee;
use ieee.std_logic_1164.all;
use ieee.std_logic_arith.all;
--use ieee.std_logic_unsigned.all; -- these libraries conflict with ieee.std_logic_arith.all !!!!!
--use ieee.numeric_std.all; -- these libraries conflict with ieee.std_logic_arith.all !!!!!
Never use std_logic_arith and std_logic_(un)signed. Use only
numeric_std and you'll solve the "conflicting libraries" problem.
Blame your FPGA vendor for not fixing their docs and their
code-generation tools.
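With only numeric_std the context clause becomes just this (a sketch of the fix, not code from the original post):

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all; -- provides unsigned/signed, to_unsigned, to_integer

numeric_std also gives you unsigned(dip1) and to_integer, which do by themselves what the loop below computes by hand.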

process(clock)
begin
  for i in 0 to 7 loop
    if dip1(i) = '1' then
      sumof2 <= 1;
      for j in 0 to i loop
        sumof2 <= sumof2 * 2;
      end loop;
      var <= var + sumof2;
    end if;
  end loop;

  unsnd <= conv_unsigned(var2, 8); -- DOESNT WORK!!

  output <= std_logic_vector(to_unsigned(unsnd, 8)); -- COMPILES BUT NO OUTPUT !!
You should use variables instead of signals for the intermediate values
in your for loop.
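As a rough, untested sketch of that change (the variable name sum is made up here, and it assumes the numeric_std-only header above):

process(clock)
  variable sum : integer range 0 to 255;
begin
  if rising_edge(clock) then
    sum := 0;
    for i in 0 to 7 loop
      if dip1(i) = '1' then
        sum := sum + 2**i; -- a variable updates immediately, so the sum accumulates
      end if;
    end loop;
    output <= std_logic_vector(to_unsigned(sum, 8));
  end if;
end process;

Note that the loop only recomputes the value dip1 already carries; to_integer(unsigned(dip1)) returns the same integer in one call.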

=-a
 
I cannot see where you use the clock; you just have it in your
sensitivity list. Apart from that, first try to understand how a
display works.
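For what it's worth, a clocked process normally tests the clock edge inside the body, along these lines (a generic sketch, not code from the thread):

process(clock)
begin
  if rising_edge(clock) then
    -- registered (synchronous) logic goes here
  end if;
end process;

Without the rising_edge test the body runs on both clock edges, and synthesis will not infer a register from it.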

Rgds
André
 
Aside from what the others have mentioned, I don't see that var2 is
ever assigned a value, apart from the initializer in "signal var2 .. := 8",
and I'm not sure whether that initializer gets synthesized. Is dip1 a
signed or an unsigned value? When you convert dip1 to be displayed, are
you going to display it as two hex digits (00h - FFh) or as decimal
digits (-1, 255, etc.)? Your output port "output" will only hold a
single ASCII character. Does your LCD have a serial or a parallel
interface? This affects how you send ASCII characters to the LCD.
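For the decimal case, splitting the byte into three ASCII digit codes could look something like this (a sketch; the port names ascii_hundreds, ascii_tens and ascii_ones are made up, and it assumes numeric_std):

process(dip1)
  variable v : integer range 0 to 255;
begin
  v := to_integer(unsigned(dip1));
  -- ASCII '0' is code 48, so 48 + digit yields the digit's character code
  ascii_hundreds <= std_logic_vector(to_unsigned(48 + v / 100, 8));
  ascii_tens     <= std_logic_vector(to_unsigned(48 + (v / 10) mod 10, 8));
  ascii_ones     <= std_logic_vector(to_unsigned(48 + v mod 10, 8));
end process;

A small state machine would then send those three character codes to the LCD one at a time, whichever interface it has.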

-Dave Pollum
 
