dunda
Guest
I need to take an 8-bit binary input and display it on an LCD display
using VHDL. The problem is that the LCD only takes one ASCII character
at a time and displays them in sequence. How do I take the 8-bit binary
number and send individual characters to the LCD? I tried converting
the 8-bit number into decimal and then sending the individual digits
back as ASCII characters, but I can't get the conversion right. My
conversion code is below:
library ieee;
use ieee.std_logic_1164.all;
use ieee.std_logic_arith.all;
--use ieee.std_logic_unsigned.all; -- conflicts with ieee.std_logic_arith.all !!!!!
--use ieee.numeric_std.all;        -- conflicts with ieee.std_logic_arith.all !!!!!

entity bintodec is
    PORT(
        clock  : in  std_logic;
        dip1   : in  std_logic_vector(7 downto 0);
        output : out std_logic_vector(7 downto 0) -- using to test the outputs
    );
end bintodec;

ARCHITECTURE arch of bintodec is
    signal var2    : integer range 0 to 255 := 8;
    signal var     : integer;
    signal varout  : std_logic_vector(7 downto 0);
    signal dip1_un : unsigned(7 downto 0);
    signal sumof2  : integer;
    signal unsnd   : unsigned(7 downto 0);
BEGIN
    process(clock)
    begin
        for i in 0 to 7 loop
            if dip1(i) = '1' then
                sumof2 <= 1;
                for j in 0 to i loop
                    sumof2 <= sumof2 * 2;
                end loop;
                var <= var + sumof2;
            end if;
        end loop;
        unsnd  <= conv_UNSIGNED(var2, 8); -- DOESNT WORK!!
        output <= std_logic_vector(to_unsigned(unsnd, 8)); -- COMPILES BUT NO OUTPUT !!
    end process;
end arch;
What am I doing wrong?
Thanks
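For context, the conversion being attempted here — reading the 8-bit input as an unsigned number and splitting it into decimal digits offset by ASCII '0' (character code 48) — can be sketched with ieee.numeric_std alone, with no loop and no mixed arithmetic packages. This is only a sketch under that assumption; the entity and port names below are illustrative, not from the post:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all; -- use this package alone; do not mix with std_logic_arith

entity bin_to_ascii is
    port(
        clock    : in  std_logic;
        dip1     : in  std_logic_vector(7 downto 0);
        hundreds : out std_logic_vector(7 downto 0); -- ASCII '0'..'2'
        tens     : out std_logic_vector(7 downto 0); -- ASCII '0'..'9'
        ones     : out std_logic_vector(7 downto 0)  -- ASCII '0'..'9'
    );
end bin_to_ascii;

architecture arch of bin_to_ascii is
begin
    process(clock)
        variable n : integer range 0 to 255;
    begin
        if rising_edge(clock) then
            -- interpret the input vector as an unsigned binary number
            n := to_integer(unsigned(dip1));
            -- extract each decimal digit, then add 48 (the ASCII code of '0')
            hundreds <= std_logic_vector(to_unsigned(n / 100 + 48, 8));
            tens     <= std_logic_vector(to_unsigned((n / 10) mod 10 + 48, 8));
            ones     <= std_logic_vector(to_unsigned(n mod 10 + 48, 8));
        end if;
    end process;
end arch;
```

The three outputs can then be sent to the LCD one at a time in sequence, since each already holds a printable ASCII byte.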