
Voice Recognition Acquisition and Storage System using VHDL and the DE1-SoC FPGA

  • Elmer Arellanos
  • Luis Dominguez Remigio
  • Jose Luis Ostos Marquez
  • Moises Nunez

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper presents the design of a voice acquisition and recognition system in VHDL, optimized for the DE1-SoC FPGA. The system combines advanced digital signal processing techniques with classification via the K-Nearest Neighbors (KNN) algorithm. It operates at a 320 kHz sampling rate, with a latency of 10 ns and a power consumption of 1.2 W. The design improves precision and noise rejection through Hamming windowing and integer-based processing. Future work will focus on replacing the KNN algorithm with neural networks to further increase system accuracy.
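The two techniques named in the abstract, integer-based Hamming windowing and KNN classification, can be illustrated with a short software sketch. The snippet below is a hypothetical Python model, not the authors' VHDL design: the window coefficients are quantized to a fixed-point scale and applied with a multiply-and-shift, mirroring what an FPGA datapath without floating-point hardware would do, and the KNN classifier is a plain squared-distance majority vote. All function names, the feature layout, and the Q10 scale factor are illustrative assumptions.

```python
import math

def hamming_window(n: int) -> list[float]:
    # Hamming window coefficients: w[k] = 0.54 - 0.46*cos(2*pi*k/(n-1))
    return [0.54 - 0.46 * math.cos(2 * math.pi * k / (n - 1)) for k in range(n)]

def window_frame_int(frame: list[int], shift: int = 10) -> list[int]:
    # Integer-based windowing: quantize coefficients to a fixed-point scale
    # (2**shift), then multiply and shift right -- an FPGA-friendly pattern
    # that avoids floating-point arithmetic entirely.
    scale = 1 << shift
    coeffs = [round(w * scale) for w in hamming_window(len(frame))]
    return [(s * c) >> shift for s, c in zip(frame, coeffs)]

def knn_classify(sample: list[float],
                 training: list[tuple[list[float], str]],
                 k: int = 3) -> str:
    # KNN: sort training vectors by squared Euclidean distance to the
    # sample, then take a majority vote over the k nearest labels.
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(vec, sample)), label)
        for vec, label in training
    )
    votes: dict[str, int] = {}
    for _, label in dists[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=lambda lbl: votes[lbl])
```

In a hardware pipeline the same structure would appear as a ROM of quantized window coefficients feeding a multiplier, with the distance computation and vote unrolled over the stored feature vectors.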

Original language: English
Title of host publication: Proceedings of the 2024 IEEE 31st International Conference on Electronics, Electrical Engineering and Computing, INTERCON 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350378344
DOIs
State: Published - 2024
Event: 31st IEEE International Conference on Electronics, Electrical Engineering and Computing, INTERCON 2024 - Lima, Peru
Duration: 6 Nov 2024 – 8 Nov 2024

Publication series

Name: Proceedings of the 2024 IEEE 31st International Conference on Electronics, Electrical Engineering and Computing, INTERCON 2024

Conference

Conference: 31st IEEE International Conference on Electronics, Electrical Engineering and Computing, INTERCON 2024
Country/Territory: Peru
City: Lima
Period: 6/11/24 – 8/11/24

Keywords

  • DE1-SoC
  • Digital Signal Processing
  • FPGA
  • K-Nearest Neighbors (KNN)
  • VHDL
  • Voice Recognition
