International Journal For Multidisciplinary Research
E-ISSN: 2582-2160
Various Ways to Recall Cross-modality Perceptual Information in Cognitive Machine
| Author(s) | Eik Fun Khor |
| --- | --- |
| Country | Singapore |
| Abstract | As humans, we actively stimulate our memory through a variety of recalls of information across different modalities in our brains. This variety of recalls enriches our cross-modal recall mechanism and equips our cognitive system to retrieve information for different needs. Inspired by these cognitive capabilities, this paper formulates, implements, and examines different ways of performing cross-modal associative recall of perceptual information in machine cognition. Simulations covering different cases and scenarios are conducted and the recalls examined. Results show that the different recalls possess their own features and properties, which are useful for equipping a cognitive machine with a variety of cognitive recall capabilities for different purposes. |
| Keywords | Machine Perception, Machine Cognition, Associative Memory, Information Retrieval, Associative Recall, Multimodal Perception, Computational Model, Cognitive System |
| Field | Computer > Automation / Robotics |
| Published In | Volume 5, Issue 2, March-April 2023 |
| Published On | 2023-03-17 |
| Cite This | Various Ways to Recall Cross-modality Perceptual Information in Cognitive Machine - Eik Fun Khor - IJFMR Volume 5, Issue 2, March-April 2023. DOI 10.36948/ijfmr.2023.v05i02.1899 |
| DOI | https://doi.org/10.36948/ijfmr.2023.v05i02.1899 |
| Short DOI | https://doi.org/gr2kpr |
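The full text is not reproduced on this page. As a rough, illustrative sketch of what cross-modal associative recall means, the snippet below implements a classic Bidirectional Associative Memory (BAM), which stores pattern pairs from two modalities in one weight matrix and recalls either modality from the other. The data and modality names are invented for the example; this is a textbook model, not the formulation examined in the paper.

```python
import numpy as np

# Illustrative Bidirectional Associative Memory (BAM): hetero-associative
# storage of (visual, auditory) pattern pairs in a single weight matrix,
# allowing recall of either modality from the other. Example data are
# invented; this is a textbook sketch, not the paper's model.

rng = np.random.default_rng(0)

# Three associated bipolar (+1/-1) pattern pairs across two "modalities".
visual = rng.choice([-1, 1], size=(3, 16))   # e.g. coarse visual codes
audio = rng.choice([-1, 1], size=(3, 8))     # e.g. coarse auditory codes

# Hebbian outer-product storage: W = sum_k visual_k (audio_k)^T
W = visual.T @ audio                          # shape (16, 8)

def bipolar(x):
    """Threshold to +1/-1 (ties broken toward +1)."""
    return np.where(x >= 0, 1, -1)

def recall_audio(visual_cue):
    """Cross-modal recall: visual cue -> associated auditory pattern."""
    return bipolar(visual_cue @ W)

def recall_visual(audio_cue):
    """Cross-modal recall: auditory cue -> associated visual pattern."""
    return bipolar(audio_cue @ W.T)

# Cue with a corrupted visual pattern and check the recalled audio pattern.
cue = visual[0].copy()
cue[:3] *= -1                                 # flip a few bits (noise)
print(np.array_equal(recall_audio(cue), audio[0]))  # typically True here
```

With only a few near-orthogonal pairs stored, a noisy cue in either modality typically recovers its stored partner; storing more pairs increases crosstalk and degrades recall.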
A CrossRef DOI is assigned to each research paper published in the journal; the IJFMR DOI prefix is 10.36948/ijfmr.
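Either DOI form resolves through the doi.org redirect service. For instance, a minimal check in Python (assuming the third-party `requests` package is installed):

```python
import requests

# Resolve the paper's DOI via doi.org and print the final landing URL.
resp = requests.head(
    "https://doi.org/10.36948/ijfmr.2023.v05i02.1899",
    allow_redirects=True,
    timeout=10,
)
print(resp.url)  # landing page reached after the DOI redirect
```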
All research papers published on this website are licensed under the Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.