Abstract

    Open Access Review Article
    Article ID: TCSIT-8-170

    Neurological Properties to Circumvent AI’s Error Reduction Impasse

    Thaddeus JA Kobylarz* and Erik J Kobylarz

    Our paper proposes significant changes to AI technology. We believe these are necessary because current implementations have stagnated at average error rates of approximately 8%. Implementers hope that further improvements will lower error rates to 5% by 2025; however, this would require on the order of 10^28 floating-point operations, which is not feasible with today's algorithms and computer technology. Even a 5% error rate is excessive for many practical applications.
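    To gauge the scale of that requirement, a rough back-of-the-envelope check (the exascale throughput below is our assumption, not a figure from the paper): at a sustained 10^18 floating-point operations per second, 10^28 operations would take 10^28 / 10^18 = 10^10 seconds, or roughly 317 years of continuous computation on a single such machine.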

    Current AI implementations have produced ignominious errors. AI errors led to the near bankruptcy of a prominent real estate corporation and to the obligatory resignation of an elected government official. The errors that caused these outcomes were ludicrous and would be unlikely to be made by humans. Applications of AI are therefore limited to those for which errors are of negligible consequence.

    In contrast, the human brain's capabilities and efficiency are astonishing. Compared with current AI models, the human brain is impressive in terms of its relatively small size (adult average of 79 in³), weight (approximately 4 lb), and power consumption (nominally 15 W). We believe this implies that AI technology needs to adopt neurological properties that it has so far excluded.

    The current AI neuron model is an overly simplified linear model that was proposed about 70 years ago. We propose emulating the neurological neuron's nonlinear capabilities. The versatility of the improved AI neuron model would be many orders of magnitude beyond that of the currently implemented linear neuron models.
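    To make the contrast concrete, the following is a minimal Python sketch of the two ideas; it is our illustration, not the authors' specific model. The linear neuron applies an activation to a single weighted sum, while the hypothetical nonlinear variant lets groups of inputs interact through their own nonlinearities (loosely inspired by dendritic processing) before the results are combined.

        import math

        def linear_neuron(inputs, weights, bias=0.0):
            # Classic ~70-year-old model: one weighted sum, then an activation.
            s = sum(w * x for w, x in zip(weights, inputs)) + bias
            return 1.0 / (1.0 + math.exp(-s))  # sigmoid output

        def nonlinear_neuron(branches, bias=0.0):
            # Hypothetical nonlinear variant: each "branch" is a list of
            # (weight, input) pairs that is summed and passed through its own
            # nonlinearity before the branch outputs are combined, so groups of
            # inputs can interact nonlinearly rather than only additively.
            branch_outputs = [math.tanh(sum(w * x for w, x in branch)) for branch in branches]
            s = sum(branch_outputs) + bias
            return 1.0 / (1.0 + math.exp(-s))

        print(linear_neuron([0.5, 1.0], [0.8, -0.3]))
        print(nonlinear_neuron([[(0.8, 0.5), (-0.3, 1.0)], [(0.2, 0.7)]]))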

    In addition, the proposed neurological properties include neural plasticity. Specifically, we describe the associative learning aspect of neuroplasticity, partitioning associative plasticity into "inter-association" (changes to neural network structure) and "intra-association" (changes to individual neuron functioning).
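    The following toy Python sketch is only our illustration of this partition (the data structure and update rules are assumptions, not the authors' mechanism): inter-association alters the connection graph between neurons, while intra-association alters a parameter internal to a single neuron.

        # Toy network: each neuron has outgoing connections and an internal gain.
        network = {
            "A": {"out": {"B": 0.4}, "gain": 1.0},
            "B": {"out": {}, "gain": 1.0},
        }

        def inter_association(net, src, dst, weight=0.1):
            # "Inter-association": plasticity of network structure --
            # create or strengthen a connection between two neurons.
            net.setdefault(dst, {"out": {}, "gain": 1.0})
            net[src]["out"][dst] = net[src]["out"].get(dst, 0.0) + weight

        def intra_association(net, neuron, delta=0.1):
            # "Intra-association": plasticity of a neuron's own functioning --
            # modeled here as a change to the neuron's internal gain.
            net[neuron]["gain"] += delta

        inter_association(network, "A", "C")  # the network's structure changes
        intra_association(network, "A")       # neuron A's internal behavior changes
        print(network)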

    Keywords:

    Published on: Sep 21, 2023 Pages: 61-72

    DOI: 10.17352/tcsit.000070