Curious learner: a generative neuro-symbolic approach for function execution & illustration using natural language
Date: 2024-02
Publisher: Brac University
Author: Joaa, A.F.M. Mohimenul

Abstract
Generative models possess immense potential, but their ability to perform complex
calculations is limited by the need to memorize vast amounts of data, leading to
computational inefficiencies. Delegating computation to tools such as the Arithmetic
Logic Unit through symbolic functions offers a more efficient alternative, enabling
faster responses, smaller model sizes, and improved accuracy. We propose a neuro-symbolic generative
model to empower natural language models with task execution abilities by integrating
functional programming principles. Experiments on four scoped translation
tasks covering 98 mathematical functions demonstrated rapid convergence and minimal
training time requirements. Our model, containing 111 million trainable parameters,
achieved an average accuracy, BLEU score, and perplexity score of 0.85, 0.84,
and 5.9, respectively, after training on a T4 GPU for several hours. This neuro-symbolic
language model shows significant potential for various applications, such
as NLP-based command line tools, customer service automation, service discovery
automation, project code automation, and natural language-based operating systems.
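The core idea of the abstract, a language model that only translates natural language into symbolic function calls, which are then executed exactly rather than computed by the network, can be illustrated with a minimal sketch. The function registry, the `execute` helper, and the call syntax below are hypothetical illustrations, not the thesis's actual interface; in the thesis, 98 mathematical functions play the role of this registry.

```python
import math
import operator

# Hypothetical registry of symbolic functions the model may invoke.
FUNCTIONS = {
    "add": operator.add,
    "mul": operator.mul,
    "sqrt": math.sqrt,
}

def execute(call: str) -> float:
    """Parse a model-generated call like 'add(2, 3)' and run it symbolically.

    The neural model only has to translate natural language into this small
    formal language; the arithmetic itself is exact and requires no
    memorization by the network.
    """
    name, _, rest = call.partition("(")
    args = [float(a) for a in rest.rstrip(")").split(",") if a.strip()]
    return FUNCTIONS[name.strip()](*args)

# e.g. the model translates "what is two plus three?" -> "add(2, 3)"
print(execute("add(2, 3)"))   # 5.0
print(execute("sqrt(16)"))    # 4.0
```

Because the symbolic executor handles the computation, the model's accuracy on arithmetic does not depend on having memorized number facts, which is what permits the small parameter count and short training time reported above.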