Definitions

AI Attention - A mechanism that lets a neural network weigh the relevance of different parts of its input when computing each part of its output, which allows it to learn long-range dependencies in text. Self-attention is the variant in which a sequence attends to itself. Attention and self-attention are key components of many natural language processing models, including BERT, RoBERTa, GPT, and Transformer-XL.
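
A minimal NumPy sketch of scaled dot-product self-attention, the computation at the heart of this mechanism. The toy dimensions and random projection matrices are illustrative assumptions, not values from any particular model:

```python
# A minimal sketch of scaled dot-product self-attention in NumPy.
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model) token embeddings."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v             # project to queries/keys/values
    scores = q @ k.T / np.sqrt(k.shape[-1])         # similarity of every token pair
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ v                              # weighted sum of values

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                         # 4 tokens, 8-dim embeddings
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)       # (4, 8)
```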

AI Encoder - The part of a neural network that is responsible for encoding the input into a representation that the decoder can use. Encoders are typically used in natural language processing models for tasks such as machine translation and text summarization (see the encoder-decoder sketch after the decoder entry below).

AI Decoder - The part of a neural network that is responsible for decoding the encoder's representation into the output. Decoders are typically used in natural language processing models for tasks such as machine translation and text summarization.
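
A minimal PyTorch sketch of an encoder-decoder pair, illustrating both this entry and the encoder entry above; the layer sizes and random inputs are illustrative assumptions:

```python
# A minimal encoder-decoder sketch in PyTorch; sizes are illustrative.
import torch
import torch.nn as nn

d_model = 32
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True),
    num_layers=2,
)
decoder = nn.TransformerDecoder(
    nn.TransformerDecoderLayer(d_model=d_model, nhead=4, batch_first=True),
    num_layers=2,
)

src = torch.randn(1, 10, d_model)  # source sequence (e.g. sentence to translate)
tgt = torch.randn(1, 7, d_model)   # target sequence generated so far
memory = encoder(src)              # encoder builds a representation of the input
out = decoder(tgt, memory)         # decoder conditions on that representation
print(out.shape)                   # torch.Size([1, 7, 32])
```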

AI Pretraining - A process of training a neural network on a large dataset of text. Pretraining helps neural networks to learn general knowledge about language, which can then be used to improve performance on a variety of natural language processing tasks.
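
A minimal sketch of one pretraining step, assuming the common next-word-prediction objective; the toy vocabulary and stand-in model are illustrative, not a real language model:

```python
# One step of next-token pretraining on toy data; everything here is illustrative.
import torch
import torch.nn as nn

vocab_size, d_model = 100, 16
model = nn.Sequential(nn.Embedding(vocab_size, d_model),
                      nn.Linear(d_model, vocab_size))  # toy stand-in for a real LM
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

tokens = torch.randint(0, vocab_size, (1, 9))    # a "sentence" of token ids
inputs, targets = tokens[:, :-1], tokens[:, 1:]  # predict each next token

logits = model(inputs)                           # (1, 8, vocab_size)
loss = nn.functional.cross_entropy(logits.reshape(-1, vocab_size),
                                   targets.reshape(-1))
loss.backward()
optimizer.step()
print(loss.item())
```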

AI Transformer - A type of neural network architecture that is used for natural language processing tasks. Transformers are able to learn long-range dependencies in text, which makes them well-suited for tasks such as machine translation and text summarization.
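
A minimal sketch of a single Transformer block using PyTorch's built-in layer, which combines self-attention with a feed-forward network; the sizes are illustrative assumptions:

```python
# A single Transformer encoder block in PyTorch; sizes are illustrative.
import torch
import torch.nn as nn

block = nn.TransformerEncoderLayer(d_model=64, nhead=8, batch_first=True)
tokens = torch.randn(2, 20, 64)  # batch of 2 sequences, 20 tokens, 64-dim each
out = block(tokens)              # self-attention + feed-forward + residuals
print(out.shape)                 # torch.Size([2, 20, 64])
```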

BERT - Bidirectional Encoder Representations from Transformers, a neural network architecture that learns long-range dependencies in text by reading it in both directions at once. BERT has been shown to be very effective for a variety of natural language processing tasks.
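
A minimal sketch of using a pretrained BERT for masked-word prediction through the Hugging Face pipeline API (the model downloads on first run):

```python
# Predict a masked word with a pretrained BERT; requires the `transformers` package.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")
for candidate in unmasker("The capital of France is [MASK]."):
    print(candidate["token_str"], round(candidate["score"], 3))
```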

CNN - A convolutional neural network, a type of neural network that learns patterns by sliding small filters over its input, and is used for natural language processing tasks among many others. CNNs are able to learn local patterns in text, which makes them well-suited for tasks such as text classification and sentiment analysis.
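
A minimal sketch of a 1-D convolution sliding over a toy sequence of token embeddings; the filter count and dimensions are illustrative assumptions:

```python
# A 1-D convolution over a toy token-embedding sequence; sizes are illustrative.
import torch
import torch.nn as nn

conv = nn.Conv1d(in_channels=16, out_channels=4, kernel_size=3)  # 4 filters, 3-token window
embeddings = torch.randn(1, 16, 12)  # 1 sentence, 16-dim embeddings, 12 tokens
features = conv(embeddings)          # each filter detects a local 3-token pattern
print(features.shape)                # torch.Size([1, 4, 10])
```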

Dilated Attention - An attention mechanism introduced by Microsoft researchers in the LongNet paper (July 5, 2023). Dilated attention splits a sequence into segments and attends at progressively coarser resolution as the distance between tokens grows, in effect zooming in on nearby context and out on distant context. This lets Transformers scale to sequences of up to one billion tokens, roughly as much text as a human reads in a lifetime.

Generative Adversarial Network (GAN):
A type of computer program that creates new things, such as images or music, by training two neural networks against each other. One network, called the generator, creates new data, while the other network, called the discriminator, checks the authenticity of the data. The generator learns to improve its data generation through feedback from the discriminator, which becomes better at identifying fake data. This back-and-forth process continues until the generator is able to create data that is almost impossible for the discriminator to tell apart from real data.
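
A minimal sketch of one generator-versus-discriminator training step on toy 2-D data; the architectures and data are illustrative assumptions, not a production GAN:

```python
# One adversarial training step for a toy GAN; all sizes are illustrative.
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
discriminator = nn.Sequential(nn.Linear(2, 16), nn.ReLU(),
                              nn.Linear(16, 1), nn.Sigmoid())
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

real = torch.randn(32, 2) + 3.0   # toy "real" data cluster
fake = generator(torch.randn(32, 8))

# Discriminator step: label real samples 1, generated samples 0.
d_loss = bce(discriminator(real), torch.ones(32, 1)) + \
         bce(discriminator(fake.detach()), torch.zeros(32, 1))
d_opt.zero_grad(); d_loss.backward(); d_opt.step()

# Generator step: try to make the discriminator output 1 on fakes.
g_loss = bce(discriminator(fake), torch.ones(32, 1))
g_opt.zero_grad(); g_loss.backward(); g_opt.step()
print(f"d_loss={d_loss.item():.3f}  g_loss={g_loss.item():.3f}")
```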

Generative Art:
Generative art is a form of art that is created using a computer program or algorithm to generate visual or audio output. It often involves the use of randomness or mathematical rules to create unique, unpredictable, and sometimes chaotic results.
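
A minimal sketch of the idea: a random walk rendered as an image, so each seed produces a different piece (assumes the Pillow imaging library; the filename is illustrative):

```python
# A tiny generative-art sketch: a random walk drawn as an image with Pillow.
from PIL import Image, ImageDraw
import random

random.seed(7)
img = Image.new("RGB", (400, 400), "white")
draw = ImageDraw.Draw(img)

x, y = 200, 200
for _ in range(5000):
    nx = min(399, max(0, x + random.randint(-5, 5)))  # random step, kept on canvas
    ny = min(399, max(0, y + random.randint(-5, 5)))
    draw.line((x, y, nx, ny),
              fill=(random.randrange(256), 0, random.randrange(256)))
    x, y = nx, ny

img.save("generative_walk.png")  # a new seed yields a new, unpredictable piece
```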

GPT - Generative Pre-trained Transformer, a large language model (LLM) developed by OpenAI. GPT can be used for a variety of tasks, such as text generation, translation, and question answering.
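
GPT-3 and GPT-4 are accessed through OpenAI's API, but the freely available GPT-2 shows the same idea; a minimal sketch using the Hugging Face pipeline (model downloads on first run):

```python
# Generate text with the small, openly available GPT-2 model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Artificial intelligence is", max_new_tokens=20)
print(result[0]["generated_text"])
```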

Hugging Face Library - A Python library that provides a high-level API for working with a variety of natural language processing models, including GPT, BERT, and RoBERTa. The Hugging Face library makes it easy to load, train, and evaluate natural language processing models.
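
A minimal sketch of loading a pretrained model and tokenizer by name; the checkpoint used here is one published sentiment-analysis model, chosen as an illustrative example:

```python
# Load a pretrained model and tokenizer by name; requires `transformers`.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "distilbert-base-uncased-finetuned-sst-2-english"  # a sentiment model on the Hub
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer("I love this library!", return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # POSITIVE
```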

Hugging Face Hub - A repository that hosts pre-trained natural language processing models for use with the Hugging Face library. The Hugging Face Hub makes it easy to find and use pre-trained models for a variety of natural language processing tasks.
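
A minimal sketch of searching the Hub programmatically, assuming the `huggingface_hub` package is installed:

```python
# List the five most-downloaded text-classification models on the Hub.
from huggingface_hub import list_models

for m in list_models(filter="text-classification", sort="downloads",
                     direction=-1, limit=5):
    print(m.id)
```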

Instruction Tuning - Fine-tuning a pretrained language model on a dataset of instructions paired with example responses, so that it learns to solve problems of all kinds by following natural-language instructions and guides. Models such as GPT-4 rely on this technique.

Large Language Models - A type of artificial intelligence (AI) algorithm that uses deep learning techniques and massively large data sets to understand, summarize, generate, and predict new content. LLMs start with neural nets that relate words to each other, and through weighting functions, they show which words have a stronger relationship in a given context. They are trained on massive amounts of data to produce neural nets that attempt to predict the next word in a sentence. Through concepts such as attention, reinforcement, and transformers, these neural nets become highly accurate at predicting the next word for a given sentence. For example, in 2022, the Google LLM PaLM used 5.4 × 10^11 (540 billion) parameters for training and achieved one of the top performances in AI at the time.
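
A minimal sketch of the next-word prediction described above, inspecting a small model's probabilities for the next token (GPT-2 serves as an illustrative stand-in for a large LLM):

```python
# Inspect next-token predictions from a small causal LM; downloads gpt2 on first run.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The cat sat on the", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]  # scores for the next token
probs = torch.softmax(logits, dim=-1)
for p, idx in zip(*probs.topk(5)):          # five most likely next words
    print(f"{tokenizer.decode(int(idx)):>10}  {p.item():.3f}")
```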

Machine Learning (ML):
A method of teaching computers to learn from data, without being explicitly programmed.

Natural Language Processing (NLP):
A subfield of AI that focuses on teaching machines to understand, process, and generate human language.

Neural Nets - A type of machine learning model that is inspired by the human brain. Neural nets are able to learn complex patterns in data, which makes them well-suited for a variety of tasks, including natural language processing, computer vision, and deep reinforcement learning.
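
A minimal sketch of a two-layer neural net's forward pass in NumPy; the random weights and toy input are illustrative assumptions:

```python
# A tiny two-layer neural network forward pass in NumPy; weights are random toys.
import numpy as np

rng = np.random.default_rng(1)
w1, b1 = rng.normal(size=(3, 5)), np.zeros(5)  # layer 1: 3 inputs -> 5 hidden units
w2, b2 = rng.normal(size=(5, 2)), np.zeros(2)  # layer 2: 5 hidden -> 2 outputs

x = np.array([0.2, -1.0, 0.5])                 # one input example
hidden = np.maximum(0, x @ w1 + b1)            # ReLU "neurons"
output = hidden @ w2 + b2
print(output)
```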

Neural Radiance Fields (NeRF):
Neural Radiance Fields are a type of deep learning model that represents a 3D scene as a neural network mapping a 5D coordinate (a 3D position plus a 2D viewing direction) to a color and a density. Querying this network along camera rays lets a NeRF render photorealistic images of the scene from new viewpoints, a task known as novel view synthesis. NeRFs are inspired by the idea of using a neural network to model the radiance of a scene, which is a measure of the amount of light that is emitted or reflected at each point.
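
A toy sketch of the core idea, an MLP mapping a 5D coordinate to color and density; real NeRFs add positional encoding and volume rendering along camera rays, omitted here for brevity:

```python
# A toy sketch of the NeRF idea: an MLP from (x, y, z, view direction) to
# (r, g, b, density). Sizes and inputs are illustrative.
import torch
import torch.nn as nn

nerf = nn.Sequential(
    nn.Linear(5, 64), nn.ReLU(),  # input: 3-D position + 2 viewing angles
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 4),             # output: RGB color + volume density
)

point = torch.tensor([[0.1, 0.4, -0.2, 0.3, 1.2]])  # one query along a camera ray
print(nerf(point))                                  # color + density at that point
```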

OpenAI:
OpenAI is an artificial intelligence research and deployment company focused on developing and promoting AI technologies that are safe, transparent, and beneficial to society.

Prompt - The text given to a language model to elicit a response. In 2021, a big breakthrough for GPT occurred when it was trained on a dataset of questions and answers; GPT then became an LLM that could understand a question prompt and give reasonable answers. Prompt design and engineering is now a vital part of successfully using this technology.

PyTorch - An open-source machine learning library that is used for a variety of tasks, including natural language processing, computer vision, and deep reinforcement learning. PyTorch is a popular choice for natural language processing because it is easy to use and has a large community of developers.
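
A minimal sketch of PyTorch's central feature, tensors with automatic differentiation:

```python
# PyTorch's core idea: tensors that track gradients automatically.
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()  # y = x0^2 + x1^2
y.backward()        # autograd computes dy/dx
print(x.grad)       # tensor([4., 6.])
```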

RNN - A recurrent neural network, a type of neural network that processes a sequence one step at a time while carrying a hidden state forward, and is used for natural language processing tasks. RNNs (especially gated variants such as the LSTM) are able to learn dependencies across a sequence, which makes them well-suited for tasks such as machine translation and text summarization.
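
A minimal sketch of a recurrent pass using an LSTM, a gated RNN variant; the sizes and random inputs are illustrative assumptions:

```python
# A minimal recurrent pass over a toy sequence with an LSTM.
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
sequence = torch.randn(1, 12, 8)      # 1 sequence, 12 steps, 8 features each
outputs, (h_n, c_n) = lstm(sequence)  # hidden state is carried step to step
print(outputs.shape, h_n.shape)       # torch.Size([1, 12, 16]) torch.Size([1, 1, 16])
```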

TensorFlow - An open-source machine learning library that is developed by Google. TensorFlow is a popular choice for natural language processing because it is scalable and efficient.
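
A minimal sketch of defining, training, and querying a small model with TensorFlow's Keras API; the toy data and sizes are illustrative assumptions:

```python
# A minimal TensorFlow/Keras model fit on toy data; sizes are illustrative.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

x = np.random.rand(64, 4).astype("float32")
y = (x.sum(axis=1) > 2.0).astype("float32")  # toy labels
model.fit(x, y, epochs=3, verbose=0)
print(model.predict(x[:2], verbose=0))
```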