Models

Introduction
A 'model' is the software that serves as an AI storyteller's brain: it is the primary component driving text-generation output. Models use statistics to generate probability distributions over sequences of words, usually after a process of Tokenization.
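To make "probability distributions for sequences of words" concrete, here is a minimal sketch using a toy bigram model over whitespace tokens. This is an illustration of the statistical idea only; real models use subword tokenizers (such as byte-pair encoding) and neural networks rather than raw counts, and the corpus here is invented.

```python
from collections import Counter, defaultdict

# Toy corpus; real models train on billions of tokens.
corpus = "the cat sat on the mat . the cat ate ."

# A crude whitespace tokenizer stands in for real subword tokenization.
tokens = corpus.split()

# Count bigrams to estimate P(next token | current token).
counts = defaultdict(Counter)
for cur, nxt in zip(tokens, tokens[1:]):
    counts[cur][nxt] += 1

def next_token_distribution(token):
    # Normalize counts into a probability distribution.
    c = counts[token]
    total = sum(c.values())
    return {t: n / total for t, n in c.items()}

print(next_token_distribution("the"))  # {'cat': 0.666..., 'mat': 0.333...}
```

Generating text then amounts to repeatedly sampling a next token from this distribution and appending it to the sequence.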

Learning Parameters
Commonly expressed in numbers of parameters, usually in terms of billions. Additional parameters give a model more fidelity, in theory leading to better outputs. However, they are subject to diminishing returns, while the hardware required to hold them becomes prohibitively expensive, hence the lack of large locally hosted models.
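A back-of-the-envelope calculation shows why holding large models locally is expensive. Assuming 2 bytes per parameter (16-bit weights; 32-bit would double the figure), the weights alone require:

```python
# Rough memory needed just to hold model weights, assuming
# 2 bytes per parameter (fp16). Activations, optimizer state,
# and serving overhead add substantially more in practice.
def weight_memory_gb(params, bytes_per_param=2):
    return params * bytes_per_param / 1e9

for name, params in [("Curie (6.7B)", 6.7e9), ("DaVinci (175B)", 175e9)]:
    print(f"{name}: ~{weight_memory_gb(params):.1f} GB")
# Curie (6.7B): ~13.4 GB
# DaVinci (175B): ~350.0 GB
```

At 350 GB for the weights alone, a 175B-parameter model is far beyond consumer GPU memory, which is why such models are served from datacenters rather than hosted locally.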

Finetuning
Finetuning is the process by which a model can be specialized for specific types of output by continuing its training on a curated dataset.
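The intuition can be shown with the toy bigram analogy: continuing to train an existing statistical model on specialized text shifts its output distribution toward that material. This is only an analogy for the idea of continued training; neural finetuning adjusts learned weights via gradient descent, not raw counts, and the corpora below are invented.

```python
from collections import Counter, defaultdict

def train(counts, text):
    # Update bigram counts in place with new training text.
    tokens = text.split()
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    return counts

counts = defaultdict(Counter)
# "Base" training corpus.
train(counts, "the knight rode home . the knight slept .")
# "Finetuning" corpus of specialized text shifts the distribution.
train(counts, "the dragon roared . the dragon flew . the dragon slept .")

print(counts["the"].most_common(1))  # [('dragon', 3)]
```

After the extra pass, "dragon" outweighs "knight" as a continuation of "the": the model's outputs now lean toward the specialized material, which is the effect finetuning aims for.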

OpenAI Models

 * GPT
 * GPT2
 * GPT3
 * Ada (350M), Babbage (1.3B), Curie (6.7B), DaVinci (175B)

EleutherAI Models

 * GPT-Neo
 * GPT-J
 * GPT-NeoX

Other Models

 * AI21
 * Jurassic-1 Large (7.5B), Jumbo (178B)
 * Facebook
 * Fairseq