{ "cells": [ { "cell_type": "markdown", "source": [ "### Character-Level Language Model\n", "\n", "This notebook contains a generative model working at the level of characters.\n" ], "metadata": { "id": "sZ6I5Yr8QUtD" } }, { "cell_type": "code", "execution_count": 48, "metadata": { "id": "RqPkh2KTGUGp" }, "outputs": [], "source": [ "import numpy as np\n", "from numpy.random import randint,rand,seed,normal,permutation,choice\n", "\n", "import string\n", "import math\n", "\n", "import matplotlib.pyplot as plt\n", "from copy import deepcopy\n", "from tqdm import tqdm\n", "\n", "import torch\n", "from torch import nn, optim\n", "import torch.nn.functional as F\n", "from torch.utils.data import random_split,Dataset,DataLoader\n", "\n", "\n", "\n", "# from torchsummary import summary # must install using pip install torchsummary" ] }, { "cell_type": "code", "execution_count": 49, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "RcDYxJGhGUGr", "outputId": "9be52dba-3725-4654-f53b-46316d0de266" }, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount(\"/content/drive\", force_remount=True).\n" ] } ], "source": [ "from google.colab import drive\n", "drive.mount('/content/drive')\n", "\n", "data_dir = 'drive/MyDrive/CS505 Datasets/'" ] }, { "cell_type": "markdown", "source": [ "Load a text file. We chose a poem, to see how it did with line breaks. " ], "metadata": { "id": "FbYPF_0RQw1j" } }, { "cell_type": "code", "execution_count": 50, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 35 }, "id": "2_zGTaCKGUGs", "outputId": "46973f9f-1845-40ff-8287-b68b4eb8556a" }, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "\"Of Man's first disobedience, and the fruit\\nOf that forbidden tree whose mortal taste\\nBrought death i\"" ], "application/vnd.google.colaboratory.intrinsic+json": { "type": "string" } }, "metadata": {}, "execution_count": 50 } ], "source": [ "\n", "with open(data_dir+\"Milton_Paradise_Lost.txt\", \"r\") as text_file:\n", " text = text_file.read()\n", "\n", "text[:100]" ] }, { "cell_type": "markdown", "source": [ "No normalization will be performed, however,\n", "we will run out of RAM if we attempt to\n", "use the entire poem as data. We have chosen\n", "here to use 10K characters, out of a total\n", "of" ], "metadata": { "id": "jvdwU6PeQugU" } }, { "cell_type": "code", "execution_count": 51, "metadata": { "id": "lVfP44ZUGUGs", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "4f393b80-cd76-4f02-a3e7-1c9c5e21ae21" }, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Text is 456475 characters long.\n" ] } ], "source": [ "print(f\"Text is {len(text)} characters long.\")\n", "\n", "size = 10000\n", "\n", "text = text[:size]" ] }, { "cell_type": "markdown", "source": [ "Next we figure out how many distinct characters there are in the text; this\n", "will be what is generated at each step of the generation." 
], "metadata": { "id": "rwgjqUe5R0eX" } }, { "cell_type": "code", "execution_count": 52, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "aYeHZcdzGUGt", "outputId": "ef380ca5-16ff-467a-fc86-5f3719f42b61" }, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "There are 62 characters in the text.\n", "Character set: ['\\n', ' ', '!', '\"', \"'\", '(', ')', ',', '-', '.', ':', ';', '?', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'R', 'S', 'T', 'U', 'V', 'W', 'Y', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z'].\n" ] } ], "source": [ "chars_in_text = sorted(list(set(text)))\n", "\n", "num_chars = len(chars_in_text)\n", "\n", "print(f'There are {num_chars} characters in the text.')\n", "\n", "\n", "print(f'Character set: {chars_in_text}.')\n" ] }, { "cell_type": "code", "source": [ "# Create functions mapping characters to integers and back\n", "\n", "def char2int(c):\n", " return chars_in_text.index(c)\n", "\n", "def int2char(i):\n", " return chars_in_text[i]" ], "metadata": { "id": "gwALsQhiStNT" }, "execution_count": 53, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "68JWvJxdGUGt" }, "source": [ "As we're going to predict the next character in the sequence at each time step, we'll have to divide each sentence into\n", "\n", "- Input data\n", " - The last input character should be excluded as it does not need to be fed into the model\n", "- Target/Ground Truth Label\n", " - One time-step ahead of the Input data as this will be the \"correct answer\" for the model at each time step corresponding to the input data\n", "\n", "The sample length is a critical parameter which tells us how much of the source data to ingest at each training step. You might want to play around with this as one of the hyperparameters." 
] }, { "cell_type": "code", "execution_count": 54, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "lYyi7B_IGUGu", "outputId": "ee9dab6f-b29d-41e5-a2be-78a32eb6f011" }, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Input sequence:\n", "Of Man's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death \n", "Target sequence:\n", "f Man's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death i\n", "\n", "Input sequence:\n", "f Man's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death i\n", "Target sequence:\n", " Man's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death in\n", "\n", "Input sequence:\n", " Man's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death in\n", "Target sequence:\n", "Man's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death int\n", "\n", "Input sequence:\n", "Man's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death int\n", "Target sequence:\n", "an's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death into\n", "\n", "Input sequence:\n", "an's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death into\n", "Target sequence:\n", "n's first disobedience, and the fruit\n", "Of that forbidden tree whose mortal taste\n", "Brought death into \n", "\n" ] } ], "source": [ "sample_len = 100\n", "\n", "# Creating lists that will hold our input and target sample sequences\n", "\n", "input_seq_chars = []\n", "target_seq_chars = []\n", "\n", "for k in range(len(text)-sample_len+1):\n", "\n", " # Remove last character for input sequence\n", " input_seq_chars.append(text[k:k+sample_len-1])\n", "\n", " # Remove firsts character for target sequence\n", " target_seq_chars.append(text[k+1:k+sample_len])\n", "\n", "for i in range(5):\n", " print(f'Input sequence:\\n{input_seq_chars[i]}')\n", " print(f'Target sequence:\\n{target_seq_chars[i]}')\n", " print()\n" ] }, { "cell_type": "markdown", "metadata": { "id": "DYT6SCJ9GUGu" }, "source": [ "Now we can convert our input and target sequences to sequences of integers instead of characters by mapping them using the functions we created above. This will allow us to one-hot-encode our input sequence later." 
] }, { "cell_type": "code", "execution_count": 55, "metadata": { "id": "WJXQF5NAGUGu", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "74125302-b170-4b2c-d9f8-97076382e5b2" }, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "[27, 41, 1, 25, 36, 49, 4, 54, 1, 41, 44, 53, 54, 55, 1, 39, 44, 54, 50, 37, 40, 39, 44, 40, 49, 38, 40, 7, 1, 36, 49, 39, 1, 55, 43, 40, 1, 41, 53, 56, 44, 55, 0, 27, 41, 1, 55, 43, 36, 55, 1, 41, 50, 53, 37, 44, 39, 39, 40, 49, 1, 55, 53, 40, 40, 1, 58, 43, 50, 54, 40, 1, 48, 50, 53, 55, 36, 47, 1, 55, 36, 54, 55, 40, 0, 14, 53, 50, 56, 42, 43, 55, 1, 39, 40, 36, 55, 43, 1]\n" ] } ], "source": [ "input_seq = []\n", "target_seq = []\n", "\n", "for i in range(len(input_seq_chars)):\n", " input_seq.append( [char2int(ch) for ch in input_seq_chars[i]])\n", " target_seq.append([char2int(ch) for ch in target_seq_chars[i]])\n", "\n", "print(input_seq[0])" ] }, { "cell_type": "code", "execution_count": 56, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "3Q5xPFM0GUGu", "outputId": "cd14643b-f18b-4397-9541-1fceb596c4f9" }, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "array([[0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n", " [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 0., 1., 0., 0., 0., 0., 0.]])" ] }, "metadata": {}, "execution_count": 56 } ], "source": [ "# convert an integer into a one-hot encoding of the given size (= number of characters)\n", "def int2OneHot(X,size):\n", "\n", " def int2OneHot1(x,size=10):\n", " tmp = np.zeros(size)\n", " tmp[int(x)] = 1.0\n", " return tmp\n", "\n", " return np.array([ int2OneHot1(x, size) for x in X ]).astype('double')\n", "\n", "int2OneHot( np.array([ 2,3,1,2,3,4 ]),10)" ] }, { "cell_type": "code", "execution_count": 57, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "0v3RGM8AGUGv", "outputId": "55840113-66ca-4b5f-e6b5-64d8c49c16cf" }, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "array([[[0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n", " [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 0., 1., 0., 0., 0., 0., 0.]],\n", "\n", " [[0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n", " [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 0., 1., 0., 0., 0., 0., 0.]],\n", "\n", " [[0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n", " [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n", " [0., 0., 0., 0., 1., 0., 0., 0., 0., 0.]]])" ] }, "metadata": {}, "execution_count": 57 } ], "source": [ "# do the same thing, but for a list/array of integers\n", "\n", "def seq2OneHot(seq,size):\n", " return np.array([ int2OneHot(x, size) for x in seq ])\n", "\n", "seq2OneHot( np.array([[ 2,3,1,2,3,4 ],[ 2,3,1,2,3,4 ],[ 2,3,1,2,3,4 ]]),10)" ] }, { "cell_type": "code", "execution_count": 58, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "nbTrt7AjGUGv", "outputId": "3d839dbb-597a-44c8-c81f-e89a9b55a93b" }, 
"outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "(9901, 99, 62)" ] }, "metadata": {}, "execution_count": 58 } ], "source": [ "# Convert our input sequences to one-hot form\n", "\n", "input_seq = seq2OneHot(input_seq,size=num_chars)\n", "input_seq.shape" ] }, { "cell_type": "code", "execution_count": 59, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "B1KM2AaXGUGv", "outputId": "b7532041-b4a6-4a14-b026-39b090383606" }, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "(9901, 99, 62)" ] }, "metadata": {}, "execution_count": 59 } ], "source": [ "# Convert our target sequences to one-hot form\n", "\n", "target_seq = seq2OneHot(target_seq,size=num_chars)\n", "target_seq.shape" ] }, { "cell_type": "markdown", "metadata": { "id": "YEYJ0NE0GUGv" }, "source": [ "Since we're done with all the data pre-processing, we can now move the data from numpy arrays to tensors." ] }, { "cell_type": "code", "execution_count": 60, "metadata": { "id": "CcrO5N7RGUGv" }, "outputs": [], "source": [ "input_seq = torch.Tensor(input_seq).type(torch.DoubleTensor)\n", "target_seq = torch.Tensor(target_seq).type(torch.DoubleTensor)" ] }, { "cell_type": "markdown", "metadata": { "id": "D5NEt539GUGv" }, "source": [ "Now we will build a data loader to manage the batching." ] }, { "cell_type": "code", "source": [ "class Basic_Dataset(Dataset):\n", "\n", " def __init__(self, X,Y):\n", " self.X = X\n", " self.Y = Y\n", "\n", " def __len__(self):\n", " return len(self.X)\n", "\n", " # return a pair x,y at the index idx in the data set\n", " def __getitem__(self, idx):\n", " return self.X[idx], self.Y[idx]\n", "\n", "ds = Basic_Dataset(input_seq,target_seq)\n", "\n", "ds.__len__()" ], "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "hhkkAfJVVViF", "outputId": "165c3008-fcc8-42b6-901a-805f35fb46c7" }, "execution_count": 61, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "9901" ] }, "metadata": {}, "execution_count": 61 } ] }, { "cell_type": "markdown", "source": [ "Batch size is a hyperparameter that will mostly determine how efficiently you can process the data on a GPU." ], "metadata": { "id": "8FPjzvUbVrAd" } }, { "cell_type": "code", "execution_count": 62, "metadata": { "id": "v4vgBd0jGUGv" }, "outputs": [], "source": [ "batch_size = 128\n", "\n", "data_loader = DataLoader(ds, batch_size=batch_size, shuffle=True)\n" ] }, { "cell_type": "markdown", "metadata": { "id": "_wgOsxHYGUGv" }, "source": [ "Check if a GPU is available and use it if it is." ] }, { "cell_type": "code", "execution_count": 63, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "BFwBQZ1uGUGv", "outputId": "3242cf1c-56d9-4600-922c-91e3248a86ac" }, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "GPU is available\n" ] } ], "source": [ "# torch.cuda.is_available() checks and returns a Boolean True if a GPU is available, else it'll return False\n", "is_cuda = torch.cuda.is_available()\n", "\n", "# If we have a GPU available, we'll set our device to GPU. We'll use this device variable later in our code.\n", "if is_cuda:\n", " device = torch.device(\"cuda\")\n", " print(\"GPU is available\")\n", "else:\n", " device = torch.device(\"cpu\")\n", " print(\"GPU not available, CPU used\")" ] }, { "cell_type": "markdown", "metadata": { "id": "R6u-fhYkGUGv" }, "source": [ "The model will use an LSTM layer and a single linear layer to produce a softmax\n", "of the next character. 
Various hyperparameters can be chosen to modify this\n", "model. A messy detail is that two vectors, h0 (the initial hidden state) and c0 (the initial cell state), have to be created for the LSTM layer (these correspond to the two connections\n", "shown in lecture that an LSTM neuron sends to itself at the next time step). " ] }, { "cell_type": "code", "execution_count": 64, "metadata": { "id": "dzDdSvGAGUGw" }, "outputs": [], "source": [ "class Model(nn.Module):\n", " def __init__(self, input_size, output_size, hidden_dim, n_layers,dropout):\n", " super(Model, self).__init__()\n", "\n", " # Defining some parameters\n", " self.hidden_dim = hidden_dim\n", " self.n_layers = n_layers\n", "\n", " # Defining the layers\n", " self.lstm = nn.LSTM(input_size, hidden_dim, n_layers,dropout=dropout,batch_first=True)\n", " # Fully connected layer\n", " self.fc1 = nn.Linear(hidden_dim, output_size)\n", "\n", " def forward(self, x):\n", "\n", " hidden_state_size = x.size(0) # the batch size\n", "\n", " x = x.to(torch.double)\n", "\n", " # Initial hidden state h0 and cell state c0: one vector per layer, per sample in the batch\n", " h0 = torch.zeros(self.n_layers,hidden_state_size,self.hidden_dim).double().to(device)\n", " c0 = torch.zeros(self.n_layers,hidden_state_size,self.hidden_dim).double().to(device)\n", "\n", " # Make sure the layers are in double precision to match the inputs\n", " self.lstm = self.lstm.double()\n", "\n", " self.fc1 = self.fc1.double()\n", "\n", " # Passing in the input and hidden state into the model and obtaining outputs\n", " out, (hx,cx) = self.lstm(x, (h0,c0))\n", "\n", " # Reshaping the outputs so that they can be fed into the fully connected layer\n", " out = out.contiguous().view(-1, self.hidden_dim)\n", " out = self.fc1(out)\n", "\n", " return out\n", "\n" ] }, { "cell_type": "markdown", "source": [ "Next, we instantiate the model with its hyperparameters, all of which can be\n", "changed." ], "metadata": { "id": "xr571AKQXDQH" } }, { "cell_type": "code", "execution_count": 65, "metadata": { "id": "zNUwEsRDGUGw", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "6331a61c-88fe-441c-ed46-1ce08d06d502" }, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Model(\n", " (lstm): LSTM(62, 256, batch_first=True)\n", " (fc1): Linear(in_features=256, out_features=62, bias=True)\n", ")\n" ] } ], "source": [ "# Instantiate the model with hyperparameters\n", "\n", "model = Model(input_size=num_chars, output_size=num_chars, hidden_dim=256, n_layers=1,dropout=0.0)\n", "\n", "print(model)\n", "\n", "model = model.double().to(device)\n", "\n", "# Define Loss, Optimizer\n", "loss_fn = nn.CrossEntropyLoss()\n", "\n", "optimizer = torch.optim.Adam(model.parameters(), lr=0.001,weight_decay=0.0)\n", "\n" ] }, { "cell_type": "markdown", "source": [ "The following is a minimal training loop. We just track the loss, since accuracy\n", "is not the point of a generative model.\n", "\n", "However, overfitting is very much a problem. You will see that overfitting has occurred when you give the model a prefix of the text as a prompt (say the first line) and in generation it simply spits the text back, having memorized it."
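], "metadata": {} }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you want a quantitative way to watch for overfitting, one option (a sketch only; it is not used in the training run below) is to hold out a slice of the dataset with `random_split`, which is already imported, and check the loss on that held-out slice from time to time. The 10% split and the names `val_loader` and `validation_loss` are just illustrative choices." ] }, { "cell_type": "code", "execution_count": null, "outputs": [], "source": [ "# Sketch: hold out 10% of the samples to monitor overfitting (not used below)\n", "n_val = len(ds) // 10\n", "train_ds, val_ds = random_split(ds, [len(ds) - n_val, n_val])\n", "\n", "train_loader = DataLoader(train_ds, batch_size=batch_size, shuffle=True)\n", "val_loader = DataLoader(val_ds, batch_size=batch_size)\n", "\n", "def validation_loss(model):\n", "    # Average cross-entropy over the held-out batches, without tracking gradients\n", "    model.eval()\n", "    total, count = 0.0, 0\n", "    with torch.no_grad():\n", "        for xb, yb in val_loader:\n", "            xb, yb = xb.to(device), yb.to(device)\n", "            out = model(xb)\n", "            total += loss_fn(out, yb.view(-1, num_chars)).item()\n", "            count += 1\n", "    model.train()\n", "    return total / count"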
], "metadata": { "id": "rDrJKscMXaU0" } }, { "cell_type": "code", "execution_count": 66, "metadata": { "scrolled": false, "colab": { "base_uri": "https://localhost:8080/", "height": 487 }, "id": "HVClRHtXGUGw", "outputId": "9b729a48-5cbf-46cd-f76a-149401ac695a" }, "outputs": [ { "output_type": "stream", "name": "stderr", "text": [ "100%|██████████| 10/10 [01:54<00:00, 11.45s/it]\n" ] }, { "output_type": "execute_result", "data": { "text/plain": [ "[]" ] }, "metadata": {}, "execution_count": 66 }, { "output_type": "display_data", "data": { "text/plain": [ "
" ], "image/png": "iVBORw0KGgoAAAANSUhEUgAAAiMAAAGzCAYAAAD9pBdvAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABAd0lEQVR4nO3deVyVdf7+8escdhVQRBYFXMF93zdcUlvMsprKarJtmkpwqab5astki5I1TTUttufMlMtobplWZgrivmHiAikquIC4cRDksJzz+wNj4pcoIHAfOK/n43H+8D73zbnOMHGux+e8z+eY7Ha7XQAAAAYxGx0AAAA4N8oIAAAwFGUEAAAYijICAAAMRRkBAACGoowAAABDUUYAAIChKCMAAMBQlBEAAGAoyggAADAUZQTANZkzZ45MJpO2b99udBQAtRRlBAAAGIoyAgAADEUZAVDtdu3apRtvvFE+Pj5q0KCBrrvuOm3evLnUOQUFBXrppZcUHh4uT09PNW7cWIMGDdLq1atLzklPT9dDDz2kkJAQeXh4KDg4WLfeequOHDlSw88IQFVyNToAgLpt7969Gjx4sHx8fPTXv/5Vbm5u+uijjzR06FDFxsaqb9++kqTp06crJiZGf/rTn9SnTx9ZLBZt375dO3fu1MiRIyVJd9xxh/bu3auJEyeqRYsWOnXqlFavXq3U1FS1aNHCwGcJ4FqY7Ha73egQAGqvOXPm6KGHHtK2bdvUq1ev391/2223aeXKldq/f79atWolSTp58qTatm2r7t27KzY2VpLUrVs3hYSEaMWKFZd9nPPnz6tRo0Z644039Je//KX6nhCAGsfbNACqTVFRkX744QeNHTu2pIhIUnBwsO69917Fx8fLYrFIkho2bKi9e/fql19+uezP8vLykru7u9atW6dz587VSH4ANYMyAqDaZGZmKjc3V23btv3dfe3bt5fNZlNaWpok6eWXX9b58+cVERGhzp0765lnntHPP/9ccr6Hh4dmzZqlVatWKTAwUJGRkXr99deVnp5eY88HQPWgjABwCJGRkTp06JA+//xzderUSZ9++ql69OihTz/9tOScKVOmKDk5WTExMfL09NQLL7yg9u3ba9euXQYmB3CtKCMAqk2TJk1Ur149JSUl/e6+AwcOyGw2KzQ0tOSYn5+fHnroIc2bN09paWnq0qWLpk+fXuq61q1b6+mnn9YPP/ygxMRE5efn680336zupwKgGlFGAFQbFxcXjRo1SsuWLSv18duMjAzNnTtXgwYNko+PjyTpzJkzpa5t0KCB2rRpI6vVKknKzc1VXl5eqXNat24tb2/vknMA1E58tBdAlfj888/13Xff/e749OnTtXr1ag0aNEgTJkyQq6urPvroI1mtVr3++usl53Xo0EFDhw5Vz5495efnp+3bt2vRokWKjo6WJCUnJ+u6667TXXfdpQ4dOsjV1VVLlixRRkaGxo0bV2PPE0DV46O9AK7Jrx/tLUtaWpoyMzM1bdo0bdiwQTabTX379tWMGTPUv3//kvNmzJih5cuXKzk5WVarVc2bN9f999+vZ555Rm5ubjpz5oxefPFFrVmzRmlpaXJ1dVW7du309NNP684776yJpwqgmlBGAACAoZgZAQAAhqKMAAAAQ1FGAACAoSgjAADAUJQRAABgKMoIAAAwVK3Y9Mxms+nEiRPy9vaWyWQyOg4AACgHu92u7OxsNW3aVGZz2esftaKMnDhxotT3VwAAgNojLS1NISEhZd5fK8qIt7e3pOIn8+v3WAAAAMdmsVgUGhpa8jpellpRRn59a8bHx4cyAgBALXO1EQsGWAEAgKEoIwAAwFCUEQAAYCjKCAAAMBRlBAAAGIoyAgAADEUZAQAAhqKMAAAAQ1FGAACAoSgjAADAUJQRAABgKMoIAAAwlFOXkd1p53XvJ5t1+oLV6CgAADgtpy0jdrtdzy3do42HzujVFfuMjgMAgNNy2jJiMpk0Y2xnmUzS0oQTik3ONDoSAABOyWnLiCR1DW2oBwe0kCQ9t2SPcvMLjQ0EAIATcuoyIklPj2qrpr6eOnbuot7+8Rej4wAA4HScvow08HDVq7d1kiR9uj5FicezDE4EAIBzcfoyIknD2wVqdJdg2ezS1MU/q7DIZnQkAACcBmXkkhfHdJCPp6sSj1s0Z+MRo+MAAOA0KCOXBHh76tmb2kuS3vwhWWlncw1OBACAc6CM/MZdvULVp6WfLhYU6fmlibLb7UZHAgCgzqOM/IbZbFLM7Z3l7mJWbHKmlu8+YXQkAADqPMrI/6d1kwaKHt5GkvTyN/t0Liff4EQAANRtlJHLeHxIa4UHNNCZnHzNXLnf6DgAANRplJHLcHc167U7OkuSFu44po0HTxucCACAuosyUoaezf30x35hkqRnl+xRXkGRwYkAAKibKCNX8Ncb2inQx0NHzuTqn2vYKh4AgOpAGbkCH083vXRL8VbxH8elaP9Ji8GJAACoeygjV3FDpyBd3zFQhTa7pi3eoyIbe48AAFCVKlRGZs+erS5dusjHx0c+Pj7q37+/Vq1adcVrFi5cqHbt2snT01OdO3fWypUrrymwEV66pZO8PVyVkHZe/9l0xOg4AADUKRUqIyEhIXrttde0Y8cObd++XcOHD9ett96qvXv3Xvb8jRs36p577tEjjzyiXbt2aezYsRo7dqwSExOrJHxNCfL11F9vbCdJeuP7JJ04f9HgRAAA1B0m+zXuee7n56c33nhDjzzyyO/uu/vuu5WTk6MVK1aUHOvXr5+6deumDz/8sNyPYbFY5Ovrq6ysLPn4+FxL3Eqz2ey686NN2nH0nEa0D9An43vJZDIZkgUAgNqgvK/flZ4ZKSoq0vz585WTk6P+/ftf9pxNmzZpxIgRpY5df/312rRp0xV/ttVqlcViKXUz2q9bxbu5mPTj/lNalZhudCQAAOqECpeRPXv2qEGDBvLw8NDjjz+uJUuWqEOHDpc9Nz09XYGBgaWOBQYGKj39yi/kMTEx8vX1LbmFhoZWNGa1iAj01hNDWkuSXly+V1kXCwxOBABA7VfhMtK2bVslJCRoy5YteuKJJ/TAAw9o3759VRpq2rRpysrKKrmlpaVV6c+/FhOGtVGrJvWVmW3Va6sOGB0HAIBar8JlxN3dXW3atFHPnj0VExOjrl276p133rnsuUFBQcrIyCh1LCMjQ0FBQVd8DA8Pj5JP7Px6cxSebi6aeVvxVvHztqZq6+GzBicCAKB2u+Z9Rmw2m6xW62Xv69+/v9asWVPq2OrVq8ucMakt+rVqrHG9i986mrb4Z1kL2SoeAIDKqlAZmTZtmuLi4nTkyBHt2bNH06ZN07p163TfffdJksaPH69p06aVnD958mR99913evPNN3XgwAFNnz5d27dvV3R0dNU+CwNMu7G9/Bt46FBmjj5Ye8joOAAA1FoVKiOnTp3S+PHj1bZtW1133XXatm2bvv/+e40cOVKSlJqaqpMnT5acP2DAAM2dO1cff/yxunbtqkWLFmnp0qXq1KlT1T4LA/jWc9P0
W4oHdz9Yd1AHT2UbnAgAgNrpmvcZqQmOsM/I5djtdj3yr+366cAp9WreSP99rL/MZvYeAQBAqoF9RiCZTCa9MraT6rm7aPvRc5q3LdXoSAAA1DqUkWvUrKGX/jKqrSTptZUHlGHJMzgRAAC1C2WkCjwwoIW6hvgq21qo6csv/z09AADg8igjVcDFbFLM7V3kYjZpVWK6ftjLVvEAAJQXZaSKdGjqo0cHt5Ik/W3ZXmXnsVU8AADlQRmpQlNGhKt543pKt+Tp798nGR0HAIBagTJShTzdXDRjbPFW8f/efFQ7jp4zOBEAAI6PMlLFBoX7644eIbLbpWcX71F+oc3oSAAAODTKSDV4bnR7+dV3V1JGtj5Zn2J0HAAAHBplpBr41XfXCze3lyS9s+YXpWReMDgRAACOizJSTcZ2a6bB4f7KL7Tp2SV7VAt23QcAwBCUkWpiMpk0Y2xnebqZtTnlrBZuP2Z0JAAAHBJlpBqFNa6np0ZGSJJmrNyvzGyrwYkAAHA8lJFq9vDAluoQ7KOsiwV6ZcU+o+MAAOBwKCPVzNXFrFl3dJHZJC3ffUJrk04ZHQkAAIdCGakBnUN89fDAlpKk55ckKsdaaHAiAAAcB2Wkhjw5MkLNGnrp+PmL+sfqZKPjAADgMCgjNaS+h6teva2TJOmLDYf187HzxgYCAMBBUEZq0LC2Abqla1PZ7NLUr/eosIit4gEAoIzUsL+N6SBfLzftO2nRZ/GHjY4DAIDhKCM1zL+Bh54bXbxV/Fs/Jiv1TK7BiQAAMBZlxAB39gxR/1aNlVdg03NL2SoeAODcKCMGMJlMmnl7Z7m7mrX+l9NamnDc6EgAABiGMmKQlv71Nfm6cEnSKyv262xOvsGJAAAwBmXEQH+ObKW2gd46m5OvV79lq3gAgHOijBjIzcWsmDs6y2SSFu88rvhfThsdCQCAGkcZMViPsEYa36+5JOnZJXt0Mb/I4EQAANQsyogDeOaGdgr29VTq2Vy9s+YXo+MAAFCjKCMOoIGHq16+tXir+E/Wp2jviSyDEwEAUHMoIw5iZIdA3dQ5SEU2u6Yt3qMiG3uPAACcA2XEgUwf01Henq76+ViW/rXxiNFxAACoEZQRBxLg46mpN7aTJP39hyQdO8dW8QCAuo8y4mDu6R2m3i0aKTe/SH9btpet4gEAdR5lxMGYzSbF3N5Z7i5m/XTglFb8fNLoSAAAVCvKiANqE+CtCcNaS5Je+mavsnILDE4EAED1oYw4qCeGtlabgAY6fSFfMav2Gx0HAIBqQxlxUB6uLoq5vbMkaf62NG1OOWNwIgAAqkeFykhMTIx69+4tb29vBQQEaOzYsUpKSrrqdW+//bbatm0rLy8vhYaG6sknn1ReXl6lQzuL3i38dG/fMEnSs4v3KK+AreIBAHVPhcpIbGysoqKitHnzZq1evVoFBQUaNWqUcnJyyrxm7ty5mjp1ql588UXt379fn332mRYsWKBnn332msM7g/+7oZ0CvD2UcjpH7689aHQcAACqnMl+DZ8dzczMVEBAgGJjYxUZGXnZc6Kjo7V//36tWbOm5NjTTz+tLVu2KD4+vlyPY7FY5Ovrq6ysLPn4+FQ2bq21as9JPfHVTrmaTfp20mC1DfI2OhIAAFdV3tfva5oZycoq/g4VPz+/Ms8ZMGCAduzYoa1bt0qSUlJStHLlSt10001lXmO1WmWxWErdnNkNnYI0on2gCm12TVv8s2xsFQ8AqEMqXUZsNpumTJmigQMHqlOnTmWed++99+rll1/WoEGD5ObmptatW2vo0KFXfJsmJiZGvr6+JbfQ0NDKxqwTTCaTXhnbUfXdXbQz9by+2nLU6EgAAFSZSpeRqKgoJSYmav78+Vc8b926dZo5c6Y++OAD7dy5U4sXL9a3336rV155pcxrpk2bpqysrJJbWlpaZWPWGcG+XvrrDcVbxc/6LknpWQwAAwDqhkrNjERHR2vZsmWKi4tTy5Ytr3ju4MGD1a9fP73xxhslx7788kv9+c9/1oULF2Q2X70POfvMyK+KbHbdMXujEtLOa1SHQH08vpfRkQAAKFO1zIzY7XZFR0dryZIl+umnn65aRCQpNzf3d4XDxcWl5Oeh/FzMJr12R2e5mk36YV+GvktMNzoSAADXrEJlJCoqSl9++aXmzp0rb29vpaenKz09XRcvXiw5Z/z48Zo2bVrJv8eMGaPZs2dr/vz5Onz4sFavXq0XXnhBY8aMKSklKL92QT56bEgrSdKLyxNlyWOreABA7eZakZNnz54tSRo6dGip41988YUefPBBSVJqamqplZDnn39eJpNJzz//vI4fP64mTZpozJgxmjFjxrUld2ITh4dr5Z50HT6do9e/O6BXx3Y2OhIAAJV2TfuM1BRmRn5v46HTuveTLZKkRY/3V68WZX+8GgAAI9TIPiMwzoDW/rqzZ4gkadriPbIWslU8AKB2oozUYs+Nbq/G9d31y6kL+ig2xeg4AABUCmWkFmtYz11/G9NBkvTeTwd18NQFgxMBAFBxlJFa7pauTTW0bRPlF9n0t2WJfFwaAFDrUEZqOZPJpFdu7SQPV7M2Hjqjb/ecNDoSAAAVQhmpA0L96mnC0DaSpFdW7NMFa6HBiQAAKD/KSB3x2JBWCvOrpwyLVe+u+cXoOAAAlBtlpI7wdHPR9FuKh1k/iz+sg6eyDU4EAED5UEbqkOHtAjWifaAKbXb9bdlehlkBALUCZaSOeXFMh5Jh1hU/M8wKAHB8lJE65rfDrK9+yzArAMDxUUbqIIZZAQC1CWWkDvr/h1l/yWCYFQDguCgjddTwdoEa2YFhVgCA46OM1GF/u7l4mHVTyhl9wzArAMBBUUbqsFC/eooaVjzMOoNhVgCAg6KM1HF/jmyl5o2Lh1n/yTArAMABUUbquOJh1o6SpM8ZZgUAOCDKiBMY1jaAYVYAgMOijDgJhlkBAI6KMuIkQv3qKfrSMOurKxhmBQA4DsqIE3n00jDrqWyr3vkx2eg4AABIoow4ld8Os36x4YiSGWYFADgAyoiTGdY2QKNKhlkTGWYFABiOMuKEXri5gzzdzNqcclbLd58wOg4AwMlRRpxQqF89RQ39dWfW/crOKzA4EQDAmVFGnNSjka3U4tIwKzuzAgCMRBlxUqV2Zt1wREnpDLMCAIxBGXFiQ9sG6PqOgSpimBUAYCDKiJP7dZh1y2GGWQEAxqCMOLmQRv/bmZVhVgCAESgjKDXM+s6PDLMCAGoWZQTycP3NzqwbGWYFANQsyggkMcwKADAOZQQlGGYFABiBMoISIY3qaeLwcEnSqwyzAgBqCGUEpfxpcEu19K+vzGyr3maYFQBQAypURmJiYtS7d295e3srICBAY8eOVVJS0lWvO3/+vKKiohQcHCwPDw9FRERo5cqVlQ6N6vPbYdY5DLMCAGpAhcpIbGysoqKitHnzZq1evVoFBQUaNWqUcnJyyrwmPz9fI0eO1JEjR7Ro0SIlJSXpk08+UbNmza45PKrHkIgmuqFjkIpsdr3AMCsAoJq5VuTk7777rtS/58yZo4CAAO3YsUORkZG
Xvebzzz/X2bNntXHjRrm5uUmSWrRoUbm0qDEvjOmgdcmntPXwWS1LOKGx3SmPAIDqcU0zI1lZWZIkPz+/Ms9Zvny5+vfvr6ioKAUGBqpTp06aOXOmioqKyrzGarXKYrGUuqFmNWvoVTLMOmMlw6wAgOpT6TJis9k0ZcoUDRw4UJ06dSrzvJSUFC1atEhFRUVauXKlXnjhBb355pt69dVXy7wmJiZGvr6+JbfQ0NDKxsQ1YJgVAFATTPZKDgQ88cQTWrVqleLj4xUSElLmeREREcrLy9Phw4fl4uIiSfrHP/6hN954QydPnrzsNVarVVarteTfFotFoaGhysrKko+PT2XiopJikzP1wOdb5WI26dtJg9QuiP/9AQDlY7FY5Ovre9XX70qtjERHR2vFihVau3btFYuIJAUHBysiIqKkiEhS+/btlZ6ervz8/Mte4+HhIR8fn1I3GGNIRBPd2Kl4mPVvS/cyzAoAqHIVKiN2u13R0dFasmSJfvrpJ7Vs2fKq1wwcOFAHDx6UzWYrOZacnKzg4GC5u7tXPDFq3PM3d5CXm4u2HjmrpQnHjY4DAKhjKlRGoqKi9OWXX2ru3Lny9vZWenq60tPTdfHixZJzxo8fr2nTppX8+4knntDZs2c1efJkJScn69tvv9XMmTMVFRVVdc8C1apZQy9FD28jSZq58oAsDLMCAKpQhcrI7NmzlZWVpaFDhyo4OLjktmDBgpJzUlNTS82ChIaG6vvvv9e2bdvUpUsXTZo0SZMnT9bUqVOr7lmg2v1pcEu1+nWYdTXDrACAqlPpAdaaVN4BGFSvuORMjWeYFQBQTtU6wArnFBnRRDd1ZpgVAFC1KCOokOdHM8wKAKhalBFUSNOGXpp4XfEw64xvGWYFAFw7yggq7E+DWqlVk/o6fcGqt1YnGx0HAFDLUUZQYe6uZr10S0dJ0r83HdX+k3x3EACg8igjqJTB4b8ZZl2WyDArAKDSKCOotF+HWbcdOacluxhmBQBUDmUElda0oZcmXRcuiZ1ZAQCVRxnBNXlkUEuGWQEA14Qygmvi7mrWy7d0kiT9a+MR7TvBMCsAoGIoI7hmg8L9NbpzsGx2McwKAKgwygiqxPM3t1c9dxdtP3pOi3cyzAoAKD/KCKpEsO//hlljVu1X1kWGWQEA5UMZQZV5eGBLtW5SX6cv5DPMCgAoN8oIqkzxzqzFw6z/3sQwKwCgfCgjqFKDwv01ugvDrACA8qOMoMo9P5phVgBA+VFGUOUYZgUAVARlBNWCYVYAQHlRRlAt3F3NevlWhlkBAFdHGUG1Gdim9DCrzcYwKwDg9ygjqFalhll3McwKAPg9ygiqVbCvlyb/Osy6kmFWAMDvUUZQ7R4a2FJtAhroTA7DrACA36OMoNq5u5r18i0dJRUPs+49kWVwIgCAI6GMoEYMaOOvm0uGWfcyzAoAKEEZQY15fnQH1XN30Q6GWQEAv0EZQY0J8vXUlBEMswIASqOMoEb9dpj1Hz8kGR0HAOAAKCOoUW4u/xtm/c/mowyzAgAoI6h5A9r4a0zXpgyzAgAkUUZgkOduaq/6l4ZZv955zOg4AAADUUZgiCBfT02+NMz62qoDysplmBUAnBVlBIZ5aGBLhV8aZn1zNcOsAOCsKCMwjJuLWS/dWjzM+uXmo0o8zjArADgjyggMNaC1v24pGWZNZJgVAJwQZQSGe2508TDrztTzWsQwKwA4HcoIDBfo46kpIyIkMcwKAM6oQmUkJiZGvXv3lre3twICAjR27FglJZV/8HD+/PkymUwaO3ZsRXOijntwYAuFBzTQ2Zx8/fk/25VhyTM6EgCghlSojMTGxioqKkqbN2/W6tWrVVBQoFGjRiknJ+eq1x45ckR/+ctfNHjw4EqHRd3l5mLWa3d0Vj13F205fFY3vrNeaw+cMjoWAKAGmOx2e6UnBjMzMxUQEKDY2FhFRkaWeV5RUZEiIyP18MMPa/369Tp//ryWLl1a5vlWq1VWq7Xk3xaLRaGhocrKypKPj09l46IWSMm8oOi5u7TvpEWS9KdBLfXXG9rJ3ZV3FAGgtrFYLPL19b3q6/c1/YXPyir+KKafn98Vz3v55ZcVEBCgRx55pFw/NyYmRr6+viW30NDQa4mJWqRVkwZaEjVADw5oIUn6NP6w7pi9UUdOX331DQBQO1V6ZcRms+mWW27R+fPnFR8fX+Z58fHxGjdunBISEuTv768HH3yQlRGUy4/7MvTMot06l1ug+u4umnFbZ43t3szoWACAcqr2lZGoqCglJiZq/vz5ZZ6TnZ2t+++/X5988on8/f3L/bM9PDzk4+NT6gbnM6JDoFZOHqw+Lf2Uk1+kKQsS9PR/dyvHWmh0NABAFarUykh0dLSWLVumuLg4tWzZsszzEhIS1L17d7m4uJQcs9lskiSz2aykpCS1bt36qo9X3maFuqnIZtd7Px3UO2uSZbNLrfzr6917u6tjU1+jowEArqC8r98VKiN2u10TJ07UkiVLtG7dOoWHh1/x/Ly8PB08eLDUseeff17Z2dl65513FBERIXd396s+LmUEkrQl5YymLEjQyaw8ubuY9exN7fTAgBYymUxGRwMAXEZ5X79dK/JDo6KiNHfuXC1btkze3t5KT0+XJPn6+srLy0uSNH78eDVr1kwxMTHy9PRUp06dSv2Mhg0bStLvjgNX07dVY62cNFjPLPpZP+7P0PRv9in+4Bm98YcualT/6qUWAOCYKjQzMnv2bGVlZWno0KEKDg4uuS1YsKDknNTUVJ08ebLKgwKS1Ki+uz4Z31Mv3dJR7i5m/bg/Qze+s15bUs4YHQ0AUEnXtM9ITeFtGlzO3hNZmjhvl1Iyc2Q2SZOuC9fE4eFyMfO2DQA4ghrZZwQwUsemvvomepD+0DNENrv09o+/6J5PNutk1kWjowEAKoAyglqtvoer/n5nV719dzfVd3fR1ktbya/el2F0NABAOVFGUCeM7d5M304arM7NfHU+t0CP/nu7pi/fq7yCIqOjAQCugjKCOqOFf319/cQAPTq4eO+bORuP6PYPNupQ5gWDkwEAroQygjrF3dWs50Z30BcP9pZffXftO2nRmHfjtWjHMdWCWW0AcEqUEdRJw9oFaNXkwRrQurFy84v0l4W79dR/d+sCW8kDgMOhjKDOCvTx1H8e6au/jIqQi9mkJbuO6+Z/rteeY1lGRwMA/AZlBHWai9mk6OHhWvDnfmrW0EtHzuTq9tkb9On6FNlsvG0DAI6AMgKn0KuFn1ZOGqwbOgapoMiuV7/dr0f+tU1nLliNjgYATo8yAqfhW89Ns//YQ6+O7SR3V7PWJmXqxnfWa+Oh00ZHAwCnRhmBUzGZTPpjv+ZaHj1QbQIa6FS2Vfd9ukVv/pCkwiKb0fEAwClRRuCU2gX5aHn0QI3rHSq7XXr3p4Ma9/FmHT/PVvIAUNMoI3Ba9dxd9dodXfTuPd3l7eGq7UfP6ca34/RdIt86DQA1iTICpzema1N9O2mwuoY2lCWvUI9/uVPPL93DVvIAUEMoI4CksMb1tOjx/npsSCtJ0pebUzX2/Q06eC
rb4GQAUPdRRoBL3FzMmnZje/3r4T7yb+CuA+nZuvndeC3YlspW8gBQjSgjwP9nSEQTrZw8WIPD/ZVXYNP/fb1HE+ftkiWvwOhoAFAnUUaAywjw9tS/Huqj/7uhnVzNJq34+aRG/3O9EtLOGx0NAOocyghQBrPZpCeGttZ/H++vkEZeSjt7UX+YvVEfxR5iK3kAqEKUEeAqeoQ10reTBmt052AV2uyKWXVAD87ZpsxstpIHgKpAGQHKwdfLTe/d210xt3eWp5tZccnFW8mv/yXT6GgAUOtRRoByMplMuqdPmJZHD1JEYAOdvmDV+M+3atZ3B1TAVvIAUGmUEaCCIgK9tTx6kO7rGya7XZq97pDu+miT0s7mGh0NAGolyghQCZ5uLppxW2fNvq+HvD1dtSv1vG56Z72+/Zmt5AGgoigjwDW4sXOwVk4arB5hDZVtLVTU3J2atvhnXcxnK3kAKC/KCHCNQv3qacFj/RU1rLVMJmne1jTd8l68NqecYedWACgHk70W/LW0WCzy9fVVVlaWfHx8jI4DlGnDwdOasiCh5GO/rZvU1z19wnR7jxD51Xc3OB0A1Kzyvn5TRoAqdvqCVW/+kKRlCSeUe+ntGncXs0Z1DNQ9fcLUv1Vjmc0mg1MCQPWjjAAGy84r0PLdJzR/a5r2HM8qOd68cT3d3TtUf+gZogBvTwMTAkD1oowADiTxeJbmb0vV0l0ndMFaKElyNZs0on2gxvUJ1eDwJnJhtQRAHUMZARxQbn6hVvx8UvO3pmpn6vmS480aeunu3qG6s1eIgn29jAsIAFWIMgI4uKT0bM3bmqolu44r62KBJMlskoa1DdC4PmEa1raJXF34wBuA2osyAtQSeQVF+i4xXfO2pmrL4bMlxwN9PHRXr1Dd1StUoX71DEwIAJVDGQFqoUOZF7RgW5oW7Timszn5kiSTSRrUxl/39AnTiPaBcndltQRA7UAZAWqx/EKbVu/L0LytqYo/eLrkuH8Dd93RM0TjeoeppX99AxMCwNVRRoA6IvVMrhZsT9XC7cd06tJmapLUr5Wf7ukTpus7BsnTzcXAhABweZQRoI4pLLLppwOnNG9rqmKTM2W79F9uw3puur17iO7pE6rwQG9jQwLAb5T39btCbz7HxMSod+/e8vb2VkBAgMaOHaukpKQrXvPJJ59o8ODBatSokRo1aqQRI0Zo69atFXlYAJJcXcwa1TFIXzzUR/H/N1xTRoSrqa+nzucW6PMNhzXyrTj9YfZGLdpxjC/qA1CrVGhl5IYbbtC4cePUu3dvFRYW6tlnn1ViYqL27dun+vUv//71fffdp4EDB2rAgAHy9PTUrFmztGTJEu3du1fNmjUr1+OyMgJcXpHNrrjkTM3bmqo1B06p6NJyibenq8Z2a6ZxfULVsamvwSkBOKsaeZsmMzNTAQEBio2NVWRkZLmuKSoqUqNGjfTee+9p/Pjx5bqGMgJc3SlLnhbuOKb521KVdvZiyfGuIb4a1ydMY7o2VQMPVwMTAnA25X39vqa/TFlZxd+34efnV+5rcnNzVVBQcMVrrFarrNb/DepZLJbKhwScRICPp6KGtdETQ1pr46EzmrctVT/sTdfuY1nafWyPXl2xT7d0a6pxvcPUJcRXJhPbzwNwDJVeGbHZbLrlllt0/vx5xcfHl/u6CRMm6Pvvv9fevXvl6Xn5LwmbPn26Xnrppd8dZ2UEqJgzF6z6eucxzd+appTTOSXH2wf76J4+obq1WzP5erkZmBBAXVbtb9M88cQTWrVqleLj4xUSElKua1577TW9/vrrWrdunbp06VLmeZdbGQkNDaWMAJVkt9u19fBZzd+Wpm/3nFR+oU2S5Olm1ujOTXVPn1D1bN6I1RIAVapay0h0dLSWLVumuLg4tWzZslzX/P3vf9err76qH3/8Ub169arQ4zEzAlSd87n5WrLruOZvTVNSRnbJ8fCABhrXJ0y3d2+mRvXdDUwIoK6oljJit9s1ceJELVmyROvWrVN4eHi5rnv99dc1Y8YMff/99+rXr195H64EZQSoena7XbvSzmvellSt+PmkLhYUfxzY3cWsGzoFaVyfUPVv1ZjVEgCVVi1lZMKECZo7d66WLVumtm3blhz39fWVl1fx156PHz9ezZo1U0xMjCRp1qxZ+tvf/qa5c+dq4MCBJdc0aNBADRo0qNInA6ByLHkFWp5wQvO2pmrvif8NjLdoXE+PD2mtu3uHUkoAVFi1lJGy/hh98cUXevDBByVJQ4cOVYsWLTRnzhxJUosWLXT06NHfXfPiiy9q+vTp5XpcyghQc/Ycy9K8balannBCF6yFkoq/qO+1OzorpBHfHgyg/NgOHsA1ybEW6qstR/XmD8myFtrUwMNVz49uzyoJgHKrlu3gATiP+h6u+nNka62aPFg9mzfSBWuhpi7eowe+2KaTWRev/gMAoJwoIwCuqFWTBvrvY/313E3t5e5qVlxypkb9I07/3Z6mWrCwCqAWoIwAuCoXs0mPRrbSykmD1S20obKthfrrop/18JxtSs/KMzoegFqOMgKg3NoENNCix/tr6o3t5O5i1tqkTI16K1Zf7zjGKgmASqOMAKgQVxezHh/SWt9OGqQuIb6y5BXq6YW79ei/t+uUhVUSABVHGQFQKeGB3lr8xAA9c31bubmY9OP+Uxr5VpyW7jrOKgmACqGMAKg0Vxezooa10TcTB6lTMx9lXSzQlAUJeuw/O5SZbb36DwAAUUYAVIF2QT5aMmGgnhoZIVezST/sy9Cot2L1ze4TrJIAuCrKCIAq4eZi1qTrwrU8epDaB/voXG6BJs7bpQlf7dTpC6ySACgbZQRAlerQ1EfLogZq8nXhcjWbtCoxXaPeitPKPSeNjgbAQVFGAFQ5d1eznhwZoaVRA9UuyFtnc/I14audip67U2dz8o2OB8DBUEYAVJtOzXy1PHqQJg5vIxezSSt+PqlRb8Xqu8R0o6MBcCCUEQDVyt3VrKdHtdWSCQMUHtBApy/k6/Evd2jy/F06xyoJAFFGANSQLiENtWLSID0xtLXMJmlZwgmNejtOq/dlGB0NgMEoIwBqjIeri/7vhnb6+okBat2kvjKzrXr039v11IIEZeUWGB0PgEEoIwBqXPewRvp20mA9FtlKJpO0eNdxjXo7Vj8dYJUEcEaUEQCG8HRz0bSb2mvR4/3Vyr++MixWPTxnu55ZuFtZF1klAZwJZQSAoXo299PKyYP1p0EtZTJJC3cc0/VvxWld0imjowGoIZQRAIbzdHPR8zd30H8f668Wjesp3ZKnB7/Ypqlf/6zsPFZJgLqOMgLAYfRu4adVkyP10MAWkqT529J0/VtxWv9LprHBAFQryggAh+Ll7qIXx3TU/D/3U5hfPZ3IytP9n23Vs0v26IK10Oh4AKoBZQSAQ+rXqrFWTR6s8f2bS5LmbknV9W/FaePB0wYnA1DVKCMAHFZ9D1e9fGsnzX20r0Iaeen4+Yu699MtemFponJYJQHqDMoIAIc3oLW/vpsSqfv6hkmS/rP5qG54J06bU84YnAxAVaCMAKgVGni4asZtn
fXlI33VrKGX0s5e1LiPN2v68r3KzWeVBKjNKCMAapVB4f76bspg3dMnVJI0Z+MR3fjOem07ctbgZAAqizICoNbx9nRTzO1d9K+H+yjY11NHz+Tqro826ZUV+3Qxv8joeAAqiDICoNYaEtFE3z8Zqbt6hchulz6LP6zR/1yvHUfPGR0NQAVQRgDUaj6ebnr9D131xYO9FejjoZTTObrzw42auXK/8gpYJQFqA8oIgDphWLsA/TBliG7v0Uw2u/RxXIpG/3O9dqWySgI4OsoIgDrDt56b/nFXN306vpeaeHvoUGaO7pi9Ua+tOsAqCeDATHa73W50iKuxWCzy9fVVVlaWfHx8jI4DoBY4n5uv6cv3amnCCUmSX313DQ7315CIJhoc3kRNvD0MTgjUfeV9/aaMAKjTvt+brueXJioz21rqeMemPoqMaKIhEU3UI6yR3F1ZKAaqGmUEAC4pKLJp59FzivslU7HJmUo8bil1f313Fw1o46/IiCYaGtFEoX71DEoK1C2UEQAoQ2a2VfEHMxWXfFpxyZk6k5Nf6v6W/vU1JKKJIiP81a9VY9VzdzUoKVC7UUYAoBxsNrv2nbQoNrl41WTn0XMqtP3vz6K7i1m9Wza6VE6aqG2gt0wmk4GJgdqDMgIAlZCdV6CNh84oNjlTccmZOnbuYqn7A308FBneREPaNtGgNv5qWM/doKSA46uWMhITE6PFixfrwIED8vLy0oABAzRr1iy1bdv2itctXLhQL7zwgo4cOaLw8HDNmjVLN910U5U/GQCoSna7XSmncxR3adVkc8oZ5RXYSu43m6SuoQ1LyknXkIZyMbNqAvyqWsrIDTfcoHHjxql3794qLCzUs88+q8TERO3bt0/169e/7DUbN25UZGSkYmJidPPNN2vu3LmaNWuWdu7cqU6dOlXpkwGA6pRXUKRtR86WlJPkjAul7vf1ctOgcH8NCS9+SyfI19OgpIBjqJG3aTIzMxUQEKDY2FhFRkZe9py7775bOTk5WrFiRcmxfv36qVu3bvrwww/L9TiUEQCO6GTWRcUlFw/Crv8lU5a8wlL3tw30VmSEv4ZEBKhXi0bydHMxKClgjPK+fl/TiHhWVpYkyc/Pr8xzNm3apKeeeqrUseuvv15Lly4t8xqr1Sqr9X97AlgsljLPBQCjBPt66e7eYbq7d5gKi2zafSyrZNVk97HzSsrIVlJGtj5Zf1iebmb1b9W4ZG+Tlv71GYQFLql0GbHZbJoyZYoGDhx4xbdb0tPTFRgYWOpYYGCg0tPTy7wmJiZGL730UmWjAUCNc3Uxq2fzRurZvJGeHBmhczn5ij94uqScnMq2am1SptYmZUqSQhp5lXxCZ0DrxvL2dDP4GQDGqXQZiYqKUmJiouLj46syjyRp2rRppVZTLBaLQkNDq/xxAKC6NKrvrjFdm2pM16ay2+1KyshWbFKm4n7J1LbD53Ts3EV9tSVVX21JlavZpB7Niz8+PCSiiToE+8jMICycSKXKSHR0tFasWKG4uDiFhIRc8dygoCBlZGSUOpaRkaGgoKAyr/Hw8JCHB98bAaBuMJlMahfko3ZBPnpsSGvl5hdqc8qZS+XktA6fztHWw2e19fBZvfF9kvwbuGtweHExGRTuL/8G/D1E3VahAVa73a6JEydqyZIlWrduncLDw696zd13363c3Fx98803JccGDBigLl26MMAKAJJSz+Qq9pfifU02HjytnPzS3zDcuZlvySBs97CGcnPhe3RQO1TLp2kmTJiguXPnatmyZaX2FvH19ZWXl5ckafz48WrWrJliYmIkFX+0d8iQIXrttdc0evRozZ8/XzNnzuSjvQBwGfmFNu1MPVe8I2xSpvadLD3A38TbQ38a1FL39WuuBh5sUw/HVi1lpKzJ7y+++EIPPvigJGno0KFq0aKF5syZU3L/woUL9fzzz5dsevb666+z6RkAlMOp7DytTz6tuF8ytf6X0zp76Xt0fL3c9NDAFnpwQAt2gYXDYjt4AKhj8gttWpZwXLPXHVLK6RxJxd84/Md+zfXI4JYK8GaTNTgWyggA1FFFNrtWJZ7U+2sPaf+lt3HcXc26u1eoHhvSSiGN6hmcEChGGQGAOs5ut2tt0im999NB7Uw9L0lyNZt0a7dmemJoa7UJaGBsQDg9yggAOAm73a7NKWf1/tqDij94WpJkMkk3dgrShKFt1KmZr8EJ4awoIwDghBLSzuv9tQe1et//9nca1raJooa1Ua8WZX91B1AdKCMA4MQOpFs0e90hfbP7hGyX/sr3bemn6OFtNKiNP9+LgxpBGQEA6MjpHH0Ye0hf7zymgqLiP/ddQ3w1YVgbjWwfyLbzqFaUEQBAiZNZF/VxXIrmbU1VXoFNkhQR2EAThrbRzV2C5cqurqgGlBEAwO+cvmDVFxsO698bjyrbWihJCvOrp8eHtNYdPZvJw9XF4ISoSygjAIAyZV0s0H82HdHnG46U7Ooa5OOpRyNb6Z4+oarnzlbzuHaUEQDAVeXmF2re1jR9HHdIGRarJMmvvrseHthC9/dvIV8vN4MTojajjAAAys1aWKTFO4u3mk89mytJ8vZw1f39m+vhQS3l38DD4ISojSgjAIAKKyyy6ds9J/X+2oNKzrggSfJ0M2tc7zA9NqSVgn29DE6I2oQyAgCoNJvNrtX7M/T+2oP6+ViWJMnNxaQ7eoTo8SGt1cK/vsEJURtQRgAA18xutyv+4Gm999NBbTl8VpJkNkk3d2mqCcNaq10Qf5NRNsoIAKBKbT9S/P03a5MyS46NaB+o6OFt1C20oXHB4LAoIwCAapF4PEsfrDuoVYnp+vUVZFAbf00Y1lr9WzVmq3mUoIwAAKrVwVMXNHvdIS1NOK6iS1+A0yOsoaKGtdHwdgGUElBGAAA1I+1srj6OS9GC7WnKLyzear59sI+ihrXWjZ2C5cL33zgtyggAoEadsuTps/jD+nLzUeXkF0mSWvnX1+NDW+u27s3kxvffOB3KCADAEOdz8/XFhiOas/GIsi4WSJKaNfTSnyNb6e7eofJ04/tvnAVlBABgqAvWQn21+ag+WX9Ypy8UbzXv38BDfxrcUvf1DZO3J1vN13WUEQCAQ8grKNLC7Wn6MDZFx89flFS81fzY7s10T58wdWjK3/W6ijICAHAoBUU2LUs4oQ/WHVRKZk7J8W6hDXVvnzDd3DWYbwuuYygjAACHZLPZtfHQGc3delQ/7M1Q4aWPBf+6WnJv3zC1D+ZvfV1AGQEAOLzMbKsW7TimeVtTS74tWLq0WtI3TDd3YbWkNqOMAABqjSutltzWo3i2hNWS2ocyAgColTKzrVq4I03zt6aVWi3pHtZQ9/RhtaQ2oYwAAGo1m82uDYdOa97WVFZLainKCACgzrjaasmYLk3l5c5mao6GMgIAqHPKXC3xdNVtlz6J0y6I1wlHQRkBANRpp7LztGjHscuultzbJ0w3s1piOMoIAMAp/LpaMndLqlbvK71acnv3ZrqH1RLDUEYAAE7nVHaeFm4/pvnbUpV29mLJcVZLjEEZAQA4LVZLHANlBAAAlb1a0qNk
3xJWS6oLZQQAgN+w2eyKP1j8SZzLrZbc27e52gZ5G5yybqGMAABQhiutltzbt7lGdw5mtaQKlPf121zRHxwXF6cxY8aoadOmMplMWrp06VWv+eqrr9S1a1fVq1dPwcHBevjhh3XmzJmKPjQAAFUiwNtTUcPaKPYvw/Tvh/voxk5BcjWbtDP1vP6ycLf6zPxRLy5LVFJ6ttFRnUKFy0hOTo66du2q999/v1znb9iwQePHj9cjjzyivXv3auHChdq6daseffTRCocFAKAqmc0mRUY00ew/9tTGqcP1zPVtFernpey8Qv1r01Fd/3ac7pi9UYt2HNPF/CKj49ZZ1/Q2jclk0pIlSzR27Ngyz/n73/+u2bNn69ChQyXH3n33Xc2aNUvHjh0r1+PwNg0AoKb8Olsyd0uqftz/v9kSH09X3d4jRPf0CWO2pJyq7W2aiurfv7/S0tK0cuVK2e12ZWRkaNGiRbrpppvKvMZqtcpisZS6AQBQE35dLfnw/v+tloQ08pIlr1BzNh4ptVqSV8BqSVWo9jIycOBAffXVV7r77rvl7u6uoKAg+fr6XvFtnpiYGPn6+pbcQkNDqzsmAAC/E+BTPFsS98ww/evhPrqhY5BczCbtOHpOf1m4W/1i1uj9tQd1wVpodNRardrfptm3b59GjBihJ598Utdff71OnjypZ555Rr1799Znn3122WusVqusVmvJvy0Wi0JDQ3mbBgBguFOWPC3ccUzztqbq2LniT+L41XfXnyNbaXz/5qrn7mpwQsdRIx/tLU8Zuf/++5WXl6eFCxeWHIuPj9fgwYN14sQJBQcHX/VxmBkBADiaIptd3+w+oXfW/KLDp3MkSf4N3PX4kNa6r29zPhosB5oZyc3Nldlc+mFcXIp/QbVgixMAAC7LxWzS2O7NtPrJSP39zq4K86un0xfy9eq3+xX5xlp9seEwMyXlVOEycuHCBSUkJCghIUGSdPjwYSUkJCg1NVWSNG3aNI0fP77k/DFjxmjx4sWaPXu2UlJStGHDBk2aNEl9+vRR06ZNq+ZZAABgEFcXs/7QM0Rrnh6iWXd0VrOGXsrMtuqlb/Zp6Bvr9J9NR2QtpJRcSYXfplm3bp2GDRv2u+MPPPCA5syZowcffFBHjhzRunXrSu5799139eGHH+rw4cNq2LChhg8frlmzZqlZs2blekzepgEA1Bb5hTYt3JGm9346qJNZeZKkpr6eih4erj/0DJG7a7W/KeEw2A4eAAADWQuLtGBbmt5fe1AZluIPZYQ08tKk4eG6vUczubrU/VJCGQEAwAHkFRRp7pZUfbDukE5fKC4lzRvX0+TrwnVL16Z1upRQRgAAcCAX84v05eaj+jD2kM7k5EuSWjWpr8nXhevmLk3lYjYZnLDqUUYAAHBAOdZC/XvTUX0Ud0jncwskSW0CGmjKiHDd1ClY5jpUSigjAAA4sOy8Av1r4xF9HJciS17xDq5tA7315MhwjeoQVCdKCWUEAIBawJJXoM/jD+uz9YeVfWlb+Q7BPnpyZIRGtA+QyVR7SwllBACAWiQrt0Cfxqfo8/jDyskv3pekS4ivnhwRoaFtm9TKUkIZAQCgFjqbk69P1qdozoYjunhpB9duoQ311MgIDQ73r1WlhDICAEAtdvqCVR/Hpejfm44or8AmSerVvJGeGhmhAW38DU5XPpQRAADqgFPZefpwXYq+3HJU+YXFpaRfKz89OSJCfVs1NjjdlVFGAACoQzIsefpg7UHN25qm/KLiUjKwTWM9NTJCPZv7GZzu8igjAADUQSfOX9T7aw/qv9vTVFBU/BIeGdFET44IV/ewRganK40yAgBAHZZ2Nlfvrz2ohTuOqchW/FI+vF2AnhwRoc4hvganK0YZAQDACRw9k6N3fzqoxTuP6VIn0cgOgZoyIlwdmxpbSigjAAA4kZTMC3r3p4NamnBcv76y39gpSFNGRKhtkLchmSgjAAA4oYOnsvXOmoNa8fMJ2e2SySSN7hysKSPC1SagZksJZQQAACeWlJ6td9Yka+WedEnFpeTWrk016bpwtWrSoEYyUEYAAID2nbDo7R+T9cO+DEmS2STd3iNEk4aHK6xxvWp9bMoIAAAosedYlt7+MVlrDpySJLmYTfpDjxBFD2+jUL/qKSWUEQAA8DsJaef11upkxSZnSpJczSbd1TtU0cPaqGlDryp9rPK+fpur9FEBAIBD6xbaUP96uI++fqK/BrXxV6HNrrlbUrV89wnDMrka9sgAAMAwPZv76cs/9dWWlDP6YsMRje/f3LAslBEAAJxY31aNDf/CPd6mAQAAhqKMAAAAQ1FGAACAoSgjAADAUJQRAABgKMoIAAAwFGUEAAAYijICAAAMRRkBAACGoowAAABDUUYAAIChKCMAAMBQlBEAAGCoWvGtvXa7XZJksVgMTgIAAMrr19ftX1/Hy1Irykh2drYkKTQ01OAkAACgorKzs+Xr61vm/Sb71eqKA7DZbDpx4oS8vb1lMpmq7OdaLBaFhoYqLS1NPj4+VfZzUXn8ThwLvw/Hwu/DsfD7uDq73a7s7Gw1bdpUZnPZkyG1YmXEbDYrJCSk2n6+j48P/0dyMPxOHAu/D8fC78Ox8Pu4siutiPyKAVYAAGAoyggAADCUU5cRDw8Pvfjii/Lw8DA6Ci7hd+JY+H04Fn4fjoXfR9WpFQOsAACg7nLqlREAAGA8yggAADAUZQQAABiKMgIAAAxFGQEAAIZy6jLy/vvvq0WLFvL09FTfvn21detWoyM5pZiYGPXu3Vve3t4KCAjQ2LFjlZSUZHQsXPLaa6/JZDJpypQpRkdxasePH9cf//hHNW7cWF5eXurcubO2b99udCynVFRUpBdeeEEtW7aUl5eXWrdurVdeeeWqXwaHsjltGVmwYIGeeuopvfjii9q5c6e6du2q66+/XqdOnTI6mtOJjY1VVFSUNm/erNWrV6ugoECjRo1STk6O0dGc3rZt2/TRRx+pS5cuRkdxaufOndPAgQPl5uamVatWad++fXrzzTfVqFEjo6M5pVmzZmn27Nl67733tH//fs2aNUuvv/663n33XaOj1VpOu89I37591bt3b7333nuSir+MLzQ0VBMnTtTUqVMNTufcMjMzFRAQoNjYWEVGRhodx2lduHBBPXr00AcffKBXX31V3bp109tvv210LKc0depUbdiwQevXrzc6CiTdfPPNCgwM1GeffVZy7I477pCXl5e+/PJLA5PVXk65MpKfn68dO3ZoxIgRJcfMZrNGjBihTZs2GZgMkpSVlSVJ8vPzMziJc4uKitLo0aNL/XcCYyxfvly9evXSnXfeqYCAAHXv3l2ffPKJ0bGc1oABA7RmzRolJydLknbv3q34+HjdeOONBiervWrFt/ZWtdOnT6uoqEiBgYGljgcGBurAgQMGpYJUvEI1ZcoUDRw4UJ06dTI6jtOaP3++du7cqW3bthkdBZJSUlI0e/ZsPfXUU3r22We1bds2TZo0Se7u7nrggQeMjud0pk6dKovFonbt2snFxUVFRUWaMWOG7rvvPqOj1VpOWUbguKKiopS
YmKj4+HijozittLQ0TZ48WatXr5anp6fRcaDikt6rVy/NnDlTktS9e3clJibqww8/pIwY4L///a+++uorzZ07Vx07dlRCQoKmTJmipk2b8vuoJKcsI/7+/nJxcVFGRkap4xkZGQoKCjIoFaKjo7VixQrFxcUpJCTE6DhOa8eOHTp16pR69OhRcqyoqEhxcXF67733ZLVa5eLiYmBC5xMcHKwOHTqUOta+fXt9/fXXBiVybs8884ymTp2qcePGSZI6d+6so0ePKiYmhjJSSU45M+Lu7q6ePXtqzZo1JcdsNpvWrFmj/v37G5jMOdntdkVHR2vJkiX66aef1LJlS6MjObXrrrtOe/bsUUJCQsmtV69euu+++5SQkEARMcDAgQN/93H35ORkNW/e3KBEzi03N1dmc+mXTxcXF9lsNoMS1X5OuTIiSU899ZQeeOAB9erVS3369NHbb7+tnJwcPfTQQ0ZHczpRUVGaO3euli1bJm9vb6Wnp0uSfH195eXlZXA65+Pt7f27eZ369eurcePGzPEY5Mknn9SAAQM0c+ZM3XXXXdq6das+/vhjffzxx0ZHc0pjxozRjBkzFBYWpo4dO2rXrl36xz/+oYcfftjoaLWX3Ym9++679rCwMLu7u7u9T58+9s2bNxsdySlJuuztiy++MDoaLhkyZIh98uTJRsdwat988429U6dOdg8PD3u7du3sH3/8sdGRnJbFYrFPnjzZHhYWZvf09LS3atXK/txzz9mtVqvR0Wotp91nBAAAOAannBkBAACOgzICAAAMRRkBAACGoowAAABDUUYAAIChKCMAAMBQlBEAAGAoyggAADAUZQQAABiKMgIAAAxFGQEAAIb6fyVacgzRXfDbAAAAAElFTkSuQmCC\n" }, "metadata": {} } ], "source": [ "num_epochs = 10\n", "\n", "losses = []\n", "\n", "model.train()\n", "\n", "for epoch in tqdm(range(num_epochs)):\n", "\n", " for input_seq_batch,target_seq_batch in data_loader:\n", " input_seq_batch = input_seq_batch.to(device)\n", " target_seq_batch = target_seq_batch.to(device)\n", " optimizer.zero_grad()\n", " target_seq_hat = model(input_seq_batch)\n", " loss = loss_fn(target_seq_hat,target_seq_batch.view(-1,num_chars))\n", " loss.backward()\n", " optimizer.step()\n", "\n", " losses.append(loss.item())\n", "\n", "\n", "plt.title('Loss')\n", "plt.plot(losses)" ] }, { "cell_type": "markdown", "source": [ "The temperature of a softmax function will determine the relative strength of different probabilities:\n", "- As temperature approaches 0, distribution approaches a one-hot with 1 for the max\n", "- As temperature increases, it approaches a uniform distribution\n", "\n", "Generally we want to emphasize the higher probabilities, so we choose\n", "a reasonably low temperature." ], "metadata": { "id": "uuyAwhLPXtGd" } }, { "cell_type": "code", "source": [ "\n", "def softmax_with_temperature(vec, temperature):\n", " sum_exp = sum(math.exp(x/temperature) for x in vec)\n", " return [math.exp(x/temperature)/sum_exp for x in vec]\n", "\n", "print(\"Example of softmax with temperature.\")\n", "dist = [0.1, 0.3, 0.6]\n", "print('distribution:',dist)\n", "print(softmax_with_temperature(dist,0.01))\n", "print(softmax_with_temperature(dist,0.1))\n", "print(softmax_with_temperature(dist,0.2))\n", "print(softmax_with_temperature(dist,0.3))\n", "print(softmax_with_temperature(dist,1))\n", "print(softmax_with_temperature(dist,10))" ], "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "Sg7j3CzGQDkf", "outputId": "35c12c65-d238-46c7-a812-4478cd0191fe" }, "execution_count": 67, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Example of softmax with temperature.\n", "distribution: [0.1, 0.3, 0.6]\n", "[1.9287498479637375e-22, 9.3576229688393e-14, 0.9999999999999064]\n", "[0.006377460922442302, 0.04712341652466416, 0.9464991225528936]\n", "[0.06289001324586753, 0.1709527801977903, 0.7661572065563421]\n", "[0.12132647558421489, 0.23631170657656433, 0.6423618178392208]\n", "[0.2583896517379799, 0.3155978333128144, 0.4260125149492058]\n", "[0.3255767455856355, 0.3321538321280155, 0.3422694222863489]\n" ] } ] }, { "cell_type": "markdown", "source": [ "Choose a temperature and predict the next character, given a prompt of arbitrary length." 
], "metadata": { "id": "hLlygE7WYigs" } }, { "cell_type": "code", "execution_count": 68, "metadata": { "scrolled": false, "colab": { "base_uri": "https://localhost:8080/", "height": 35 }, "id": "kLJX_vFSGUGw", "outputId": "91ba9e04-6bbf-4b80-c302-db6fb13dbf55" }, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "'n'" ], "application/vnd.google.colaboratory.intrinsic+json": { "type": "string" } }, "metadata": {}, "execution_count": 68 } ], "source": [ "temperature = 0.3\n", "\n", "def predict(model, ch):\n", "\n", " # only look at last sample_len - 1 characters\n", "\n", " ch = ch[-(sample_len - 1):]\n", "\n", " # One-hot encoding our input to fit into the model\n", " ch = np.array([char2int(c) for c in ch])\n", " ch = np.array([int2OneHot(ch, num_chars)])\n", " ch = torch.from_numpy(ch).to(device)\n", "\n", " out = model(ch)\n", "\n", " # take the probability distribution of the last character in the sequence produced by the model\n", " prob = softmax_with_temperature(out[-1],temperature)\n", "\n", " # Choosing a character based on the probability distribution, with temperature\n", " char_ind = choice(list(range(num_chars)), p=prob)\n", "\n", " return int2char(char_ind)\n", "\n", "predict(model,\"Of man's first disobedience, and the fruit o\")" ] }, { "cell_type": "markdown", "source": [ "Now take a prompt and iterate the previous prediction a specified number of times.\n", "\n", "Prompt is generally taken to be a long sequence randomly selected from the text. You can also try a sequence of words similar to those in the text, but not an exact sequence. It does not have to be the exact length of the data sequences. However, very short prompts tend not to work as well." ], "metadata": { "id": "wThVXlXhYgqt" } }, { "cell_type": "code", "execution_count": 69, "metadata": { "id": "eFcGfQ8VGUGw" }, "outputs": [], "source": [ "def sample(model, out_len, start):\n", " model.eval() # eval mode\n", " # First off, run through the starting characters\n", " chars = [ch for ch in start]\n", " size = out_len - len(chars)\n", " # Now pass in the previous characters and get a new one\n", " for ii in range(size):\n", " char = predict(model, chars)\n", " chars.append(char)\n", "\n", " return ''.join(chars)" ] }, { "cell_type": "markdown", "source": [ "Now we will run our model, but with the parameters we have chosen, and\n", "10 epochs, you can see that it is getting some idea of words and lines, but\n", "it doesn't look like an English poem!\n", "\n", "Run this for another 100 epochs, and observe that at that point,\n", "the network will have simply memorized the poem!" 
], "metadata": { "id": "LONowWPkg6bj" } }, { "cell_type": "code", "execution_count": 71, "metadata": { "scrolled": false, "colab": { "base_uri": "https://localhost:8080/" }, "id": "ra7VOpilGUGw", "outputId": "df558ad6-30d1-4840-932a-95e364358acf" }, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Of Man's first disobedience, and the fruit ros ore thall sus ingeres the store he preat hes pare,\n", "He veint of the gerat on the flaid the vinged of Heaven the wath of the gorthall in served on the gromithe firce sedpent his whale, and will his grong of Heaven on and will his from the prot the tho ders and cengen,\n", "With hese and rought force his fire,\n", "And simed tout for sowe,\n", "That whot ender st of the geall the ing hinger and ranger\n", "Th the his the farl se and rus ind and sure seall\n", "The force of the sore that for hised\n", "And with upmighto formte of the gore ther th of the seaven on the grom the farl seand sulled the the the his dering of Hian, whather th of beis and cous reaven he pering of the dere thers his fired\n", "That with of reaven on the derich and righty stored the the the flor the vence the the sing the bort of Heaven on and compinise, and his pire,\n", "And his om the Alill nised the serce on the derat of Heaven se pint,\n", "And cous ffre stound he pires\n", "And th ur poreid de and sure dain the s\n" ] } ], "source": [ "print(sample(model, 1000, \"Of Man's first disobedience, and the fruit\"))" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.10.9" }, "colab": { "provenance": [], "gpuType": "T4" }, "accelerator": "GPU" }, "nbformat": 4, "nbformat_minor": 0 }