import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, SimpleRNN, Dense
# Build a character-level vocabulary from the toy corpus.
text = "This is a sample text for language modeling using RNN."
chars = sorted(set(text))
char_to_index = {char: index for index, char in enumerate(chars)}
index_to_char = {index: char for index, char in enumerate(chars)}
text_indices = [char_to_index[char] for char in text]
# Slide a window over the text: each 20-character sequence is paired
# with the character that follows it.
seq_length, sequences, next_chars = 20, [], []
for i in range(0, len(text_indices) - seq_length):
    sequences.append(text_indices[i : i + seq_length])
    next_chars.append(text_indices[i + seq_length])
X = np.array(sequences)
y = np.array(next_chars)
# Embedding -> single recurrent layer -> softmax over the vocabulary.
model = Sequential([
    Embedding(input_dim=len(chars), output_dim=50, input_length=seq_length),
    SimpleRNN(100, return_sequences=False),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=64, epochs=50)
# Generate text one character at a time from a 19-character seed.
seed_text = "This is a sample te"
generated_text = seed_text
num_chars_to_generate = 100
for _ in range(num_chars_to_generate):
    seed_indices = [char_to_index[char] for char in seed_text]

    # Left-pad with index 0 so the input always has seq_length steps.
    if len(seed_indices) < seq_length:
        diff = seq_length - len(seed_indices)
        seed_indices = [0] * diff + seed_indices

    seed_indices = np.array(seed_indices).reshape(1, -1)
    # Greedy decoding: always take the most probable next character.
    next_index = model.predict(seed_indices).argmax()
    next_char = index_to_char[next_index]
    generated_text += next_char
    seed_text = seed_text[1:] + next_char
print(generated_text)
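
Note that the loop decodes greedily with argmax, which is why the sample in the stdout below collapses into repeated fragments. A minimal sketch of an alternative, sampling the next character from the softmax distribution with a temperature (reusing np, model, and the lookup tables defined above; sample_next_index is a hypothetical helper, not part of the original script):

def sample_next_index(probs, temperature=0.8):
    # Rescale the predicted distribution; temperatures below 1.0
    # sharpen it toward greedy argmax, above 1.0 flatten it.
    logits = np.log(probs + 1e-8) / temperature
    exp = np.exp(logits - logits.max())
    return np.random.choice(len(probs), p=exp / exp.sum())

Inside the loop, next_index = model.predict(seed_indices).argmax() would then become next_index = sample_next_index(model.predict(seed_indices)[0]).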
Success #stdin #stdout #stderr 3.02s 240336KB
stdin
Standard input is empty
stdout
Epoch 1/50
34/34 [==============================] - 0s 12ms/sample - loss: 3.1165
Epoch 2/50
34/34 [==============================] - 0s 293us/sample - loss: 3.0272
Epoch 3/50
34/34 [==============================] - 0s 282us/sample - loss: 2.9392
Epoch 4/50
34/34 [==============================] - 0s 279us/sample - loss: 2.8495
Epoch 5/50
34/34 [==============================] - 0s 282us/sample - loss: 2.7589
Epoch 6/50
34/34 [==============================] - 0s 281us/sample - loss: 2.6720
Epoch 7/50
34/34 [==============================] - 0s 306us/sample - loss: 2.5855
Epoch 8/50
34/34 [==============================] - 0s 300us/sample - loss: 2.4892
Epoch 9/50
34/34 [==============================] - 0s 296us/sample - loss: 2.3828
Epoch 10/50
34/34 [==============================] - 0s 311us/sample - loss: 2.2736
Epoch 11/50
34/34 [==============================] - 0s 286us/sample - loss: 2.1687
Epoch 12/50
34/34 [==============================] - 0s 286us/sample - loss: 2.0686
Epoch 13/50
34/34 [==============================] - 0s 274us/sample - loss: 1.9693
Epoch 14/50
34/34 [==============================] - 0s 273us/sample - loss: 1.8707
Epoch 15/50
34/34 [==============================] - 0s 264us/sample - loss: 1.7767
Epoch 16/50
34/34 [==============================] - 0s 277us/sample - loss: 1.6852
Epoch 17/50
34/34 [==============================] - 0s 301us/sample - loss: 1.5922
Epoch 18/50
34/34 [==============================] - 0s 317us/sample - loss: 1.5046
Epoch 19/50
34/34 [==============================] - 0s 312us/sample - loss: 1.4229
Epoch 20/50
34/34 [==============================] - 0s 285us/sample - loss: 1.3423
Epoch 21/50
34/34 [==============================] - 0s 281us/sample - loss: 1.2652
Epoch 22/50
34/34 [==============================] - 0s 276us/sample - loss: 1.1911
Epoch 23/50
34/34 [==============================] - 0s 274us/sample - loss: 1.1183
Epoch 24/50
34/34 [==============================] - 0s 275us/sample - loss: 1.0486
Epoch 25/50
34/34 [==============================] - 0s 270us/sample - loss: 0.9834
Epoch 26/50
34/34 [==============================] - 0s 265us/sample - loss: 0.9212
Epoch 27/50
34/34 [==============================] - 0s 268us/sample - loss: 0.8605
Epoch 28/50
34/34 [==============================] - 0s 273us/sample - loss: 0.8021
Epoch 29/50
34/34 [==============================] - 0s 274us/sample - loss: 0.7469
Epoch 30/50
34/34 [==============================] - 0s 304us/sample - loss: 0.6949
Epoch 31/50
34/34 [==============================] - 0s 307us/sample - loss: 0.6455
Epoch 32/50
34/34 [==============================] - 0s 304us/sample - loss: 0.5991
Epoch 33/50
34/34 [==============================] - 0s 303us/sample - loss: 0.5557
Epoch 34/50
34/34 [==============================] - 0s 281us/sample - loss: 0.5154
Epoch 35/50
34/34 [==============================] - 0s 276us/sample - loss: 0.4778
Epoch 36/50
34/34 [==============================] - 0s 274us/sample - loss: 0.4424
Epoch 37/50
34/34 [==============================] - 0s 274us/sample - loss: 0.4090
Epoch 38/50
34/34 [==============================] - 0s 268us/sample - loss: 0.3780
Epoch 39/50
34/34 [==============================] - 0s 275us/sample - loss: 0.3494
Epoch 40/50
34/34 [==============================] - 0s 273us/sample - loss: 0.3228
Epoch 41/50
34/34 [==============================] - 0s 295us/sample - loss: 0.2985
Epoch 42/50
34/34 [==============================] - 0s 297us/sample - loss: 0.2764
Epoch 43/50
34/34 [==============================] - 0s 301us/sample - loss: 0.2558
Epoch 44/50
34/34 [==============================] - 0s 316us/sample - loss: 0.2367
Epoch 45/50
34/34 [==============================] - 0s 303us/sample - loss: 0.2194
Epoch 46/50
34/34 [==============================] - 0s 288us/sample - loss: 0.2034
Epoch 47/50
34/34 [==============================] - 0s 2ms/sample - loss: 0.1888
Epoch 48/50
34/34 [==============================] - 0s 297us/sample - loss: 0.1755
Epoch 49/50
34/34 [==============================] - 0s 303us/sample - loss: 0.1633
Epoch 50/50
34/34 [==============================] - 0s 285us/sample - loss: 0.1522
This is a sample tettrmorgaangaagenmodNNing gsingaRNN.fodNNingtNsingadNN.nodNsingoNNin adNNinodNNingtdNinRadNNinodNsinm
stderr
WARNING:tensorflow:From /usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/resource_variable_ops.py:435: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
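
The stderr warning comes from running under Python 2.7 with TensorFlow 1.x. On a current TensorFlow 2.x install the same script should run with one likely change: recent Keras versions drop the input_length argument to Embedding, so the model definition would become (a sketch, not verified against every Keras release):

model = Sequential([
    Embedding(input_dim=len(chars), output_dim=50),
    SimpleRNN(100, return_sequences=False),
    Dense(len(chars), activation="softmax"),
])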