Introducing Evie

Evie (full name Evelyn Bleurgh) is a new addition to the Toolkit, and will be available to users and developers in the coming weeks. She is a Markov Chain generator comprising two separate functions: a 'Read' function, and a 'Chain' function, which is how she writes. First of all, let's give a summary of how Evie works by looking at her two parts together.

When Evie reads, she creates lists of words that follow other words. She looks up the word behind the one she's currently reading in her own dictionary, and adds a point to the score of the word she's currently reading. She can repeat this process with the word preceding that, and so on, to gain a better understanding of context.
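The Read step can be sketched in a few lines of Python (Depth 1 only). The function name and the dictionary-of-scores layout are my own assumptions for illustration; Evie herself is a Construct 2 project:

```python
from collections import defaultdict

def read(text, chain=None):
    # Tally, for each word, how often every other word follows it.
    # chain[prev][cur] is cur's "score" after prev.
    if chain is None:
        chain = defaultdict(lambda: defaultdict(int))
    words = text.split(" ")  # a 'word' is anything delimited by spaces
    for prev, cur in zip(words, words[1:]):
        chain[prev][cur] += 1  # add a point to cur's score after prev
    return chain
```

Calling `read` repeatedly with more text keeps adding points to the same dictionary, which is how a larger body of text sharpens the scores.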

When she writes, she simply starts by picking a random word from her dictionary. She then uses the scores she's totted up in the Read stage to judge what the most likely next word is. As in the Read stage, she may use many words prior to the most recently written word to help her choose the next one.
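A minimal sketch of the Chain (write) step, assuming the same dictionary-of-scores layout. Whether Evie always takes the single highest-scoring word or samples weighted by score is my guess; the varied sonnets below suggest some randomness, so this sketch samples:

```python
import random

def write(chain, length=10):
    # Start from a random word in the dictionary, then repeatedly
    # pick a successor weighted by its score.
    word = random.choice(list(chain))
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break  # dead end: no word has ever followed this one
        word = random.choices(list(followers),
                              weights=list(followers.values()))[0]
        out.append(word)
    return " ".join(out)
```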

Key Concepts

Body of Text

Although I have mostly trained Evie on English novels and poetry, she would work with any language or format, provided it is given as text and observes Evie's definition of a word. Given a text file representing a musical score, with the notes formatted as 'words', she could write a symphony. Her ability to write depends entirely on the body of text she has been given to learn from.


Words

Evie's definition of a word is different from yours and mine. In Evie's eyes, a word is any string of characters with a space character at either end. Therefore, all of these are distinct words to Evie:

  •  There
  •  there
  •  THERE
  •  There,
  •  there.
  •  "There,
  •  there's

And so on. She has no understanding of the meaning of the word 'there', nor of its place in a sentence. All she has is a knowledge of how likely any of these individual words are to follow any other given word in the bodies of text she has read. Her use of punctuation, too, is dictated entirely by the punctuation of the authors she has read.
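In Python terms, the definition above amounts to nothing more than a split on spaces (my paraphrase, not Evie's actual code):

```python
# Splitting on spaces alone means case and attached punctuation
# produce distinct words, exactly as in the list above.
text = 'There, there. "There, THERE'
words = text.split(" ")
print(words)  # ['There,', 'there.', '"There,', 'THERE']
# Four distinct words to Evie, though a human reads one word four ways.
```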


Depth

'Depth' is how I describe Evie's level of understanding of any given text, or the inference from context she needs as she writes. When she reads, she reads at a given 'Depth', which is to say: for every word she reads, she adds a point to that word's score under the dictionary entry of the word preceding it by a distance of [Depth].

So, at Depth 1, Evie looks at the word directly behind the current word. At Depth 2, she looks at the word before that. At Depth 10, she looks up the word 10 words behind the one she is reading. Adding Depth gives Evie a better sense of the appropriate context for a given word.
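A sketch of reading at an arbitrary Depth, assuming one frequency table per depth (the function name and layout are mine, for illustration):

```python
from collections import defaultdict

def read_at_depth(words, depth):
    # For every word, add a point to its score under the entry of
    # the word `depth` positions behind it. Depth 1 is the word
    # directly behind; Depth 10 is ten words back.
    chain = defaultdict(lambda: defaultdict(int))
    for i in range(depth, len(words)):
        chain[words[i - depth]][words[i]] += 1
    return chain
```

Reading once at each depth from 1 up to some maximum, and consulting all of those tables when writing, is one plausible way to get the context-sensitivity described above.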

Moving Forward

Now that Evie is functioning smoothly, I am working on improving her reliability and speed before releasing her to users and developers working with the Toolkit. I'm also working on better ways of saving her body of knowledge in its entirety, so that developers can include a pre-trained AI in their projects, and hobbyists can maintain an AI through the sandbox.

In the meantime, Evie has a few sonnets she'd like to share with you:

 So thou this the day, 
 No love that love to every where. 
 Some in me thy worth the gentle day, 
 Now all of time of fair in it in you in thee my sight, 
 By that a death my heart's and that you of her heart and thou thy will, 
 Even in it thy worst to some other to your true in it is thy dear love in you with all thy precious time of such a tomb 
 Mine eye my nature to fair that thy power 
 No love that I love with
 That to a far I not in me that thy good report. 
 Than in his own sweet to thee to thee to me, 
 By that a thing to love and the praise of thee thy heart, 
 Yet in his thoughts of your self the spring, 
 Why of me I with thee and thou not thy dear love I have I love,
 And with me that your beauty of his own sweet to have thy poor as thy worth 
 Than in his fair that thy gentle day, 
 Who all thy lovely and that is thee

Comments

that's awesome how much training data did evie have to read before it could produce that sonnet? also was that written using construct 2?

Yes, Evie was written in Construct 2 - and her entire source will be available as part of the Toolkit very soon! To produce those sonnets, Evie read Shakespeare's sonnets (I believe there are around 140 of them?) about ten times - which means for each word she has a record of the frequency of all words preceding it up to the 10th precedent.

whats the status of evie update?

Apologies for delays! My job is seasonal and things got ahead of me - we're out of season now, and the Evie update goes live tonight!

awesome thanks for the update!