How Jarvis chooses words

Jarvis writes by predicting which word should come next in a given sequence.

For example, if you asked Jarvis to complete the following sentence:

I needed to pick up some milk so I walked to the nearby ___________

Jarvis would make a list of the words most likely to come next and assign each one a percent chance of being chosen. His shortlist might look like this:

  • convenience - 21% chance
  • store - 15% chance
  • supermarket - 8% chance
  • grocery - 7% chance

Depending on how we have trained Jarvis, he will select one of those words. The higher the chance assigned to a word, the more likely he is to choose it.
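
To make that concrete, here is a minimal sketch in Python of how a weighted pick like this could work. The words and percentages are just the example numbers above, and random.choices is only standing in for whatever Jarvis actually does internally:

    import random

    # The example shortlist above: each word paired with its percent chance.
    shortlist = ["convenience", "store", "supermarket", "grocery"]
    chances = [21, 15, 8, 7]  # relative weights, so they don't need to add up to 100

    # Pick one word; a higher chance makes that word more likely to be chosen.
    next_word = random.choices(shortlist, weights=chances, k=1)[0]
    print(next_word)  # usually "convenience", but not every time

Because the pick is weighted rather than fixed, running this sketch a few times will give different words.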

In this example, let's say Jarvis chooses the word "convenience" as the next word. Now our sentence looks like this:

I needed to pick up some milk so I walked to the nearby convenience ___________

Now, Jarvis would repeat the process to decide which word or punctuation mark should come next. Here, his shortlist might look like this:

  • store - 95%
  • shop - 4%
  • . - 1% (this is a period)

He's almost certainly going to write the word "store", but there is a very small chance he ends the sentence with a period and moves on to the next sentence.

Jarvis uses past context to predict what comes next, one word at a time. What you give Jarvis as input greatly affects what he outputs: it shapes both the shortlist of words he draws from and the probability each one has of being chosen.
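
Putting the pieces together, the whole process can be sketched as a simple loop: predict a shortlist, pick a word, add it to the context, and repeat. The predict_shortlist function below is hypothetical, a stand-in for Jarvis's real prediction step; here it just replays the example shortlists from this article:

    import random

    def predict_shortlist(context):
        # Hypothetical stand-in for Jarvis's real prediction step: given the text
        # so far, return candidate next words and their percent chances. Here it
        # simply replays the example shortlists from this article.
        if context.endswith("nearby"):
            return ["convenience", "store", "supermarket", "grocery"], [21, 15, 8, 7]
        if context.endswith("convenience"):
            return ["store", "shop", "."], [95, 4, 1]
        return ["."], [100]

    def write(context, max_words=20):
        # Repeat the predict-and-pick step one word at a time, feeding each chosen
        # word back in as part of the context for the next prediction.
        for _ in range(max_words):
            words, chances = predict_shortlist(context)
            next_word = random.choices(words, weights=chances, k=1)[0]
            if next_word == ".":
                return context + "."  # end the sentence
            context += " " + next_word
        return context

    print(write("I needed to pick up some milk so I walked to the nearby"))

Changing the starting context changes what predict_shortlist returns, which is the toy version of the point above: what you feed in shapes both the shortlist and the odds.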

Writing factually true content

While Jarvis has read much of the internet, the way he is trained to write puts more emphasis on creativity than on factual accuracy. Because of that, you'll notice Jarvis regularly writes things that aren't true.

Think of Jarvis as a writing assistant, not a fact-checker. You'll still need to review Jarvis-produced content and correct anything that isn't accurate.

Next up
Patterns & repeating content