Natural Language demos for your exploration. Enjoy!

This is a multi-turn bot with a lookback history of around 400 words.

This demo is based on a GPT-2 model trained on 147 million Reddit discussions, hosted on HuggingFace.co. The responses are not an attempt at factual accuracy, but they hold up fairly well in a Turing-test sense.

Type reset to clear the 400-word lookback history.
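The lookback buffer described above can be sketched in a few lines of plain Python. This is an illustrative sketch, not the demo's actual implementation; the class and method names are hypothetical:

```python
class LookbackHistory:
    """Hypothetical sketch of a word-budgeted conversation history.

    Keeps the most recent turns, dropping the oldest ones once the
    total word count exceeds the budget (about 400 words in the demo).
    """

    def __init__(self, max_words=400):
        self.max_words = max_words
        self.turns = []  # utterance strings, oldest first

    def add(self, utterance):
        # Typing "reset" wipes the whole history, as the demo describes.
        if utterance.strip().lower() == "reset":
            self.turns.clear()
            return
        self.turns.append(utterance)
        # Drop oldest turns until the total word count fits the budget.
        while sum(len(t.split()) for t in self.turns) > self.max_words:
            self.turns.pop(0)

    def context(self):
        """The concatenated history a model would receive as context."""
        return " ".join(self.turns)
```

Each user and bot turn would be appended with `add`, and `context()` would be prepended to the next prompt so the model can refer back to earlier turns.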


Conversational AI

  • >> You: What’s up?
    Bot: Not much, what about you?
  • >> You: I am curious about AI.
    Bot: I’m curious about you.
  • >> You: What would you like to know?
    Bot: What is your favorite color?
  • >> You: I like green.
    Bot: I like green too.
  • >> You: How many people do you think like green?
    Bot: I think it’s a lot of people.
  • >> You: Can you color?
    Bot: I can’t, but I can do some shading.
Question Answering

  • Question: Who won the men’s NCAA Basketball Championship?
    Answer: Baylor
  • Question: Who did Baylor defeat?
    Answer: Gonzaga
  • Question: Where is Baylor?
    Answer: Waco, Texas
  • Question: What was the score?
    Answer: 86 – 70
  • Question: Who is the Baylor coach?
    Answer: Scott Drew
  • Question: Who is the Gonzaga coach?
    Answer: Mark Few
  • Question: Where was the game played?
    Answer: Lucas Oil Stadium in Indianapolis
  • Question: What is the Baylor team name?
    Answer: Bears
  • Question: What is the Gonzaga team name?
    Answer: Bulldogs

A Lengram – Try it out!

A physical engram is a collection of neurons and synapses that encodes knowledge in physical form. A logical engram (lengram) encodes knowledge as text, code, or a combination of the two. The lengram's natural-language functionality is based on the Google BERT language model: BERT takes a question and a body of reference text, and returns an answer drawn from the reference text along with a confidence score.

"Baylor Defeats Gonzaga To Win Its 1st Men's NCAA Basketball Championship"