
alan ai

Born: 02/18/16

A smart chat bot and AI assistant written in Python. The goal is to create a solid language processing unit at the core and allow users to hack Alan by adding "actions" for Alan to complete based on verbs.

actions

Alan has actions that are automatically invoked by imperative sentences. If you tell Alan to "show me Jimi Hendrix", he will open a picture of Jimi Hendrix in a web browser; "show" is the verb in this imperative. Another example is "send an email", where "send" is the verb.

You can easily add an action in actions/actions.py. All you have to do is write a self-contained Python function that does something and put an entry into the "actions dictionary"; the "pick action" dispatcher will call it if Alan hears the verb. The entry needs the verb as the key and the function as the value. At the end of your function, return a string as Alan's response and he will speak it out loud. In the code below he would say "Email sent." after sending an email.

For example, if this is the original actions.py:


def send_email(sentence):
  # Logic to send an email
  return "Email sent."

actions_dictionary = {
  "send": send_email
}

You can add your own action by adding your function and updating the actions dictionary. The new file would look like so:


def send_email(sentence):
  # Logic to send an email
  return "Email sent."

def new_action_think(sentence):
  import some_stuff
  # New amazing logic
  return "I just figured out the meaning of life"

actions_dictionary = {
  "send": send_email,
  "think": new_action_think,
}
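
For reference, the "pick action" dispatcher described above can be pictured as a simple dictionary lookup, as in this sketch (pick_action and its signature are assumptions; only the actions dictionary convention comes from this page):

def pick_action(sentence, verb):
  # Look up the verb in the actions dictionary and call the
  # matching function, passing the full sentence along so the
  # action can pull out the details it needs.
  action = actions_dictionary.get(verb)
  if action is None:
    return "I don't know how to " + verb
  return action(sentence)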

environment

The environment.system script is responsible for spawning new subprocesses and adding them into a list called services. Each plugin gets its own subprocess, for example.
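
Concretely, that pattern might look like the following sketch (the list name services comes from this page; start_service and the exact Popen arguments are assumptions):

import subprocess

services = []

def start_service(command):
  # Spawn the program as its own subprocess with piped stdin and
  # stdout so alan can exchange messages with it, and track it in
  # the services list.
  proc = subprocess.Popen(command, stdin=subprocess.PIPE,
                          stdout=subprocess.PIPE)
  services.append(proc)
  return proc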

language

NLTK is the basis of all the language parsing done by Alan. Alan breaks every sentence down into one of three types: interrogative, declarative, or imperative. Imperatives are passed off to actions, interrogatives start the wikipedia parser, and declarative sentences will eventually be tied to some small talk module. At this time NLTK is not used for much beyond POS tagging, and it will need to be used for more complex features.
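
As a rough illustration of that routing, a POS-tag-based classifier might look like the sketch below (a simplification; the heuristics in Alan's source are likely more involved):

import nltk  # assumes the punkt and tagger data have been downloaded

def classify_sentence(sentence):
  # Tag the words, then use simple heuristics on the first tag
  # to decide which module should handle the sentence.
  tagged = nltk.pos_tag(nltk.word_tokenize(sentence))
  first_tag = tagged[0][1]
  if sentence.endswith("?") or first_tag in ("WP", "WRB"):
    return "interrogative"  # handed to the wikipedia parser
  if first_tag == "VB":     # leading base-form verb, e.g. "show"
    return "imperative"     # dispatched to actions
  return "declarative"      # eventually handled by small talk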

learning

Alan can learn how to do new tasks and save those tasks to be run at a later time. To teach Alan a task, say "learn how to" or "learn how to respond to" followed by the name of the task or key phrase. The learning module parses English speech and turns it into Python code. The following is an excerpt of teaching Alan how to "wake me up".

>>> learn how to wake me up

How do I wake me up

>>> give me the time and say wake up and give me the time and say wake up and say here is the weather and find the weather in san jose california and play me some music

Here is what I have:  give me the time and say wake up and give me the time and say wake up and  say here is the weather and find the weather in san jose california and play me some music
Is this correct?

>>> yes
Is that all?

>>> yes

I'll try to do that now.
import alan
alan.speak(alan.think("give me the time"))
alan.speak(" wake up")
alan.speak(alan.think("give me the time"))
alan.speak(" wake up")
alan.speak(" here is the weather")
alan.speak(alan.think("find the weather in san jose california"))
alan.speak(alan.think("play me some music"))

The time is 07:29 PM
 wake up
The time is 07:29 PM
 wake up
 here is the weather
temperature | 79 °F  (wind chill: 79 °F)
conditions | overcast
relative humidity | 21%  (dew point: 36 °F)
wind speed | 8.1 mph
(41 minutes ago)between 59 °F and 83 °F
clear (all day)
Playing music from pandora

Should I remember how to do this?

>>> yes
Learned to wake me up

As seen above, Alan takes the instructions and turns them into Python code, routing them through alan.speak or alan.think where applicable. Alan then executes the generated code and, if it runs successfully, asks the user whether he should remember it. If the user answers yes, then whenever the user says "wake me up" in the future the generated code will be executed.
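
The translation step can be pictured as a small code generator along these lines (a sketch only; generate_code is a hypothetical name, and Alan's real module clearly handles more cases than splitting on "and"):

def generate_code(instructions):
  # Split the instruction string on "and" and turn each piece
  # into a line of Python: "say X" becomes a direct alan.speak
  # call, anything else is routed back through alan.think.
  lines = ["import alan"]
  for step in instructions.split(" and "):
    step = step.strip()
    if step.startswith("say "):
      lines.append('alan.speak("%s")' % step[len("say "):])
    else:
      lines.append('alan.speak(alan.think("%s"))' % step)
  return "\n".join(lines)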

memory

Alan's short- and long-term memory are managed here. The various scripts here are for remembering and forgetting things. Alan's long-term memory is stored in a sqlite database called memories.sqlite. Short-term memory is stored in system memory and is not persisted to disk.
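
The long-term side of this is roughly ordinary sqlite3 access, as in the sketch below (the memories table schema and the remember/recall names are assumptions; only the memories.sqlite filename comes from this page):

import sqlite3

def remember(key, value):
  # Persist a fact to Alan's long-term memory on disk.
  conn = sqlite3.connect("memories.sqlite")
  conn.execute("CREATE TABLE IF NOT EXISTS memories (key TEXT, value TEXT)")
  conn.execute("INSERT INTO memories VALUES (?, ?)", (key, value))
  conn.commit()
  conn.close()

def recall(key):
  # Fetch a previously stored fact, or None if Alan has forgotten.
  conn = sqlite3.connect("memories.sqlite")
  row = conn.execute("SELECT value FROM memories WHERE key = ?",
                     (key,)).fetchone()
  conn.close()
  return row[0] if row else None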

plugins

Plugins are third-party applications that can be started and managed by Alan. Plugins can be written in a number of programming languages and communicate with Alan through stdin and stdout. Through stdout, a plugin can tell Alan to speak by prefixing a line with ":speak:", or request input from Alan by printing ":listen:". More commands will be added as Alan is developed.

The following are a few examples of plugins:

Filepath: alan/plugins/echo/echo.sh

#!/usr/bin/env bash
echo ":speak:This is an dummy example of a bash plugin"
echo ":speak:This is a example of both speak and listen. Type something and I will echo it back"
echo ":listen:"
read alan
echo ":speak: you wrote $alan"

Filepath: alan/plugins/fibonacci/fibonacci.cpp

#include <iostream>

using namespace std;

int main()
{
   int n, c, first = 0, second = 1, next;

   cout << ":speak:Enter the number of terms of Fibonacci series you want" << endl;
   cout << ":listen:" << endl;
   cin >> n;

   cout << ":speak:First " << n << " terms of Fibonacci series are :- " << endl;

   for ( c = 0 ; c < n ; c++ )
   {
      if ( c <= 1 )
         next = c;
      else
      {
         next = first + second;
         first = second;
         second = next;
      }
      cout << ":speak:" << next << endl;
   }

   return 0;
}

Filepath: alan/plugins/time/time.rb

#!/usr/bin/ruby -w

puts ":speak:The current time is " + Time.now.strftime("%I:%M %p")

Alan calls the plugin code from the OS and communicates through pipes, so, as stated before and shown in these examples, plugins can be written in a variety of languages.
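
On Alan's side, supporting that protocol amounts to reading each line a plugin writes to stdout and dispatching on its prefix, roughly like this sketch (run_plugin and the speak/listen callbacks are hypothetical names; Alan's actual plugin loop may differ):

import subprocess

def run_plugin(command, speak, listen):
  # Launch the plugin and bridge its :speak:/:listen: lines to
  # Alan's voice output and microphone input.
  proc = subprocess.Popen(command, stdin=subprocess.PIPE,
                          stdout=subprocess.PIPE,
                          universal_newlines=True)
  for line in proc.stdout:
    line = line.rstrip("\n")
    if line.startswith(":speak:"):
      speak(line[len(":speak:"):])
    elif line.startswith(":listen:"):
      # The plugin asked for input: get it from Alan and pipe
      # it back on the plugin's stdin.
      proc.stdin.write(listen() + "\n")
      proc.stdin.flush()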

relationships

In the process of being built. Alan can recognize faces using facial recognition and will eventually use those faces to allow multiple people to use one Alan. For example, my shopping list might be different from my girlfriend's, so we don't want Alan to mix those up. Relationship information will be stored in a separate database from memories. This is done so users can easily share memories.sqlite database files with each other without handing over personal information. You may want to swap in a memories database that has been trained to do many tasks while keeping your Alan's data about you intact.

senses

Connection with the outside world. The SpeechRecognition module is used for the ears, and the eyes will eventually be implemented with OpenCV. The goal is to tie senses into relationships so that some kind of recognition can take place, where Alan recognizes users and can manage relationships automatically.
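
The ears half of that follows the standard SpeechRecognition usage pattern, roughly as below (a sketch; it uses the library's Google recognizer and assumes PyAudio is installed for microphone access):

import speech_recognition as sr

def hear():
  # Listen on the default microphone and return the recognized text.
  recognizer = sr.Recognizer()
  with sr.Microphone() as source:
    audio = recognizer.listen(source)
  return recognizer.recognize_google(audio)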
