Ahmad Rana's Engineering Log

Projects: G.E.I.S.H.A. 

 

 

 

 


Inventors: Ahmad, Ken and Sal

How would you like an assistant who wakes you up in the morning by singing you a song, gives you a hot-lather shave, makes breakfast for you, irons your clothes, and even scrubs your back while you take a bath? We proudly bring you G.E.I.S.H.A. (Guided Essential Intelligence Subattractive Human Assistant). OK, it can't do all of the above, but it tries really hard to pretend that it can. 

Scope:

It has been more than 40 years since the first interactive program for human language processing was written in 1964 [1]. Since then, a lot of progress has been made in Artificial Intelligence, and much higher computational power has become available. As a result, today's robots look and act closer to actual human behavior than ever before. New interactive toys are coming out that are replacing pets or even personal companions [2].

The scope of this project is to build an interactive robot that provides useful utilities, such as an alarm clock, and amusement features, like interactive conversation and a music/video player. It also acts as an interface to the latest social networking services, such as Twitter, Facebook and Myspace: it captures pictures on the mounted webcam and posts them to the network, along with your location and what you're doing, so that your friends are always up to date with your activities. The robot will be able to balance itself on two wheels, based on the same principle as a Segway PT (personal transporter) [3], and follow voice commands.

A 3D rendering of the proposed robot is given in the following figure. The robot is also an ideal platform for serious future research, exploring complex control algorithms using fuzzy logic along with motion- and path-planning techniques. But that functionality is secondary; the primary focus is to make an interactive toy.

Specifications:

  1. Autonomous operation through an onboard controller and battery power for mobility.

  2. Voice recognition and following of simple audio commands.

  3. Realistic facial animation on an LCD screen, accompanying the robot's interactive voice responses.

  4. Location awareness through onboard GPS; gives you a map if you need directions.

  5. Links to Facebook/Twitter over WiFi and uploads pictures taken with the webcam.

  6. Self-balancing control algorithms (the classical control problem of stabilizing an inverted pendulum) to balance the robot on two wheels using feedback from a 3-axis gyroscope.

  7. Reconfigurable personality, with the option to change the robot's audio responses through software configuration.

  8. Utilities such as a configurable audio alarm clock and a multimedia player.
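The self-balancing item above boils down to a feedback loop on the measured tilt. As a minimal sketch (not our actual control law; the gain names and values are illustrative), a PD controller on tilt angle and tilt rate could look like this:

```c
#include <assert.h>

/* Illustrative PD balance controller for an inverted pendulum on wheels:
 * drives the wheels forward when the body tips forward, pushing the base
 * back under the center of mass. Gains are placeholders, not tuned values. */
typedef struct {
    double kp;   /* proportional gain on tilt angle (rad)   */
    double kd;   /* derivative gain on tilt rate (rad/s)    */
} BalanceGains;

/* Returns a motor command in [-1, 1]; positive drives the wheels forward. */
double balance_command(BalanceGains g, double tilt_rad, double tilt_rate)
{
    double u = g.kp * tilt_rad + g.kd * tilt_rate;
    if (u > 1.0)  u = 1.0;   /* saturate to the speed controller's range */
    if (u < -1.0) u = -1.0;
    return u;
}
```

In practice the gains have to be tuned on the real robot, and the tilt estimate itself comes from fusing the gyroscope and accelerometer.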

 

Implementation:

The figure above gives the block diagram of G.E.I.S.H.A. Almost all the components in the diagram come from previous projects: the RC servos and ESCs (Electronic Speed Controllers), the 5-volt switching regulator, the 6-axis IMU (3-axis gyroscope + 3-axis accelerometer) with a u-blox GPS, the voice recognition module (bought for a project but never used), the speaker and amplifier, the wireless bridge, the webcam, and the rechargeable LiPo battery. Some components have been donated or borrowed: the VGA controller and 6.4" LCD are on loan from my brother, the VDX-6318 will be provided by Microsoft, and the platform/body frame of the robot was offered by a friend. The facial animation technology developed for the last EmbeddedSpark Contest project will be recycled and ported to Windows Compact 7. The control algorithms and software infrastructure developed for a previous Embedded Spark contest entry, an autonomous aerial vehicle, will also be reused to self-balance the robot on two wheels.
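A standard way to get a usable tilt angle out of the 6-axis IMU mentioned above is a complementary filter: the gyro integrates quickly but drifts, while the accelerometer gives an absolute but noisy gravity angle. As a hedged sketch (the weight and sample rate here are typical choices, not our measured values):

```c
#include <assert.h>
#include <math.h>

/* Complementary filter: blend the integrated gyro rate (fast, drifting)
 * with the accelerometer-derived tilt (slow, absolute). ALPHA and DT are
 * illustrative: a 98/2 blend at a 100 Hz loop rate. */
#define ALPHA 0.98   /* weight on the integrated gyro estimate */
#define DT    0.01   /* sample period in seconds              */

double fuse_tilt(double prev_tilt, double gyro_rate, double accel_tilt)
{
    return ALPHA * (prev_tilt + gyro_rate * DT) + (1.0 - ALPHA) * accel_tilt;
}
```

Called once per control cycle, the estimate tracks fast motion through the gyro term while the small accelerometer term slowly pulls out the drift.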

References:

[1] ELIZA: http://www.stanford.edu/group/SHR/4-2/text/dialogues.html

[2] Japanese Robot: http://www.kxly.com/news/25389807/detail.html

[3] Segway PT: http://www.segway.com/

Implementation:

 

Originally, Ken had promised that he would make the body of the robot in his workshop, but it later turned out that his workshop consisted of one screwdriver, one drill (without a drill bit), and a light bulb. I was quite happy with the body as it was, but Ken and Sal insisted that we needed a better body for the robot; otherwise, it would send the wrong signal. So we decided to build a CNC router. We built the stepper motor drivers (3× NEMA 23 at 5 volts) using LMD18200 full H-bridge drivers. The controller was developed using Texas Instruments MSP430G2231 microcontrollers, and the G-code interpreter was implemented on an Arduino. The body was made from MDF, and the rest of the components were purchased from Home Depot. We already had some components on hand, but the total cost to us was less than $100!
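The heart of a G-code interpreter like the one above is pulling the numeric value off each letter-keyed word in a line such as "G1 X10.5 Y-3.0". This is only an illustrative sketch of that scanning step (not the interpreter we actually ran on the Arduino):

```c
#include <assert.h>
#include <ctype.h>
#include <stdlib.h>

/* Minimal G-code word scanner: find the value attached to a given letter
 * in a line like "G1 X10.5 Y-3.0". Returns 1 on success, 0 if the letter
 * is absent. Illustrative only; a real interpreter also handles comments,
 * checksums, and modal state. */
int gcode_value(const char *line, char letter, double *out)
{
    for (const char *p = line; *p; p++) {
        if (toupper((unsigned char)*p) == letter) {
            *out = strtod(p + 1, NULL);  /* parse the number after the letter */
            return 1;
        }
    }
    return 0;
}
```

The interpreter then dispatches on the G/M word and feeds the X/Y/Z values to the stepper drivers as step counts.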

The CNC router turned out to work quite well, but before we'd had a chance to use it, we got something better: a local university was selling its old ER1 robots (from Evolution Robotics). These robots had a really nice software development kit when they came out. Unfortunately, it was made for Windows 98 and later ported to Windows XP; it won't run on a WinCE platform. So we decided to interface the VDX-6318 SBC directly to the robot's motion controller. Upon opening up the controller, we found that it uses two MC3410s (one for each axis), some H-bridge ICs for driving the stepper motor windings, an FTDI chip for its USB interface, and some decoding circuitry that routes data to one of the MC3410 stepper motor controllers based on an address encoded within the command.

 

The documentation for the MC3410 is available for free on request from Performance Motion Devices, Inc. Unfortunately, they don't give you sample device drivers, as those are sold as part of the SDK. So we had to reverse-engineer the communication protocol ourselves, and after struggling for a while, we figured it out.
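We can't reproduce PMD's protocol here, but the general shape of an address-routed command, like the decoding circuitry described above expects, can be illustrated with a made-up frame. To be clear: this framing, its field layout, and its checksum are entirely hypothetical and are not the MC3410's actual wire format.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical command frame: [address][opcode][data hi][data lo][checksum].
 * NOT the MC3410's real format; it only illustrates routing a command to
 * one of several axis controllers by a leading address byte, guarded by a
 * simple mod-256 checksum. */
size_t build_frame(uint8_t addr, uint8_t opcode, uint16_t data, uint8_t out[5])
{
    out[0] = addr;                        /* selects which axis controller */
    out[1] = opcode;
    out[2] = (uint8_t)(data >> 8);        /* 16-bit payload, big-endian    */
    out[3] = (uint8_t)(data & 0xFF);
    out[4] = (uint8_t)(out[0] + out[1] + out[2] + out[3]);
    return 5;                             /* bytes to write to the FTDI port */
}
```

The receiving side checks the sum, strips the address, and hands the rest to the matching axis controller.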

Sal was able to write a simple WinCE application that sends move forward, stop, move backward, turn left, and turn right commands.
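On a two-wheeled differential drive like the ER1, those five commands reduce to a pair of signed wheel velocities. A minimal sketch of that mapping (the names and units here are ours for illustration, not part of any PMD or ER1 API):

```c
#include <assert.h>

/* Map the WinCE app's five commands onto left/right wheel velocities for
 * a differential-drive base. Speeds are in arbitrary controller units. */
typedef enum { CMD_STOP, CMD_FORWARD, CMD_BACKWARD, CMD_LEFT, CMD_RIGHT } Cmd;

typedef struct { int left; int right; } WheelSpeeds;

WheelSpeeds command_to_wheels(Cmd c, int speed)
{
    switch (c) {
    case CMD_FORWARD:  return (WheelSpeeds){  speed,  speed };
    case CMD_BACKWARD: return (WheelSpeeds){ -speed, -speed };
    case CMD_LEFT:     return (WheelSpeeds){ -speed,  speed };  /* spin in place */
    case CMD_RIGHT:    return (WheelSpeeds){  speed, -speed };
    default:           return (WheelSpeeds){ 0, 0 };            /* CMD_STOP */
    }
}
```

Each wheel speed then becomes one addressed command to the corresponding axis controller.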

The robot controller is really nice, since it has digital I/O as well as ADCs that you can access through the USB interface. It has a lot of promise, and we're sure we are going to have a lot of fun using it. Last year, I'd made a video interface for a security system I'd developed, so I already have that API running on the Windows Embedded platform. Looking at the progress we've made, I'm quite sure we will end up with a fine project once it is completed. Higher-level functionality, such as computer vision and motion planning, will be implemented on the SBC. We also plan to connect the robot to a ground station.

Testing: 

Final: