October 21, 2011

celegans project introduction

In the previous euglena project, I programmed a simple search algorithm for the NXT robot to find light. With the celegans project I will take a biotic approach: instead of programming the behavior myself, I will design a neural network that controls the robot to achieve the same aim (finding light) as euglena. By "programming" here I mean the traditional "if...then..." style. Designing a neural network to accomplish a task is, generally speaking, also programming, but I would rather call this kind of programming "neuroprogramming". Neuroprogramming is different from conventional programming: it is fundamentally parallel, and thus needs a very different programming paradigm. celegans could be a good model to try. The name comes from "C. elegans", a tiny worm only 1 mm long, and the simplest intensively studied model organism with a nervous system.

I will use one light sensor to detect light and two motors for movement. Therefore at least 3 neurons are needed: one sensory neuron and two motor neurons. Are three neurons enough? Probably not for an efficient light finder: to find the light source the robot should know how the light intensity changes, and it might be necessary to include interneurons for this function.
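To make the idea concrete, here is a sketch in Python (the robot code itself is in Matlab) of such a minimal network. The weights and the extra interneuron are my own illustrative guesses, not taken from any paper: the interneuron simply remembers the previous light reading so the network can react to the change in intensity.

```python
import math

def sigmoid(x):
    """Squash a neuron's activation into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

class LightFinderNet:
    """Minimal controller: one sensory neuron reads the light intensity,
    one interneuron remembers the previous reading (so the network can
    sense the *change* in intensity), and two motor neurons drive the
    left and right motors.  All weights are made-up illustrative values."""

    def __init__(self):
        self.prev_light = 0.0   # interneuron state: last sensed intensity

    def step(self, light):
        change = light - self.prev_light   # signal the interneuron adds
        self.prev_light = light
        # Equal motor drive when the intensity is steady (go straight);
        # an intensity change unbalances the motors and makes it curve.
        left = sigmoid(1.0)
        right = sigmoid(1.0 + 2.0 * change)
        return left, right
```

With only the three neurons and no interneuron, `change` would be unavailable and the robot could never tell whether it is moving toward or away from the light.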

During the past few months I have read quite a few neuroscience papers to gain inspiration from real nervous systems, some of which are about C. elegans. Even though I expected that, to survive, C. elegans would have to do something similar to finding light, I was still a bit surprised to read that the neural network the real C. elegans uses could be directly applied to my NXT robot! The only difference is that C. elegans is more interested in chemicals than in light, so for C. elegans light intensity is replaced by chemical concentration. Other than that, everything is almost the same. I found the following facts about C. elegans' chemotaxis:
  • It only uses one sensory neuron.
  • Effectively it uses two motor neurons to control the left and right sides of the muscles.
Neural circuits it might use have been proposed; for example, Eduardo Izquierdo used a genetic algorithm to find some working minimal neural networks (The Journal of Neuroscience, September 29, 2010, 30(39):12908–12917). I especially like the work by T. C. Ferree et al.: ten years ago they did exactly what I want to do now, namely use a neural network to control a robot to find light! Here is their paper, "Robust spatial navigation in a robot inspired by chemotaxis in C. elegans". Therefore, as a first step I will simply repeat their work on NXTCamel. The photo below, from the paper, shows their model of the car. NXTCamel is designed to have a driving system equivalent to their model.


NXTCamel example

I wrote a Matlab function checksystem.m to check whether all motors and sensors on NXTCamel work as I expect (I didn't test the sound and touch sensors in the first version of the function). The program can be downloaded here. Before you run this program you should make sure:
  1. The RWTH Mindstorms NXT Toolbox for Matlab is installed correctly and you can connect to the NXT from Matlab via Bluetooth without any error.
  2. All sensors and motors are connected to the same ports as in the building instructions.
  3. The ultrasonic sensor faces to the right. I define this as the default starting position.
  4. You have enough space (at least 2 by 2 meters) for NXTCamel to run.
If everything works fine, the output in Matlab should look like the following:

>> checksystem
Make sure the ultrasonic sensor faces to the right side before you start.
========Now test driving system========
Drive forward for 3 second. Press any key to start...
Drive backward for 3 second. Press any key to start...
Turn left. Press any key to start...
Turn right. Press any key to start...
Back and turn. Press any key to start...
Back and turn to the other direction. Press any key to start...
========Driving system test finished========
========Now test the sonar system========
Sweep sonar and measure the distance. Press any key to start...
========Sonar test finished========
Light intensity is 42.3%
========All test finished========

It will also plot the distance measured by the ultrasonic sensor in a polar plot like this:


Note that NXTCamel by definition always faces the 90-degree (12 o'clock) direction. The numbers 100, 200, 300 in the plot are distances in centimeters. I measured 10 points here. One could in principle sample more points and get a better representation of the surrounding environment, though it would be slower.
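The plot is built from pairs of sweep angles and measured distances. As a sketch of the underlying geometry, here is how one such sweep could be turned into Cartesian map points (Python rather than Matlab, and the ten distance values are made-up sample data):

```python
import math

# Hypothetical sweep: 10 samples evenly spaced over a full circle,
# distances in centimeters (invented values, not real measurements).
angles_deg = [i * 36 for i in range(10)]          # 0, 36, ..., 324
distances  = [150, 120, 200, 90, 300, 250, 110, 80, 170, 140]

def to_cartesian(angle_deg, dist_cm):
    """Convert one sonar sample to (x, y) in the robot's frame.
    The robot sits at the origin; 90 degrees is straight ahead."""
    a = math.radians(angle_deg)
    return dist_cm * math.cos(a), dist_cm * math.sin(a)

points = [to_cartesian(a, d) for a, d in zip(angles_deg, distances)]
```

Plotting these (x, y) points reproduces the polar view of the surroundings; sampling more angles just means more entries in `angles_deg`.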

This distance information is crude compared with that from the Google car. Nevertheless, it is enough to let NXTCamel drive without hitting obstacles.

Update: 
Here is an example with a simple algorithm, and I am sure there is still a lot of room for improvement. Please let me know if you find a better way.
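One simple strategy consistent with the sweep data goes like this (a Python sketch, not my actual Matlab code; the clearance threshold is an assumption): turn toward the direction with the largest measured clearance, and back up when every direction is blocked.

```python
def choose_heading(angles_deg, distances_cm, min_clearance=40):
    """Pick the heading (in degrees) with the most free space.
    Returns None if every direction is blocked within min_clearance cm,
    in which case the caller should drive backward instead."""
    best_dist, best_angle = max(zip(distances_cm, angles_deg))
    if best_dist < min_clearance:
        return None
    return best_angle
```

For example, with obstacles close by at 0 and 180 degrees but open space ahead, the function picks 90 degrees; if all readings are short, it reports that backing up is needed.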

September 30, 2011

NXTCamel

I spent almost a whole weekend designing and building this platform for my next project, "celegans", and maybe other future projects. I named it "NXTCamel" because it looks a bit like a camel with four long legs, and it can carry a lot!






The building instructions can be downloaded here. Some features of NXTCamel are:
  • All sensors are mounted and all motors are used. (Lego Mindstorms NXT 1.0)
  • It is like a car: two back wheels for driving, and two front wheels for steering.
  • All signal cables can be tightly attached to the body so they won't easily get caught on something while the robot moves around.
  • The ultrasonic sensor, controlled by one motor, can rotate 360 degrees to measure the distance at any angle.
  • The body is flat, spacious, and stable. I could easily mount, for example, a webcam or smartphone on it for more sophisticated projects.
  • It consists of 187 pieces altogether.

Connect from Linux

I bought a very cheap Asus X54L laptop two weeks ago and installed Ubuntu 11.04 on it. Since there is no Bluetooth adapter on the laptop, I bought a LogiLink Bluetooth USB-Adapter Class2 EDR V2.0 on Amazon.de. It costs only about 3 euros and works well with the Lego NXT.

I basically followed the RWTH Mindstorms NXT toolbox instructions and connected to the Lego NXT without much difficulty. The only problem is that, for some unknown reason, I need to run the toolbox's btconnect command twice before Matlab recognizes the NXT.

April 23, 2011

Random Light Finder (Matlab version)

Today I found some time to play with the Lego NXT, and I wrote the Matlab version of the random light finder. The Matlab program is shown below. It works quite well, and with Matlab I can now easily implement more sophisticated algorithms.

The robot can now effectively avoid obstacles, so I no longer need to help it get unstuck from time to time. The program can also plot the light sensor data while the robot moves around. It is still very slow at finding the light, because it is a random walker just like the Euglena.

The code is here.

Most of the code is straightforward; there is only one line that needs a few more words.

abs(lightdata(step)-lightdata(step-1))<5 means the light intensity is almost unchanged compared with the last step. This could be because the robot got stuck somewhere and can't move forward, or because it wandered into a dark region where the light level is always very low and therefore hardly changes. The robot should avoid both situations, so in these two cases I let the robot go backwards a little bit.
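The decision around that line can be sketched as follows (Python for illustration; the threshold 5 mirrors the Matlab condition, while the action names are placeholders for the actual motor commands):

```python
def next_action(lightdata, step, threshold=5):
    """Decide the robot's next move from the light-intensity history.
    If the reading barely changed since the last step -- either the
    robot is stuck, or it wandered into a uniformly dark region --
    back up; otherwise take an ordinary random-walk step."""
    if step >= 1 and abs(lightdata[step] - lightdata[step - 1]) < threshold:
        return "backward"          # escape both bad situations
    return "random_turn_and_go"    # ordinary random-walk step
```

One threshold thus covers both failure modes, because in both cases the symptom at the sensor is the same: a flat intensity signal.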


December 23, 2010

Finally I can use Matlab to control the Lego NXT on my MacBook

I got a Matlab license for my MacBook Pro and tested controlling the NXT with the RWTH Aachen MINDSTORMS NXT Toolbox. Over Bluetooth it works very well, though I haven't succeeded with USB yet. Anyway, Bluetooth is all I need, so that's alright for me. One thing I should note: the installation instructions ( http://www.mindstorms.rwth-aachen.de/trac/wiki/Download4.04 ) contain a small mistake. Only after you execute
COM_OpenNXT('bluetooth.ini')

does the sign on the NXT change from B< to B< >

I will experiment with the power of Matlab from now on!

November 20, 2010

Automatic design of Lego NXT robots

Building Lego NXT robots according to your imagination is lots of fun, but figuring out really cool designs is not easy. I was wondering whether there are ways to automatically design Lego robots (in this blog entry I am only concerned with the hardware), like those Windows screen savers that generate random forms of artificial life. After some research I found that no one has done it yet. However, there are already many resources, perhaps enough for a group of very talented people to figure out how to design a Lego robot automatically. I will summarize what I have found in the following.

Background

I think the most promising way to automatically design Lego robots is to follow nature, namely to use an evolutionary mechanism: one uses a Genetic Algorithm (GA) to search for a design that fits the need. This field of study is called "Evolving Hardware" or "Evolvable Hardware" (EHW), and it is still in a preliminary stage. There are two major directions in this field. The first is evolvable electronics, which can be either digital or analog circuits. For digital circuits, one can either design on a computer using a GA and then test the result on an FPGA, or build the evolutionary mechanism directly into the FPGA, which can achieve some adaptive advantages. Search for "FPGA" together with "Evolvable Hardware" on Google if you want to know more. For analog circuits there is less research, though I think it is also a very promising direction. Here is a group at NASA doing this kind of research.

The other direction of EHW research is evolutionary machines (physical objects). It is generally easier to simulate and test electronics than machines, so much less research can be found in this direction. The Golem project at Cornell is probably one of the earliest systematic attempts, and a successful story of designing an antenna with a GA can be found here. Automatically designing Lego NXT robots lies in this direction.

Outline

I don't think I can make automatic design of NXT robots work by myself. It needs a lot of work and carries a high risk of remaining unaccomplished even after years of dedicated effort. However, I will still list here what I would do if I worked on this project. When I find more people, I might start it.

Step 1: Map the Lego NXT robot's structure to a DNA-like one-dimensional representation

Probably one can label the different types of basic Lego building blocks with different IDs and parameters. I should choose the mapping so that the code itself reflects the actual similarities and differences between the bricks. For example, the Lego NXT kit contains bars with different geometries, lengths, and types of holes. Let's say we have 4 bars. They are:

No.1 Straight bar with 4 round holes.

No.2 Straight bar with 3 round holes in a row, the last hole being a "+" hole. (If you have a Lego NXT, you know what I mean.)

No.3 Straight bar with 5 round holes.

No.4 L-shaped bar with one arm like No.2 and the second arm ending in a round hole.

If I name them "A", "B", "C", "D", neither the computer nor a person can learn anything about the relations between them. That's a bad code. A better way is to use these names:

BARoooo

BARooox

BARooooo

BARolooox

BAR is the name of the type. I think using 3 capital letters for the brick type is a good option, since it gives enough variations to go beyond the Lego NXT bricks.

In short, I can either give each bar a unique code, like "A" for a straight bar with 4 holes and "B" for a straight bar with 5 holes, or better, call them "BAR4" and "BAR5".
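To make concrete why structured codes beat opaque labels for a GA, here is a sketch (the similarity measure and the mutation operator are my own inventions, not part of any existing system): with codes like "BARoooo", similar bricks get similar strings, so a simple edit distance can measure how different two bricks are, and a mutation can flip a single hole type.

```python
import random

# Structured codes: 3-letter type followed by one character per feature
# ('o' = round hole, 'x' = "+" hole, 'l' = the bend of an L-shaped bar).
BRICKS = ["BARoooo", "BARooox", "BARooooo", "BARolooox"]

def brick_distance(a, b):
    """Crude edit distance: mismatched positions plus the length gap.
    With opaque labels like "A"/"B", no such comparison is possible."""
    mismatches = sum(1 for x, y in zip(a, b) if x != y)
    return mismatches + abs(len(a) - len(b))

def mutate_hole(code, rng=random):
    """Flip one randomly chosen hole of a bar between 'o' and 'x',
    skipping the 3-letter type prefix -- a minimal GA mutation."""
    i = rng.randrange(3, len(code))
    flipped = {"o": "x", "x": "o"}.get(code[i], code[i])
    return code[:i] + flipped + code[i + 1:]
```

For instance, No.1 ("BARoooo") and No.2 ("BARooox") come out at distance 1, correctly reflecting that they differ by a single hole type, while the opaque names "A" and "B" would carry no such information.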

(to be continued)