Connect NGW100 Linux (AVR32) to Arduino with I2C

For an embedded Linux training session, I needed to find something fun to plug into an AVR32 board. The board used for the training is an NGW100. I decided to try the I2C bus. I have several I2C devices on my desk, but nothing really fun, and I didn't want to solder a lot of stuff or spend a lot of time on this. So I decided to use an Arduino board as an I2C slave device and plug it into the NGW100.

First step: Simply use the Arduino IDE and the Wire library. I used a Seeeduino because I needed an Arduino that works at 3.3 V levels. Here is the code:

#include <Wire.h>

void setup()
{
  Wire.begin(4);                // join the I2C bus with address 0x4
  Wire.onReceive(receiveEvent); // register the receive handler
  Wire.onRequest(requestEvent); // register the request handler
  Serial.begin(38400);          // start serial for output
  Serial.println("Boot Ok");
}

void loop()
{
  delay(100);
}

// called when the master writes to us
void receiveEvent(int howMany)
{
  while (Wire.available()) {    // loop through all received bytes
    char c = Wire.receive();    // receive a byte (Wire.read() on Arduino >= 1.0)
    Serial.print(c);            // print the character
  }
}

// called when the master reads from us
void requestEvent()
{
  Wire.send("Hello world from Arduino"); // Wire.write() on Arduino >= 1.0
}

To test the Arduino I used a Bus Pirate; this is quite simple and fun. Here is a little snippet of my initial test with the BP (note that the strings are different). The I2C slave is at address 0x4 (check the setup()).

Searching I2C address space. Found devices at:
0x08(0x04 W) 0x09(0x04 R) 

Read content from the device, 'ABCD'
I2C>[0x09 rrrrrr]
READ: 0x41
READ:  ACK 0x42
READ:  ACK 0x43
READ:  ACK 0x44

send content to the device 'ABC'
I2C>[0x08 0x41 0x42 0x43]

Second step: Plug the Arduino into the NGW100. I used the wire-wrapping technique. Simply connect SDA, SCL, and GND (NGW100 pinout: SDA => pin 9, SCL => pin 10, GND => pin 2).

On Linux, load the i2c-gpio kernel module. On OpenWrt (used on the NGW100), simply install the kmod-i2c-gpio package.

Final step: If everything is OK, we can now test the communication. I used a small piece of C code to deal with I2C on Linux.

#include <string.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <linux/i2c-dev.h>
#include <sys/ioctl.h>
#include <fcntl.h>

void i2c_run(void) {
    int file;
    char filename[40];
    int addr = 0x4;        /* the Arduino slave address */
    char buf[32] = {0};
    int i;

    /* adjust the bus number to match your board */
    snprintf(filename, sizeof(filename), "/dev/i2c-0");
    if ((file = open(filename, O_RDWR)) < 0) {
        printf("Failed to open the bus.\n");
        exit(1);
    }

    if (ioctl(file, I2C_SLAVE, addr) < 0) {
        printf("Failed to acquire bus access and/or talk to slave.\n");
        exit(1);
    }

    i = 24;
    // read from I2C
    if (read(file, buf, i) != i) {
        printf("Failed to read from the i2c bus.\n");
    } else {
        printf("Read %d bytes from I2C: [%s]\n", i, buf);
    }

    sprintf(buf, "I2C from Linux to Arduino");
    i = strlen(buf);
    if (write(file, buf, i) != i) {
        printf("Failed to write to the i2c bus.\n");
    } else {
        printf("Sent %d bytes to I2C: [%s]\n", i, buf);
    }

    close(file);
}

int main()
{
    i2c_run();
    return 0;
}

As you can see this code is a bit crude, but it works really well: read the I2C bus and send a sample string. Proof? :)

Of course, I used plain string values, but in real life a small protocol should be used. Another important thing: I used an NGW100, but you can use the same idea on any embedded Linux board, like the Fonera or anything else.

Update: Of course you can use the i2c-tools on Linux to detect your device. To do that, grab the i2c-tools source and cross-compile it for the AVR32 (you only have to change the CC path in the Makefile).

/Enjoy small Linux

Calling GDB (GNU debugger) within your code.

I spent a lot of time finding the right way to do this, so here is a quick note. For work, I needed to debug a small piece of code I wrote. But the main software (closed source) loads my library (via dlopen) and runs it in a single thread.

That's fine, but I was unable to debug this part of the code because I couldn't place a breakpoint. So I wanted to place a "break" in my code that returns control to GDB (exactly like launching the debugger from within Python with a simple pdb.set_trace()).

This is quite easy to do, and really useful, but not very well known. The main trick is to trigger interrupt number 3. This raises a SIGTRAP signal in the Linux kernel, which GDB can intercept. So in your code, simply add a macro:

#include <stdio.h>

#define GDB() asm("int $0x3")

int main()
{
    int a = 12;
    GDB();                 /* execution stops here under GDB */
    printf("A ==> %u \n", a);
    return 0;
}

Call the GDB(); macro in your code, compile it with gcc -g, simply run your program under GDB, and wait for the macro to be hit :)

/Enjoy gdb

Mbed Ethernet connection

I just received my mbed module. This little ARM device is pretty cool, and the associated tools work quite nicely. Of course they cost a lot of money; I received mine for free for the mbed contest. Really cool, no? ;)

After the classic blink test, I decided to go for a network test. But I don't have any Ethernet magnetics module right now (in fact, I should have one, but I'm unable to find it). So let's go magnetics-less! The docs on the dedicated mbed page say that should be fine. I decided to pull the RJ45 socket from an old broken WRT54.

The main issue is figuring out how to solder this RJ45 onto a veroboard. Here comes the fun part: I remembered that radio amateurs use a technique called "dead bug soldering". Check these guidelines from NASA for examples.

I decided to give it a try :

Just glue the RJ45 onto the veroboard and use a common wrapping technique. Not so bad ;)

The next step is to flash a network example to test.

That's really fun; the mbed works pretty well. I secretly hope that somebody will come up with an mbed-like board with open-source hardware and software.

Enjoy wired networks ;)

Really cheap USB to TTL on Atmega : 1.70$

One of the most common ways to interface a microcontroller to a computer used to be the serial port. But nowadays, serial ports have been replaced by USB on most computers. A common way to fix this issue is to use a USB-to-TTL converter, or a USB-to-RS232 converter plus a MAX232. That's fine, but:

  • USB-to-TTL PCBs cost a bit of money: you can find some on eBay for around 7 € (shipped) and 15 $ at SparkFun!!! That's about 2 to 5 times the cost of the microcontroller!
  • USB-to-RS232 costs 1.70 $ (shipped) but needs some extra level shifting and doesn't really fit on a PCB (it needs a DB9 connector …)

In fact, USB-to-RS232 is a mass-market product, and the cost is really low. I decided to order a couple of these, just to see if I could use them on a PCB. So I bought a 1.70 $ USB-to-RS232 adapter on eBay.

I decided to rip the plastic off the DB9 and discovered a really tiny PCB. I removed the DB9 and put this little PCB through a scope session. How the hell do they manage to do USB-to-RS232 with only a couple of external components? There is no big capacitor for the level shifter (remember, RS232 is +12/-12 V)? The answer is simple: they don't!!

This device isn't RS232-compliant at all: the signals on the DB9 are TTL levels, not RS232. The output swings between 0 and 5 V, and the input can handle -12/+12 V but works great with 0/5 V too. I simply removed the used pads on one side and added a couple of pins.

Please note that the RX pin is missing in this pic but is of course needed. The next step: how can I use this with an AVR ATmega (I used an ATmega8, but any will do the trick)? The serial connection on a micro is TTL like this board, but the TTL signal is simply inverted: a "1" on the RS232 side is -12 V, and +5 V in TTL; a "0" on the RS232 side is +12 V, and 0 V in TTL. You can find all the information here.

In fact, a MAX232 does both the level shifting and the inversion, but as I'm too lazy to wire a MAX232 (and it would destroy the cheap aspect of this hack), I decided to handle this in software. This means I won't be able to use the ATmega's built-in serial port, and I need to write some additional code to do the RS232 encoding/decoding by hand. Let's give it a try:

I simply put this on a veroboard, connected VCC to the USB Vcc, and GND, RX, and TX to random pins on the AVR, and off we go with software RS232. This can be done easily, in fact, and I managed to handle 19200 baud with the internal 8 MHz clock of the ATmega. Below you will find the usual uart_putc() and uart_getc():

// uchar and the set_output()/clr_output()/get_input() pin macros
// come from the project files (in the zip linked below).

#define UART_TX    D,1
#define UART_RX    D,2
#define UART_DELAY 52 // 1/9600 = 104uS : 1/19200 = 52uS

void uart_putc(char c)
{
  uchar i;
  uchar temp;

  // start bit
  set_output(UART_TX);
  _delay_us(UART_DELAY);
  clr_output(UART_TX);

  for (i = 0; i < 8; i++)
  {
    temp = c & 1;
    if (temp == 0)            // inverted logic on the wire
      set_output(UART_TX);
    else
      clr_output(UART_TX);
    _delay_us(UART_DELAY);
    c = c >> 1;               // LSB first
  }

  // stop bit
  set_output(UART_TX);
  _delay_us(UART_DELAY);
  clr_output(UART_TX);
  _delay_us(UART_DELAY);
}

uchar uart_getc()
{
  uchar i;
  uchar ib = 0;
  uchar currentChar = 0;

  while (ib != 1)             // wait for the start bit
    ib = get_input(UART_RX);
  _delay_us(UART_DELAY / 2);  // middle of the start bit
  for (i = 0; i < 8; i++)
  {
    _delay_us(UART_DELAY);
    ib = get_input(UART_RX);
    if (ib == 0)
      currentChar |= 1 << i;  // this is a 1 (inverted on the wire)
  }
  return currentChar;
}

Nothing more to say; this hack works really great, and I can now build a bunch of USB boards without paying so much. The only drawback of this approach is that you can't use an interrupt for uart_getc(), so you have to deal with that in your code. Another approach would be to use a single transistor on the RX pin to make it compatible with the AVR's built-in serial routines.

You can find the whole project (C files + Makefile) in a zip here. I think this little hack is really useful, so please send it to all your DIYer friends; it can save them money and time …

// Enjoy cheap USB ? :)

Boosting IR remote video sender (Thomson VS360U)

In my home, I have a bad TV antenna, so we only use the cable receiver to watch TV. But I have two TV sets. I bought a video sender a couple of months ago but never managed to get it working nicely. It's a Thomson VS360U: really cheap, 24 €, working on 2.4 GHz for the audio/video and 433 MHz for the remote.

At the first test, I discovered that the transceiver comes with a couple of IR LEDs. I had to glue each IR LED in front of each piece of equipment I want to drive: for me, the cable receiver, the DVD, the Dvico, and the AV amp. I tried this, but it's a mess: each LED is soldered on a single string and tends to move. Not really a nice experience. This is simply too crappy to use.

I decided to mod it to use a single IR LED with better gain. The first step is to find the right place for the mod: just open the transceiver and locate the power supply (Vcc/Gnd) and the IR transistor. It was quite easy; the only trick is to solder the wire for the IR transistor just before the base resistor. Here is the result.

You can find a better pic in the gallery. I used a scope to find the IR transistor, but this can be done without one.

Let's build a simple IR booster that connects to these pins, and everything will be fine. I used a common BC547, but any common transistor will do the job.

The result :

As you can see, this is small. I placed it near my cable receiver and everything is working nicely. I can now control every device (cable, DVD, Dvico) from my room without any lag or lost IR signal.

I managed to fix this cheap video sender without too much effort; I'm happy. This kind of hack can be used on quite a few video sender devices. The hardest part is finding the IR transistor; the rest is simply the same.

Enjoy TV from bed ;)

From Python to Vala for 1wire monitoring w/ Munin

Recently I decided to switch my main computer off daily. This computer was usually on all the time and consumed a lot of electricity. So I switched to a really small computer for the common tasks: SSH server, wake-on-LAN (for my main computer), VPN access, and mail relay. This new computer consumes 7 watts, but its specs are: a Geode CPU at 300 MHz, 128 MB of RAM, and 40 GB of HD. Yes, that's really low, but far enough for the assigned tasks. I occasionally log into it from outside to access all the computers inside my home network.

The main issue here is that I used my main computer to monitor a 1-wire network of external, heating, and room temperatures. I used a small Arduino board and a couple of Python scripts to populate some Munin graphs, like this one:

As you can see on this graph, I use a reference temperature from Guipavas. This data is public, and I use it for the reference info. Everything worked fine for about a year. But when I switched to my new little box (300 MHz…), the Python script used to monitor the 1-wire network and gather the reference data was a bit heavier than expected for this little box.

I first thought of rewriting this in pure C, but having to deal with XML parsing (libxml) and POSIX serial in C… That's the little story: I decided to rewrite this script (and others) in Vala. I won't dump a Vala introduction here, but in short it's a new language, used by the Gnome desktop, that compiles down to C. The syntax is C#-like, it has a lot of libraries, and it doesn't need the bloat of an interpreter (nor a VM). My first test was to listen to the Arduino serial port.

public void run()
{
    ser = new Serial.POSIX();
    loop = new GLib.MainLoop();
}

I used a Serial.vala wrapper found on the net; it is simple and neat. I just added some string parsing, and I got my Arduino 1-wire network working with Vala. The next part is the XML parsing, which will be covered in a future post.

To conclude, the Vala result is fine. The resulting binary is small (38 KB); it has quite a lot of dependencies (libsoup, glib, pthread, gobject…) and consumes more memory than my Python script. The Python interpreter + ElementTree (XML parsing) + pyserial eat around 8.9 MB of RAM, while my Vala code eats 12.3 MB. But keep in mind that this is with all the shared libraries. So if you run a couple of scripts like me, this memory isn't a big deal, because it will be shared across different processes without any overhead.

Meanwhile, the main difference between the two versions is speed. Here are some results with the time command, for the functions only (I dropped the serial I/O stuff for this test):

jkx@brick:~$ time python
Temp:    20
Pres:    1021.0 hPa
Wind:    19 km/s

real    0m2.105s
user    0m1.468s
sys     0m0.216s
jkx@brick:~$ time ./weather
Temp:    20 deg
Pres:    1021.0 hPa
Wind:    19 km/s

real    0m0.427s
user    0m0.084s
sys     0m0.032s

OK, Python takes 4x the Vala time for the same work. Of course, this piece of code isn't exactly the same and involves network access, but I ran this a couple of times and the result is always about the same. So I decided to look closer, and found that although the Python interpreter itself loads quite quickly, ElementTree + urllib2 take 1.35 s to import.

I get it: this system has a really small CPU, and importing libs from the hard drive takes time, which doesn't happen with my Vala code; the binary is small, and all its dependencies are already loaded by the OS itself. To conclude, Python is still my favorite language, but running Python scripts on a small system has an overhead I must take care of, and avoiding loading/unloading libs is the key. A single Python process, with several scripts loaded, would be a better choice. And for small custom apps on this kind of system, Vala seems to be a good alternative.

// Enjoy the sun

Disable HAL in Xorg on Debian / Ubuntu

OK, let's go for another big issue on the road to building a complex distro. Maintainers tend to include one feature after another, and now Debian is getting closer to bloat.

Anyway, some time ago HAL support was introduced in Xorg. This allows you to hotplug mouse/keyboard… But if, for some reason, your HAL is buggy, you can't use a keyboard or a mouse in Xorg. That's bullshit! I discovered a bug with RAID + HAL, and HAL is now segfaulting on my computer, so I need to get rid of this Xorg/HAL combination.

First, you must modify /etc/X11/xorg.conf with something like this:

Section "ServerFlags"
    Option "AutoAddDevices" "False"
    Option "AllowEmptyInput" "False"
EndSection

This disables the HAL support, but if you want a working keyboard and mouse, you must install the following packages:

  • xserver-xorg-input-kbd
  • xserver-xorg-input-mouse
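
With AutoAddDevices off, Xorg falls back to statically configured devices, so you may also need classic InputDevice sections. A typical sketch (adjust the device node to your system):

```
Section "InputDevice"
    Identifier "Keyboard0"
    Driver     "kbd"
EndSection

Section "InputDevice"
    Identifier "Mouse0"
    Driver     "mouse"
    Option     "Device" "/dev/input/mice"
EndSection
```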

That's it… no more HAL support in Xorg, and everything works fine…

Howto resize a libvirt (kvm/qemu) disk image

I've been using kvm at work for a while. Everything works quite fine, but today I needed to grow a disk image. I found some information, but none of it was really clear, so here is the result:

First, create an empty image file with this command (don't use dd; qemu-img is really quicker than dd):

qemu-img create -f raw temp.img 10G

Next, simply concatenate your image file and the temp one into a bigger one:

cat foo.img temp.img > bar.img

You will get a new image file which is 10G bigger than the original one. Now you can boot your OS and discover (via cfdisk, for example) that your system has an additional 10G of unused space. So, next step, either:

  • Just create a new partition, and mount it in the normal way
  • Boot your kvm OS from a ISO file containing Gparted

I tried the second approach and used an Ubuntu install CD to boot (using virt-manager, this is really easy to do), then resized the partition to my needs. Simply reboot, and "tada" :)

Enjoy disk ?

Howto use AVR Dragon Jtag on Linux (Avarice + avr-gdb +DDD)

A couple of months ago I bought a little AVR Dragon board. My initial plan was to use it for debugging programs with the embedded JTAG. But I ran into several issues with that, mainly because of the lack of docs on this topic. So, here we are ;)

The AVR Dragon is nice because you can use it as a small development device without any other requirement: simply drop the needed ATmega on the board, plus a little wrapping for the JTAG and the power supply.

As you can see, this is compact and nothing else is needed. The power supply comes from the USB port, and I soldered a DIP socket on the board… and that's it.

I use the JTAG connector, so now I can use a real debugger instead of playing with the UART. Simply put a breakpoint, and enjoy :) This way, I figured out that most of the time I simply push some values into arrays and inspect them with the debugger. This is really efficient. For example, last week I needed to fix a timing issue with an IR sensor: I simply wrapped the little board and pushed all the interrupts into an array with the related timing. Of course, this can be done with a serial connection too, but it takes more time, and even worse, if you hit a bug you will have to find where it is (in the UART printf, or in the code itself).

So, how to use this with a Linux OS ?

First, you need to use AVaRICE to program the ATmega with a command like this:

avarice -g -j usb --erase --program --file main.hex :4242

Here is the result:

AVaRICE flashes the hex file to the ATmega and waits for a GDB connection on port 4242. GDB is fine, but not really visual ;)

Let’s take a look at DDD

To use DDD with avr-gdb (the GDB for AVR), you need to create a config file, for example gdb.conf, and put this in it:

file main.out
target remote localhost:4242

And for the final command, just launch DDD like this:

ddd --debugger "avr-gdb -x gdb.conf"

Next step: Simply place some breakpoints, then press the "Continue" button in DDD. Et voilà:

I hope this little tutorial will help people looking for a nice AVR debugger on Linux (or any OSS system). The AVR Dragon is definitely a must-have for low-budget users in the AVR scene.

Enjoy bug ? :)

Nvidia 173.14 xrender benchmark

In a previous post, I looked closely at the way the Nvidia binary driver works. Like a lot of users, I ran into issues with Firefox and other software that uses the XRender extension to display things. A couple of days ago, Nvidia released a new version of its driver. They claim this version fixes the XRender lag, so I decided to run it against my previous bench results to see if it changes anything.

So the configuration is the same:

  • Nvidia 173.14.12, kernel 2.6.24, and a Q6600

First, I have to say that with the default settings the new driver doesn't really perform nicely. It looks even slower than the previous one in the default configuration. So, for the first time in this bench series, I tweaked InitialPixmapPlacement and set it to 2. In my previous bench batch, this tweak produced bad results, so I had disabled it, but this time the driver is so slow that without this tweak the benchmark would be useless.

Ok, let’s go for the results:

First, we can clearly see that the new version is really better on some points: PictOpClear is the best result. We can see the Nvidia team has really worked on this, and the result even outperforms the ATI driver. On the other side, PictOp[Con|Dis]jointClear is still very high.

For the rest of the test :

Two things. On almost all the results, the new driver is slower than the previous one (perhaps this is an InitialPixmapPlacement side effect), but the difference isn't really big: 0.5 s on a test which ends up far from 0.5 s. And ATI still clearly outperforms Nvidia here. In fact, the Nvidia driver team claims these primitives are never used (or shouldn't be). From what I know right now, some software does use them; it looks like KDE (via Qt) does. Apparently the Nvidia team asked the KDE devs to change their code to achieve better results on Nvidia cards… This is perhaps not the best way, but we need to wait for the KDE devs' answer before going forward.

The second important thing is that PictOpConjointXor now has a 0 result.

As you can see in this benchmark, the new Nvidia driver seems to perform better than the previous one. From the user perspective, it looks like the fixes applied for PictOpClear (and perhaps PictOpConjointXor) produce some great results. Right now Firefox performs nicely, and the whole desktop is fine. I'm quite sure there is still room for improvement (look at the open-source Intel driver results for PictOpOver, PictOpIn…; you will see the binary drivers are far from the OSS results), but this release is the first of the 8xxx series that performs at a decent speed, and this is good news.

Thanks again to my friends who sent me their own results to compare, and to the people on various forums who helped me with this.