My Pi Description

My Experiences With the Raspberry Pi -- Tracking My Learning -- My Pi Projects

Monday, August 19, 2013

Gertboard - Camera Remote Control - Arduino Sketch #2 - Motion Detector

Continued from last post
This final version of the camera remote control project integrates the motion detector and removes the user input. The motion detector is properly called a PIR (Passive Infrared) sensor. It detects heat given off by bodies around it. The sensor is actually split in half, and motion is detected as a slight difference between the heat measured by the two halves. An IC mounted on the printed circuit board processes the sensor signals and gives a digital output that we pass to the microcontroller on the Gertboard.
Here is a very informative link about these sensors. My sensor is slightly different from the one in that last link. Mine looks exactly like the one you see on Adafruit's web page. It has the advantage of two potentiometers: one sets the sensitivity, and the other sets a time delay between when motion is detected and when the output goes to a logic high. I spent a bit of time adjusting the sensitivity. The signal path from the motion detector includes an LED on the Gertboard (see the block diagram in the introductory post of this project). This LED gives feedback that makes it easier to set the sensitivity.
Here is the new code:
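In outline, the sketch looks something like the one below. The pin numbers, the pulse multiples for "E", "A", and "B", and the five-second re-arm delay are placeholders rather than the exact values in my script; the real character definitions come from the timing diagram in the last post.

// Final, motion-triggered version.  Pin assignments and the pulse
// multiples for 'E', 'A', and 'B' are placeholders, not my measured values.
const int outpin = 3;                // drives the 434 MHz transmitter (assumed pin)
const int pirpin = 4;                // digital output of the PIR sensor (assumed pin)
const unsigned int one_pulse = 415;  // basic pulse width in microseconds

// Exposure pattern worked out from the Audacity captures
const char exposure[] = "LHLHLEAABBAAAAAABABAABBABBBABBBBBBBABBLT";

// Hold the output for a number of basic pulse widths.  The RF transmitter
// inverts the logic, so a "high" pulse is written as LOW on the pin.
void transmit(int count, bool is_high) {
  digitalWrite(outpin, is_high ? LOW : HIGH);
  delayMicroseconds(count * one_pulse);
}

// Map each pattern character to its pulses (E/A/B multiples are placeholders).
void send_bits(char c) {
  switch (c) {
    case 'H': transmit(1, true);  break;
    case 'L': transmit(1, false); break;
    case 'E': transmit(2, false); break;
    case 'A': transmit(1, true);  transmit(3, false); break;
    case 'B': transmit(3, true);  transmit(1, false); break;
    case 'T': transmit(34, true); break;   // long closing pulse
  }
}

void send_pattern(const char *pattern) {
  for (int i = 0; pattern[i] != '\0'; i++) send_bits(pattern[i]);
  digitalWrite(outpin, HIGH);        // back to idle (inverted logic)
}

void setup() {
  pinMode(outpin, OUTPUT);
  digitalWrite(outpin, HIGH);        // idle: inverted logic keeps the transmitter off
  pinMode(pirpin, INPUT);
}

void loop() {
  if (digitalRead(pirpin) == HIGH) { // motion detected
    send_pattern(exposure);          // fire the camera
    delay(5000);                     // settle before re-arming (assumed delay)
  }
}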
I don't think it's necessary to discuss the code further; we covered it pretty well in the last post. The changes should be pretty easy to understand.
I would like to mention that this version does not require the Gertboard or the Raspberry Pi. Once the ATmega device is programmed, it will not lose the program even if it is lifted off the Gertboard and placed on another PCB. Everything can be bundled on a small printed circuit board like Adafruit's Perma-Proto Half-sized Breadboard PCB and placed in a small box with the motion sensor and the RF transmitter. Connect 5V and away you go.

Saturday, August 17, 2013

Gertboard - Camera Remote Control - Arduino Sketch #1

Continued from last post
I have covered the project scope, the hardware involved, and the digital patterns we have to reproduce. What's left is the code to make it all happen. I have two sketches ("sketch" is Arduino parlance for source code). The first was written without using the motion detector. Instead, the user is prompted to input an "a" or "b" at the keyboard. An "a" transmits the code to take a picture. Pressing "b" is like pushing the camera exposure button halfway down. The second sketch, presented in the next post, incorporates the motion detector and eliminates the keyboard input.
I am using the Arduino IDE (Integrated Development Environment) to write the sketch, compile the sketch, verify the C code, and upload the machine code to the microcontroller. The only IDE function we cannot use is its serial monitor, which provides screen output and keyboard input. The IDE's serial monitor requires a USB connection (as would be used with an actual Arduino product), and the Gertboard has no USB. If we wish to send and receive data from the microcontroller, we need an alternative.
That alternative is a terminal program called Minicom. It's like the old DOS program Telix we used to talk over modems in the old days. The Gertboard User Manual talks us through its installation and configuration. Minicom communicates through the UART (Universal Asynchronous Receiver/Transmitter) port. This two-pin port is signified by the pins TX and RX, both on the microcontroller and on the Pi/Gertboard. See the block diagram two posts ago. TX of the Pi is connected to RX of the microcontroller, while RX of the Pi connects to TX of the microcontroller. The program we usually use for stdio on the Raspberry Pi is LXTerminal. Even though it has "Terminal" in its name, it's really a command-line program. In fact, as we do with most programs, we launch Minicom from LXTerminal.
Let's take a look at the script:
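The script boils down to the structure sketched below. The output pin number and the pulse multiples for "E", "A", and "B" are placeholders rather than my measured values; the real character definitions come from the timing diagram repeated further down in this post.

// Keyboard-driven version.  The output pin and the pulse multiples for
// 'E', 'A', and 'B' are placeholders, not my measured values.
const int outpin = 3;                // drives the 434 MHz transmitter (assumed pin)
const unsigned int one_pulse = 415;  // basic pulse width in microseconds

// Patterns worked out from the Audacity captures
const char exposure[] = "LHLHLEAABBAAAAAABABAABBABBBABBBBBBBABBLT";
const char preview[]  = "LHLHLEAABBAAAAAABABAABBABBABBBBBBBABBBLT";

// Hold the output for a number of basic pulse widths.  The RF transmitter
// inverts the logic, so a "high" pulse is written as LOW on the pin.
void transmit(int count, bool is_high) {
  digitalWrite(outpin, is_high ? LOW : HIGH);
  delayMicroseconds(count * one_pulse);
}

// Map each pattern character to its pulses (E/A/B multiples are placeholders).
void send_bits(char c) {
  switch (c) {
    case 'H': transmit(1, true);  break;
    case 'L': transmit(1, false); break;
    case 'E': transmit(2, false); break;
    case 'A': transmit(1, true);  transmit(3, false); break;
    case 'B': transmit(3, true);  transmit(1, false); break;
    case 'T': transmit(34, true); break;   // long closing pulse
  }
}

void send_pattern(const char *pattern) {
  for (int i = 0; pattern[i] != '\0'; i++) send_bits(pattern[i]);
  digitalWrite(outpin, HIGH);        // back to idle (inverted logic)
}

void setup() {
  Serial.begin(9600);                // UART to Minicom on the Pi
  pinMode(outpin, OUTPUT);
  digitalWrite(outpin, HIGH);
  Serial.println("Enter 'a' to take a picture, 'b' for a half-press");
}

void loop() {
  if (Serial.available() > 0) {
    char key = Serial.read();
    if (key == 'a') send_pattern(exposure);
    else if (key == 'b') send_pattern(preview);
  }
}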
Thanks to the Arduino IDE, writing the script (C code) and getting the code onto the microcontroller is simple. The functions for serial communications, time delays, and handling digital and analog inputs and outputs on the ATmega device are available without adding libraries. There are not even any #include statements in my script. Once you write the script, you don't worry about makefiles; one click handles compiling the code and uploading the machine code to the microcontroller. All of the built-in functions, as well as the basic C language, are covered in a very useful help reference. The software also includes other libraries that you can use in your scripts. For example, there is a library for 16x2 LCD displays and a 1-Wire library (see my temperature sensor posts).
My script is divided into three parts: variable assignments, functions I have written, and the main part of the program. The main part is itself divided into two parts, "setup" and "loop", as required by the IDE. "setup" runs when you first apply power or first upload the code. It establishes the UART baud rate at 9600 bps and makes the ATmega pin we connect to the RF transmitter an output. "loop" runs continuously after that.
Looking at the variable assignments, there is something I do not understand. You see the variable "one_pulse"; it establishes the minimum pulse width in microseconds. This is the time for the "H" or "L" pulse (see below), and the times for all other characters are multiples of it. From the Audacity display, using the camera remote control transmitter, I calculated the time for "one_pulse" to be 284 microseconds. The time in my script, after altering "one_pulse" to match the patterns, is 415 microseconds, a difference of 46%. I've done a bunch of testing, and it's a mystery to me. If anyone has an idea, please let me know.
The heart of the script is the two functions, "send_bits" and "transmit". As we cycle through the pattern, we send each character to "send_bits", and "send_bits" passes that character's definition to "transmit". "transmit" sends the pulses out the microcontroller's output pin to the RF transmitter. I've repeated the diagram from the last post here to make it clearer how the characters "H", "L", "E", "A", "B", and "T" are defined:
An important note: the 434 MHz transmitter inverts the bit logic. If I send a low pulse from the microcontroller to the RF transmitter, the RF transmitter sends a high pulse. That is why a low is written to outpin when the variable "is_high" is true. I think the rest of the code should be pretty straightforward.

Saturday, August 10, 2013

Gertboard - Camera Remote Control - Finding The Patterns

Continued from last post
There are two common types of remote controls. The remote controls for your TV, DVD player, and audio system are generally infrared devices. They are good for short-range, line-of-sight applications. There is not much that can interfere with their signal, but they must be aimed to function. My camera remote control is a radio frequency device, which does not have to be aimed, operates over a longer distance, and can communicate through walls. Generally, there are two carrier frequencies (the constant high frequency that carries the useful information) in use for these devices: 315MHz and 433MHz. My remote uses 433MHz (I see references to both 433MHz and 434MHz for the same hardware - I use both freely in my blogs because I can't keep track of which number I used last).
My project really comes down to replacing the transmitter you see in the photo with my Pi/Gertboard/ATmega. To do that I have to eavesdrop on the transmitter when it talks to the receiver. I did not develop the means to do that; the technique was passed down to me by my son on his blog. You can see his post here. My setup and a diagram of the connections follow:
The 433MHz receiver, from Sparkfun, is the green rectangular printed circuit board sitting on the breadboard. The black square thing hanging off the breadboard is the 3.5mm stereo jack. It can be purchased from Sparkfun. The jack's pins would not fit into the holes of the breadboard, so I soldered two wires to the jack's pins and plugged those two wires into the breadboard. Looking at the bottom of the jack with the four pins to the back (as in the photograph), the leftmost pin connects to the 1 Mohm resistor. I soldered a wire across the ground connection at the front to connect to the circuit ground. That connection is fairly obvious from the photo.
I added the capacitor out of habit; I add bypass capacitors to circuits whether they need them or not. Its purpose is to filter out any garbage on the 5V power connection to the receiver. It's probably not doing much, so if you don't have one, don't worry about it.
The receiver requires 5V plus or minus 0.1V. My 5V is supplied from a 5.1V, 0.7A cell phone charger I happened to have lying around. The charger has a USB connector, so I used Sparkfun's Hydra cable to connect to the binding posts of the breadboard assembly. It has a USB connector on one end and several connectors on the other. I used the cable's alligator clips to clip onto the binding posts of my breadboard.
Be very careful with the Hydra cable assembly. It would be very easy for the two alligator clips to short to each other, or for the positive 5V alligator clip to short to the sleeve of the barrel connector, thus shorting out your power source. Always connect the alligator clips to the circuit before applying power. I have binding posts on the breadboard assembly, so it is easy to keep the alligator clips apart from each other, and I'm careful not to move the assembly while under power so that neither of the alligator clips comes loose.
The antenna is optional. It's easy to add: just stick a wire into the breadboard. If you wish to use one, the length of the wire is fairly important. Make it a quarter wavelength long. To figure the length, divide the speed of light, 300,000,000 meters per second, by the frequency, then divide by 4. This gives you the length in meters. For my receiver, the calculation is (300,000,000/434,000,000)/4, which comes to 0.173 meters, or 6.8 inches.
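If you'd rather let the microcontroller do the arithmetic, a throwaway sketch like this prints the same numbers over the serial port (the 39.37 factor just converts meters to inches):

// Quarter-wave antenna length for a 434 MHz receiver.
const float c = 300000000.0;   // speed of light, m/s
const float f = 434000000.0;   // carrier frequency, Hz

void setup() {
  Serial.begin(9600);
  float quarter_wave = (c / f) / 4.0;       // 0.173 m
  Serial.print(quarter_wave, 3);
  Serial.print(" m, ");
  Serial.print(quarter_wave * 39.37, 1);    // 6.8 inches
  Serial.println(" in");
}

void loop() {}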
OK, so what do we do with this receiver setup? We take a 3.5mm stereo cable and plug it into the receiver circuit and into the line input of a PC or Mac. Then we launch an audio recorder and editor program called Audacity. It's free, open-source software, and it's great. Here is a link to the site where you can download it. When the RF receiver receives the modulated signal from the Canon remote control transmitter, it demodulates it (removes the 433MHz carrier) and outputs the intelligence it received from the remote control transmitter. This signal is within the audio range, so it can be recorded by Audacity. Audacity, while capturing the signal, displays the waveform like you would see on an oscilloscope. Once captured, you can zoom into any portion of the waveform. So let's look at some waveforms (click on an image to enlarge):
The waveform above shows the receiver output zoomed all the way out. About 5 seconds after I started the recorder, I pressed the exposure button on the camera remote control transmitter. What you see in the waveform is mostly constant-amplitude random noise. Where it says "Pattern In Here" you can see a slight difference in the display. After a time, the noise reappears.
Above are two separate data captures, zoomed in to show about 50 ms of the captured data. This is within the "Pattern In Here" area. Each waveform shows one complete sequence of the camera's remote control transmit pattern. The bottom waveform is the result of pushing the exposure button all the way down, as if you were actually taking a photograph. The top waveform shows the pattern resulting from pressing the button partway down, as if you were previewing the camera data or waiting for just the right moment to push the button all the way down to make the exposure. This preview mode is not particularly useful for my project; I just wanted to show that different patterns are possible. The difference in the patterns is slight. Look at the location of the black vertical line, and look at the very end of the pattern.
In the graphic above, we see the pattern captured when the camera remote control button is pressed (top waveform) compared with the waveform captured from my project's 433MHz transmitter driven by the ATmega microcontroller on my Gertboard. As you can see, the two patterns are identical.
The two figures above were made possible because Audacity allows you to display multiple captures on the screen. You can then slide either capture left or right to line them up to see the differences or similarities between the waveforms. If you were a musician, each capture could be a track of your music. You can play one or more tracks while recording a new track. You could save the combination of multiple tracks as your song. Great stuff.
Now that we can see the pattern, we have to analyze it. That was not particularly easy because, as you can see from the waveforms, the vertical edges were not very vertical. My method was to establish the time of the shortest pulse width. The beginning of the pattern has a low-going pulse followed by a high-going pulse, then low-, high-, and low-going pulses. These five pulses are the shortest pulses, so everything is computed in relation to their width. This pulse width works out to 284 microseconds.
Rather than trying to break the pattern down to 1's and 0's, I resolved it into six sub-patterns. They are pretty well enumerated in the figure above. The only sub-pattern not shown is the long high-going pulse at the end of the pattern. This "T" pattern is 34 basic pulses long. Therefore, the entire pattern can be expressed as:
Exposure: LHLHLEAABBAAAAAABABAABBABBBABBBBBBBABBLT
Preview:  LHLHLEAABBAAAAAABABAABBABBABBBBBBBABBBLT
Now that we have the patterns we can write some code. That's for the next posts.

Sunday, August 4, 2013

Gertboard Project - Camera Remote Control - Introduction

I have a remote control device for my camera that consists of a receiver and a hand-held transmitter. Wouldn't it be great to have a motion detector trigger the camera using this remote control receiver? I could also take exposures at timed intervals without being near the camera. The Pi and the ATmega microcontroller on the Gertboard provided most of the hardware to do the job. I only needed to add a $10 motion detector from Adafruit and a $4 433MHz RF transmitter from Sparkfun.
I have not made an enclosure, so everything is just loose wires and components. When using the motion detector, once the ATmega microcontroller is programmed, the Pi can be disconnected. Just a source of 3.3V will be required. While the Pi and Gertboard combination is a good development platform, a standalone project that does not require a user interface would be much simpler. Adafruit has a couple of ATmega32U4 development boards that are tiny, cost only $20, and can be programmed over USB. Check out this one and this one.
This is what my Pi-Gertboard combination looks like now:
Tangle of stuff to Remotely Control My Camera From a Motion Detector
I have a Canon RF remote control receiver and transmitter combination. It works with Canon's digital SLRs and their G series cameras. Luckily, the documentation that came with the remote control gave me the frequency of the RF devices. It is 433MHz, which, I believe, is more commonly used in Europe. The frequency of these devices seems to be reported as either 433MHz or 434MHz. Most RF remote control devices manufactured in the US use 315MHz. Sparkfun also supplies receivers and transmitters for 315MHz.
The following is a block diagram of the setup, showing only what I thought was necessary. You may wish to click on the image to see the fine details.
Block Diagram
The next post will report on how I ascertained the information the remote control transmitter sends to the remote control receiver to control the camera. Subsequent posts will present the code to control the camera using the ATmega microcontroller.