Initiation to robotic arm control and image processing with CMUCAM
module
O. Radi, B. Simon, M. Marthiens Dagorette, Ph. Dondon
Bordeaux INP, ENSEIRB-MATMECA, Av Dr. A. Schweitzer 33405 Talence, FRANCE.
Abstract: The aim of this paper is to share a novel didactical experience that takes into account the evolution of students' behaviour. We briefly describe some of these major changes and point out their consequences on the quality and efficiency of traditional pedagogy. From these observations, we show that didactical adaptations must be made, in particular to improve interest, involvement and motivation. As an example, we present a didactical project called "initiation to robotic arm control and image processing". The technical approach and design are detailed. Finally, we discuss the advantages of our approach and give the results obtained through this didactical process.
Key words: Image processing, robotic arm, mixed digital/analogue circuits design, learning by project.
Received: August 17, 2021. Revised: April 12, 2022. Accepted: May 7, 2022. Published: June 24, 2022.
1. Introduction
For several years, French national statistics have shown a general demotivation for scientific curricula and a disaffection for theoretical lessons, recently amplified by the covid period [1]. Economics, commerce, cultural and sports studies now seem more attractive to university students.
In the electronics department of our engineering school ENSEIRB-MATMECA, we observe in particular a growing gap between the students' needs and what we propose to them.
This evolution of the students generates a general loss in terms of teaching efficiency [2]. In our electronics department, we can point out some specific problems:
- Until now, our generic and basic theoretical courses were taught as traditional full-classroom lectures, with ten sessions of one and a half hours each. However, we observed a loss of interest for these courses and a higher absenteeism rate than before, despite strict attendance checks.
- Although the situation for practical teaching is a little better (no absenteeism), it is far from perfect: in particular, the second-year student projects were classical electronic design projects scheduled over a semester, with several phases (bibliography, theoretical analysis and computation, practical design and measurements). Year after year, we observed, among other facts, that projects were no longer finished on time. A probable lack of prerequisites and difficulties in going from theory to practice [2] are the reasons for this situation.
Similar changes in students' needs and practice are observed in other countries, and particular care is taken by colleagues in many institutes [3], [4], [5].
Regarding this evolution, several modifications were made in our school. Theoretical lessons are now a mix of theoretical explanations and immediate exercises applying the course material. For the practical teaching part, we tested an alternating approach for the student projects. We give in this paper a typical example of a project we introduced last year, provide design details, and explain how the students worked to obtain interesting results and a finished project.
2. Student’s project example
2.1 Student’s project organisation
Each year, a team of teachers proposes practical multi-thematic projects to illustrate the main topics of our school: analogue, digital, radiofrequency, power electronics, programming, and so on. These projects are carried out over one full semester, with one three-hour session per week. Students work in pairs.
2.2 The robotic arm project
As an example of practical learning strategy, we describe here a student project run one year ago. The originality of this project is to associate a robotic arm, a microcontroller board and a camera to perform some 'intelligent' actions and to reproduce a typical human movement. It was voluntarily an open project, without overly strict specifications. Thus, the students felt free to decide their own scenario and to explore the capabilities of this actuator. Thanks to a modular design approach, the project allows focusing on global system design and concrete understanding. The chosen scenario is described in the next paragraph §2.3.
2.3 Chosen scenario
After discussions, bibliographic research and evaluation of realistic goals compatible with the students' skills and the duration of the project, they chose the following scenario:
The system must recognize a green glass placed on a table near the arm. The arm must scan the surrounding space, find the location of the glass and point to it. When it is correctly oriented, the plier must grip the glass and bring it back to the initial position.
The following paragraphs §3, 4 and 5 describe the electronic modules chosen for this project.
3. Processing board
In order to control the robotic arm, we use a classical Arduino Uno module [8] based on an ATmega328 microcontroller (figure 1). The supply voltage is 5 V and the clock frequency is 16 MHz. It has 14 digital I/O pins, 6 analogue input pins and an I2C bus, which is enough for our application. Some of the digital pins can directly deliver a PWM signal (suitable for driving a servomotor).
Figure 1: Arduino Uno module
4. Short description of the Braccio arm
The Braccio arm [9] is made of plastic and dedicated to didactical applications (Figure 2a). It consists of a base, three mobile segments, a rotating plier with two 'fingers' and six servo motors acting as the joints of the arm. The servo motors are classical hobbyist servos: the angle of rotation is proportional to the pulse width of the PWM signal applied to the servo, and the angle ranges from 0 to 180° when the pulse width varies from 1 ms to 2 ms. An interface board for powering the servos and connecting their signals is included in the package. An open source Arduino library is also available for easy management of the servomotors with a classical Arduino microcontroller board.
Figure 2a: Braccio arm view
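To illustrate the pulse-width to angle relation mentioned above, the short sketch below drives a single hobby servo with the standard Arduino Servo library. It is only a minimal sketch: the pin number is an arbitrary assumption and, in the real project, the six servos are driven through the Braccio interface board and its dedicated library.

#include <Servo.h>

Servo testServo;                    // one hobby servo of the arm (assumption: signal on pin 9)

void setup() {
  testServo.attach(9);              // any PWM-capable digital pin of the Arduino Uno
}

void loop() {
  // Sweep from 0 to 180 degrees; map() converts the angle into the
  // corresponding pulse width (about 1 ms to 2 ms).
  for (int angle = 0; angle <= 180; angle += 10) {
    int pulse_us = map(angle, 0, 180, 1000, 2000);
    testServo.writeMicroseconds(pulse_us);
    delay(500);                     // let the servo reach the position
  }
}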
Once the full arm is assembled, the mechanical position must be calibrated to check the correct alignment of the segments in the neutral position. Indeed, due to possible mechanical offsets (notched wheels of the servos), a small misalignment can be observed. These offsets can be cancelled by software correction or by fine mechanical adjustment. This first calibration defines the "zero", or origin, of each servo motor (Figure 2b).
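A hedged sketch of this software correction is given below, using the open source Braccio library to place the arm in its neutral position; the per-joint offsets and the exact angles are hypothetical values to be replaced by those found during calibration.

#include <Braccio.h>
#include <Servo.h>

// The Braccio library expects these six Servo objects to be declared in the sketch.
Servo base, shoulder, elbow, wrist_rot, wrist_ver, gripper;

// Per-joint software corrections (degrees) measured during mechanical calibration.
const int OFFSET[6] = {0, -2, 3, 0, -1, 0};

void setup() {
  Braccio.begin();                          // initialise the six servos via the interface board
  Braccio.ServoMovement(20,                 // step delay in ms (movement speed)
                        90 + OFFSET[0],     // base
                        90 + OFFSET[1],     // shoulder
                        90 + OFFSET[2],     // elbow
                        90 + OFFSET[3],     // vertical wrist
                        90 + OFFSET[4],     // rotative wrist
                        45 + OFFSET[5]);    // gripper (half open)
}

void loop() { }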
(Arm labels: base with horizontal rotation, shoulder, vertical wrist, rotative wrist, plier or 'fingers'.)
Figure 2b: Servo motor mechanical calibration
5. The sensors
In order to make the Braccio arm 'intelligent', a camera is mounted on top of the last segment, just above the plier, so that the extremities of the 'fingers' are visible in the camera field of view (lower part of the image). An infrared sensor is also added to detect an object placed between the plier fingers.
5.1 CMUCAM5 Camera
The most interesting sensor in the project is undoubtedly the CMUcam [10], [11], [12], with its OV9715 image sensor [13] and its on-board image processing circuit for target tracking or avoidance. The first CMUcam module (figure 3) was designed at Carnegie Mellon University a few years ago. We use here the CMUcam5.
Main specifications are:
- Processor: NXP LPC4330, 204 MHz, dual core
- Image sensor: Omnivision OV9715, 1/4”,
1280x800 [13]
- Lens field-of-view: 75 degrees horizontal, 47
degrees vertical
- Power input: USB input (5V) or unregulated input
(6V to 10V)
- Data outputs: UART serial, SPI, I2C, USB,
- Embedded image processing software
Since many digital pins are already used for servo motor control, we chose to transmit data over the I2C bus (SDA and SCL connected to the A4 and A5 analogue pins of the Arduino module). An open source Arduino library is available for easy management of the data transfer.
Figure 3: CMUCAM5 module
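A minimal sketch of such a data transfer, assuming the open source Pixy (CMUcam5) Arduino library and its I2C variant, is shown below; it simply prints the detected blocks on the serial monitor and is not the students' exact code.

#include <Wire.h>
#include <PixyI2C.h>

PixyI2C pixy;                       // SDA and SCL wired to A4 and A5 of the Arduino Uno

void setup() {
  Serial.begin(9600);
  pixy.init();
}

void loop() {
  uint16_t n = pixy.getBlocks();    // number of colour blocks currently detected
  for (uint16_t i = 0; i < n; i++) {
    Serial.print("signature "); Serial.print(pixy.blocks[i].signature);
    Serial.print("  x ");       Serial.print(pixy.blocks[i].x);
    Serial.print("  y ");       Serial.print(pixy.blocks[i].y);
    Serial.print("  w ");       Serial.print(pixy.blocks[i].width);
    Serial.print("  h ");       Serial.println(pixy.blocks[i].height);
  }
  delay(50);
}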
5.1.2 Object colour detection
Calibration and programming of the camera require the PixyMon open source software [14], [15].
It first allows displaying what is seen by the camera on the PC screen (figure 4). Then, with the camera connected to the PC via the USB port, it allows teaching the camera up to seven colours to be detected and assigning a signature to each of them.
To do that, we first select "set signature x" in the menu, then we draw a rectangular frame on the screen which defines the desired colour to be detected. Once the colour is learned (red object in the example of figure 4), the camera will try to detect all the pixels, or groups of pixels, whose colour matches the one previously selected. A tolerance margin can be defined.
Figure 4: Camera in learning mode
Lastly, some general parameters such as white balance, contrast and so on can be adjusted with PixyMon.
5.1.3 Object detection with CMUCAM
The principle of detection [16], [17] is based on the colour difference between the background and the object to be detected. The CMUCAM module is able to detect several pixel blocks of the same colour. It then returns the X, Y position, the size and the average colour of each block. The X, Y coordinates of a point are given according to the axis orientation of figure 5, with the following useful scaled range: X from 0 to 400, Y from 0 to 200.
Figure 5: Scale and axis orientation of the image
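With this convention, the error used later to centre the object in the image is simply the signed distance between the block coordinates and the image centre. The small helpers below are illustrative (names and tolerance are assumptions) and would be part of the main sketch:

// Image centre for the scaled range given above (X: 0..400, Y: 0..200).
const int X_CENTER = 200;
const int Y_CENTER = 100;

// Positive error: object to the right of (resp. below) the image centre.
int horizontalError(int blockX) { return blockX - X_CENTER; }
int verticalError(int blockY)   { return blockY - Y_CENTER; }

// The object is considered centred when both errors fall inside a dead band.
bool isCentred(int blockX, int blockY, int tolerance) {
  return abs(horizontalError(blockX)) < tolerance &&
         abs(verticalError(blockY))  < tolerance;
}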
5.2 Infrared sensor
A simple matched emitter/receiver infrared sensor (figure 6) is mounted on the plier. As the emitter and receiver are very close (a few centimetres apart), it is not necessary to modulate the IR light.
When the IR beam is cut, the presence of an object between the fingers is detected; it is then possible to close the fingers and catch the object.
Figure 6: Simple IR sensor
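Reading this sensor from the Arduino only requires one digital input. The sketch below is an assumption about the wiring and logic level (not specified in the project), not the actual student code:

// Assumption: the receiver output goes HIGH when the IR beam is cut.
const int IR_RECEIVER_PIN = 7;      // digital input from the IR receiver
const int IR_EMITTER_PIN  = 8;      // the emitter LED is simply kept on

void setup() {
  pinMode(IR_RECEIVER_PIN, INPUT);
  pinMode(IR_EMITTER_PIN, OUTPUT);
  digitalWrite(IR_EMITTER_PIN, HIGH);   // continuous (non-modulated) IR light
}

bool objectBetweenFingers() {
  return digitalRead(IR_RECEIVER_PIN) == HIGH;
}

void loop() {
  if (objectBetweenFingers()) {
    // close the fingers here (see section 7)
  }
}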
6. Global system architecture
Finally, the global hardware architecture of figure 7 is adopted. It includes the power supply and the processor board which manages the movement of the arm. The servo interface board is just a hardware interface to connect the six servomotors of the arm. The CMUCAM is fixed just above the arm plier (to see the scene), while the infrared sensor is mounted on the plier fingers (emitter on one finger and receiver on the other).
(Figure 7 blocks: +12 V / +5 V / +3.3 V power supply, Arduino Uno microcontroller board, servo motor interface board, infrared sensor on a digital input, CMUCAM on the I2C bus.)
Figure 7: Global architecture
7. Algorithm for moving the Braccio arm towards the detected object
The heart, and most interesting part, of the project is probably the understanding and definition of a strategy to catch the green glass: which joint should be moved? Which sequence should be programmed? In short, how to reproduce what a human is able to do easily without thinking?
Neither mathematical modelling nor trajectory equations were used. The strategy was set up using a biomimetic approach [18]:
One student first tried to catch the glass on the table with his own arm, while another observed and recorded the movement of each segment of the arm. After several attempts, it was possible to write an algorithm reproducing the true movement as closely as possible.
The principle is explained hereafter:
After power-on, an initialization phase sets the position of all the arm segments. When the system is ready, the camera regularly returns the number of blocks detected in its field of view. If no object is detected, the base servo turns, performing a horizontal scan of the surrounding space to find a coloured object. When an object enters the field of view of the camera, the base and wrist servo motors are moved to place the object at the centre of the image.
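One simple way to implement this centring behaviour is a proportional correction of the base and vertical wrist angles driven by the block position error. The fragment below is only a sketch under assumed gains and helper names, not the students' implementation:

int baseAngle  = 90;               // current commanded angles, in degrees
int wristAngle = 90;

void centringStep(int blockX, int blockY) {
  // Errors relative to the image centre (X: 0..400, Y: 0..200, see figure 5).
  int ex = blockX - 200;
  int ey = blockY - 100;

  // Small proportional gain: a fraction of a degree per pixel of error.
  baseAngle  = constrain(baseAngle  - ex / 20, 0, 180);
  wristAngle = constrain(wristAngle + ey / 20, 0, 180);

  // The new angles are then sent to the arm, e.g. with Braccio.ServoMovement(...).
}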
Then, the movement is stopped for a short while and the camera stares at the object without moving, like a human eye would do.
Then, depending on the situation and the relative position of the wrist compared to the forearm (normal, bent or aligned; cf. the schematic view of figure 8a), one of three possible strategies to move towards the object is adopted.
(Figure 8a labels: base, shoulder, elbow, wrist, arm, forearm, hand and fingers, shown for the three wrist positions: normal, bent and aligned.)
Figure 8a: Wrist position and angles definition
1) If the wrist is in the normal position:
The shoulder must move towards the object (Figure 8b). During the movement, the feedback control process moves the wrist servo to keep the object centred in the image. Thus, the wrist necessarily becomes aligned with the forearm.
Figure 8b: Movement in the first situation
2) If the wrist is bent:
The movement of the shoulder is reversed compared to the first situation (figure 8c), and the wrist also becomes aligned.
Figure 8c: Movement in the second situation
3) Once the wrist is aligned with the forearm:
The object is now centred exactly between the two wide-open 'fingers' (figure 8c). The wrist is not moved anymore. The elbow is unfolded and the arm is stretched out by moving the shoulder towards the object (figure 8d).
Figure 8c: Object being centred between fingers
Figure 8d: Approaching the object
If the arm reaches its maximum extension, it means that the object is too far away to be caught; the arm then returns to a default position. But if the infrared beam is cut, it means that an object is passing between the fingers: the movement is stopped, the fingers are closed and, after 1.5 seconds, the arm returns to its initial position carrying the glass.
The global strategy is summarized in the flowchart of figure 9.
Programming such a motion strategy required the students to write a few hundred lines of C code. The program remains quite compact thanks to the use of predefined high-level functions for the control of the camera and the servomotors.
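To give an idea of the program structure, a simplified and hypothetical skeleton of this strategy is sketched below; the state names and helper functions are illustrative only, the real helpers wrapping the Pixy and Braccio calls described in the previous sections.

// Simplified skeleton of the catching strategy of figure 9 (stubs stand for the
// real sensing/actuation helpers built on the Pixy and Braccio libraries).
enum State { SCANNING, CENTRING, APPROACHING, GRIPPING, RETURNING };
State state = SCANNING;

bool objectDetected()      { return false; }   // stub: pixy.getBlocks() > 0
bool objectCentred()       { return false; }   // stub: error inside the dead band
bool irBeamCut()           { return false; }   // stub: IR receiver reading
bool armFullyExtended()    { return false; }   // stub: shoulder at its limit
void scanStep()            { }                 // stub: small base rotation
void centringMove()        { }                 // stub: base + wrist feedback
void shoulderStep()        { }                 // stub: shoulder forwards/backwards
void closeFingers()        { }                 // stub: gripper command
void goToNeutral()         { }                 // stub: default position
void goToInitialPosition() { }                 // stub: return carrying the glass

void setup() { goToNeutral(); }

void loop() {
  switch (state) {
    case SCANNING:                             // horizontal scan with the base servo
      if (objectDetected()) state = CENTRING; else scanStep();
      break;
    case CENTRING:                             // keep the object at the image centre
      centringMove();
      if (objectCentred()) { delay(500); state = APPROACHING; }
      break;
    case APPROACHING:                          // shoulder move, elbow unfolding, arm stretched
      if (irBeamCut())             state = GRIPPING;
      else if (armFullyExtended()) { goToNeutral(); state = SCANNING; }   // object too far
      else                         shoulderStep();
      break;
    case GRIPPING:                             // object between the fingers
      closeFingers();
      delay(1500);                             // 1.5 s before lifting the glass
      state = RETURNING;
      break;
    case RETURNING:
      goToInitialPosition();                   // bring the glass back
      state = SCANNING;
      break;
  }
}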
(Flowchart of figure 9: initialisation; scanning around by base rotation until an object is detected; centring the object with base and wrist rotation; pause; wrist position test, moving the shoulder forwards (normal wrist) or backwards (bent wrist) until the wrist is aligned; elbow unfolding and shoulder extension until the IR beam is cut or the object is found too far; then pressing the fingers, catching the glass and stopping, or returning the arm to its neutral position.)
Figure 9: Software algorithm
8. Validation tests and results
Some video clips were recorded. For these tests, the Braccio arm is powered by an external power supply. The pictures presented below (figure 10 series) are captured from the video clips.
Figure 10a: Searching for the green glass
Figure 10b: Looking in the right direction
Figure 10c: Opening fingers
Figure 10d: Catching the glass with the fingers
Figure 10e: Taking the glass
Several tests were performed with the glass placed closer to or farther from the arm. Once the object is detected and the fingers point in the right direction, it takes around 7.7 seconds to catch it. However, the system is not perfect: some erratic
behaviors still occur and sometimes the arm misses
the glass.
9. Project assessment
9.1 Technical assessment
As explained before, this student project was only a simple initiation to robotics and image processing. Indeed, within a 40-hour framed project, it is obviously impossible to design a system as complex and safe as those found in the automotive industry, for example. However, our technical goal was reached. After programming, the students were very satisfied to see their small arm moving and catching the glass on the table: a nice performance with only modest, non-professional equipment (an open source platform and a low-cost plastic arm).
The only weak point is that, because of the modular design, the students only had an overview and a global approach. They used the electronic modules as 'black boxes' and did not study their internals in depth. Hence, some subtle electronic details (such as pull-up resistors on the I2C bus, pin current sourcing capability, etc.) and other characteristics were not assimilated.
9.2 Didactical assessment
- Freedom and autonomy during the design give the students the impression of being more creative and more responsible for their project.
- The practical aspects of the project (the robotic arm is moving) are a source of interest and motivation.
- Even if academic scientific knowledge must obviously be taught and transmitted, this type of project develops the necessary 'know-how' as a supplement to that knowledge.
- The absence of high-level mathematical considerations during the project forces a practical and physical understanding of the concepts. Instead of applying formulas like automatons, the students have to develop other mental paths such as common sense, reasoning and imagination.
- The system approach connects different fields of electronics (analogue, digital, sensors, microcontroller programming and motor driving) which often seem disconnected to the students because of the segmentation of theoretical courses.
- Lastly, as a collateral benefit, this initiation project fits the needs of our student robotics team and may help them prepare for the annual French national robotics contest [19].
9.3 Comparison with previous methodologies
It is quite difficult to compare teaching methodologies, since they change at the same time as students' needs and behaviour. Comparing two methodologies properly would have required testing both methods on two student samples from the same generation; unfortunately, this was not possible because of timetable constraints. What we can observe is that learning by project is more suitable today, especially after the covid period [20], [21], because of the loss of theoretical bases and of the ability to focus during these last two years.
10. Conclusion
The study and design of a robotic arm with its control camera have been presented, within the framework of our second-year student projects at the ENSEIRB-MATMECA school. The technical and didactical results are very encouraging. Thanks to a modular design and a system approach, the project was finished on time and the students were satisfied with the technical content. It was a good initiation and a good preparation for their final industrial project in the third year of study.
This didactical and practical work seems to be an interesting answer to the students' needs and to the general evolution of their behaviour. Thus, similar projects will be proposed in the coming years.
References:
[1] C. Tan, "The impact of COVID-19 on student motivation, community of inquiry and learning performance", Asian Education and Development Studies, Vol. 10, No. 2, 2021, pp. 308-321, Emerald Insight.
[2] Ph. Dondon, J. Micouleau, G. Leroyer, N. Daddato, "Introducing embedded system concept through a multi thematic funny hexapod robot design project", WSEAS EDUTE'08, Corfu (Greece), 2008.
[3] Thu T. K. Le, "Project-based Learning in 21st Century: A Review of Dimensions for Implementation in University-level Teaching and Learning", 4th ICEAC International Conference on English Across Cultures, 2018.
[4] A. Markula, M. Aksela, "The key characteristics of project-based learning: how teachers implement projects in K-12 science education", Disciplinary and Interdisciplinary Science Education Research, 4, 2, 2022.
[5] O. Haatainen, M. Aksela, "Project-based learning in integrated science education: Active teachers' perceptions and practices", LUMAT: International Journal on Math, Science and Technology Education, 9(1), pp. 149-173, 2021.
[6] M. Avila, J.C. Bardet, S. Begot, P. Vrignat, N. Stride, « La pédagogie par projets », CETSIS, Nancy, France, 25-27 October 2005.
[7] F. Vincent, B. Mouton, C. Nouals, « Radar de poursuite », CETSIS, Toulouse, France, 13-14 November 2003.
[8] Arduino open source web site: https://www.arduino.cc/
[9] On-line "Braccio quick start guide": https://docs-emea.rs-online.com/webdocs/14da/0900766b814da22f.pdf
[10] Ph. Dondon, P. Greselle, manuel d'utilisation CMUCAM, ENSEIRB, 2007.
[11] B. Rahmani, A. E. Putra, A. Harjoko, T. K. Priyambodo, "Review of Vision-Based Robot Navigation Method", IAES Int. J. Robot. Autom., Vol. 4, No. 4, pp. 31-38, 2015.
[12] B. Rahmani, H. Aprilianto, H. Ismanto, H. Hamdani, "Distance Estimation based on Color-Block: A Simple Big-O Analysis", International Journal of Electrical and Computer Engineering, Vol. 7, No. 4, August 2017, pp. 2169-2175.
[13] On-line OV9715 sensor data sheet: http://www.ovt.com/sensors/OV9715
[14] PixyMon application open platform: http://www.cmucam.org/projects/cmucam5/wiki/PixyMon_Overview
[15] On-line Pixycam documentation: https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:overview
[16] O. Yaseen Ismael, J. Hedley, "Development of an Omnidirectional Mobile Robot Using Embedded Color Vision System for Ball Following", American Scientific Research Journal for Engineering, Technology, and Sciences, Vol. 22, pp. 231-242, 2016.
[17] Z. Lin, X. Ping, "Design of the Vision Positioning Grab Mobile Robot System", International Conference on Artificial Intelligence and Computer Engineering, 18-19 June 2016, Wuhan, China.
[18] G. Boeuf, « Biomimétisme et bio-inspiration », Vraiment durable, pp. 43-55, Victoires éditions, 2014.
[19] French robotic cup web site: https://www.coupederobotique.fr/
[20] M. D. Guevara Espinar, J. Josy Lévy, « COVID-19, confinement et répercussions chez des étudiant(e)s universitaires espagnol(es) : une analyse exploratoire », Enfances Familles Générations [online], articles in press, Numéro 40 - Dossier Familles au temps de la COVID-19, published on-line 24 May 2022.
[21] France3 TV web site: https://france3-regions.francetvinfo.fr/auvergne-rhone-alpes/puy-de-dome/clermont-ferrand/clermont-ferrand-etude-montre-que-confinement-aggrave-inegalites-entre-etudiants-1842874.html