Sunday, November 10, 2013

RS4 - Self balancing Raspberry Pi OpenCV image processing Robot

Here is the robot that I'm working on. You can see the latest video here, although it has been modified since then.


I'll divide this description into topics, as it's easier for me to describe it this way. The idea to build this robot came from buying a Raspberry Pi: when I saw it I said, "I've got to build a robot with this" :) . I have built other robots in the past, but this one is the most complex and the first with image processing.

Chassis 

I designed the robot chassis myself. I used a 3D tool to generate some previews, mainly because I needed an idea of the size and component distribution before building it. Here you can see the model of the robot:





After this I began the building process. I bought a carbon fiber plate (more or less the size of an A4 sheet) and cut all the pieces by hand with a mini drill machine (unfortunately I don't have a CNC machine to do this job). I bought some aluminum profiles to make spacers and fixing parts, as you can see in the next photo. The result is a very light and strong chassis.





Motors and wheels

I'm using stepper motors in this robot, for no special reason. I bought them as Nema 17 motors; the motor reference is LDO-42STH38-1684A 121121 LDO MOTORS. This type of motor has a nice robust look and is usually used in CNC and RepRap machines.
The wheels are from an RC 1/8 buggy; you can find them easily in any RC store as they are a standard size. What I like most about these wheels is their soft touch: they act as a damper for small obstacles, allowing a smooth run.

To connect the wheels to the motors I used Traxxas Revo hubs and nuts, as shown in the photo. These are the only ones I found with a 5 mm hole, the same as the motor shaft, so it is more or less plug and play.




Head

For pan and tilt I use 2 micro servos (Tower Pro MG90S), very cheap and easy to get. The head has a holder for the Raspberry Pi camera module, an ultrasonic sensor and 2 RGB LEDs.
You can see some details of the robot in the next photos.




Balancing and motor control board

This robot uses a dedicated board for balancing and motor control (I want to use the Raspberry Pi only for high-level tasks). This board is my own design and it uses the following components:
 - 2 L298 + 2 L297 stepper motor drivers (yes, I know they are old, but they are cheap and easy to find; in a future revision I'll use something from this century :) );
 - Murata ENC-03 gyroscope, an analog single-axis gyro, very easy to use;
 - MMA7361L accelerometer, a 3-axis analog accelerometer (I use a module, as this chip is too small for hand soldering);
 - PIC24FJ64GA002 microcontroller.
It allows I2C and serial communication. Photo of the board and the motors here:


Servo control board

I'm using a modified motor board to control the two servos and to read the ultrasonic sensor (not yet being used). This is a temporary solution; I intend to design a dedicated servo control board or buy one.


Power

The robot is powered by a 2000 mAh 3S LiPo battery. To generate the required voltages I'm using one 3.3 V regulator and two 5 V switching regulators. I want to design a dedicated power board in a future revision.

Balancing control 

PID

Balancing control is performed by a PID cascade, as shown in the next picture. This way it is possible to balance the robot even if you move the center of mass or run it on a ramp: it will find a new equilibrium angle that keeps it balanced and stopped. In fact, both controllers are PI only; the derivative gain is set to 0 because it makes the robot shake even with a small gain.




  PID implementation is as simple as this:
    pTerm = Kp * error;
    sum += error;
    iTerm = Ki * sum*Ts;
    dTerm = Kd * (error - lastError) / Ts;
    Cn = pTerm + iTerm + dTerm;
    lastError = error;
For PID tuning I used a Bluetooth module, which allows me to adjust Kp, Ki and Kd for both controllers in real time. This way you can immediately see the effects and reach the desired behavior for the robot. In this video you can see it successfully balancing for the first time.
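As a rough illustration of what such a tuning interface can look like (this is only a sketch, not the actual firmware; the one-letter command format, the handler name and the extern gains are assumptions), a minimal gain-update handler fed with the bytes received from the Bluetooth serial link could be:

  #include <stdlib.h>

  extern float Kp, Ki, Kd;                /* gains of the controller being tuned */

  void handleTuningChar(char c)
  {
      static char buf[16];
      static int  len = 0;

      if (c == '\n') {                    /* end of command, e.g. "p12.5\n" sets Kp */
          buf[len] = '\0';
          float value = (float)atof(&buf[1]);
          if      (buf[0] == 'p') Kp = value;
          else if (buf[0] == 'i') Ki = value;
          else if (buf[0] == 'd') Kd = value;
          len = 0;
      } else if (len < (int)sizeof(buf) - 1) {
          buf[len++] = c;                 /* accumulate command characters */
      }
  }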




Sensor fusion

Sensor fusion (gyroscope + accelerometer to get the leaning angle) is performed by a Kalman filter. There's not much to say about it; it works really well. Follow this fantastic tutorial, which has everything you need to know, including explanation and implementation.
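For reference, here is a minimal sketch of the usual single-axis angle + gyro-bias Kalman filter; the noise constants are placeholders, not the values used on this robot:

  float Q_angle = 0.001f, Q_bias = 0.003f, R_measure = 0.03f;  /* placeholder tuning */
  float kAngle = 0.0f, kBias = 0.0f;
  float P00 = 0, P01 = 0, P10 = 0, P11 = 0;                    /* error covariance */

  /* newAngle: accelerometer angle, newRate: gyro rate, dt: sample time in seconds */
  float kalmanUpdate(float newAngle, float newRate, float dt)
  {
      /* predict: integrate the bias-corrected gyro rate */
      float rate = newRate - kBias;
      kAngle += dt * rate;
      P00 += dt * (dt * P11 - P01 - P10 + Q_angle);
      P01 -= dt * P11;
      P10 -= dt * P11;
      P11 += Q_bias * dt;

      /* update: correct the estimate with the accelerometer angle */
      float S  = P00 + R_measure;
      float K0 = P00 / S, K1 = P10 / S;
      float y  = newAngle - kAngle;
      kAngle += K0 * y;
      kBias  += K1 * y;
      float p00 = P00, p01 = P01;
      P00 -= K0 * p00;  P01 -= K0 * p01;
      P10 -= K1 * p00;  P11 -= K1 * p01;
      return kAngle;
  }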
OK, the robot is balanced, but now it is necessary to move it. Moving forward and back is quite easy with this PID cascade setup: you just have to give a set point to the first controller and it will calculate the appropriate leaning angle to reach that speed. Something like this:

  setAngle = calcCn1(instSpeed - setSpeed);
  instSpeed = calcCn2(angle - setAngle);
To turn the robot I attenuate the speed of one wheel, depending on the side it needs to turn to. This way the robot keeps its balance, as both wheels still reflect the control-system speed. The implementation looks like this:
  instSpeedL = instSpeedR = instSpeed;
  motorSpeedL(instSpeedL * factorL);
  motorSpeedR(instSpeedR * factorR);
  0 ≤ factorL ≤ 1,     0 ≤ factorR ≤ 1
To perform spins (rotating in place), I give an opposite offset speed to each wheel. With the wheels rotating at symmetric speeds the robot spins and stays balanced. Completing the implementation, it looks like this:
  motorSpeedL(instSpeedL * factorL + spinSpeed);
  motorSpeedR(instSpeedR * factorR - spinSpeed);
If spinSpeed is positive the robot spins clockwise; otherwise it spins counter-clockwise.
That's the way I found to control the robot's motion; there are possibly other methods. Another important thing is that with stepper motors you shouldn't apply big speed changes abruptly or they will slip. This can be solved with a low-pass filter applied to factorL/R and spinSpeed, as sketched below, and it works well on my robot. In this video you can see a run with Bluetooth control; it can run faster than this but will easily fall if it hits some small bumps on the road.
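One simple option is a first-order low-pass (exponential smoothing); here is a minimal sketch, where the smoothing factor and the call site are only illustrative:

  float lowPass(float current, float target, float alpha)   /* 0 < alpha <= 1 */
  {
      return current + alpha * (target - current);
  }

  /* applied every control cycle, for example:
     factorL   = lowPass(factorL,   targetFactorL,   0.05f);
     factorR   = lowPass(factorR,   targetFactorR,   0.05f);
     spinSpeed = lowPass(spinSpeed, targetSpinSpeed, 0.05f); */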


Raspberry Pi

I'm using a Raspberry Pi model B 256 MB with a micro SD adapter because of the limited space on the robot. I have a small WiFi adapter, but the robot is not using it yet. The installed operating system is Raspbian. I managed to get OpenCV working with the camera module thanks to this tutorial; great stuff there.
At the moment I'm using serial communication between the Raspberry Pi and the motor board and servo control board, but I intend to use I2C as it is a more appropriate method. The reason I'm using serial now is that the interface code was already done for the Bluetooth module (it is a cheap serial Bluetooth module). I still have to spend some time working on the I2C interface.
The serial interface with the Raspberry Pi is quite easy; you just have to disable the terminal (login console) on that port. I'm using the WiringPi library for serial communication and to control the Pi's GPIOs, without any issues.
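For reference, a minimal example of serial I/O with WiringPi (wiringSerial.h); the device path, baud rate and command string are assumptions, adjust them to your setup:

  #include <wiringSerial.h>
  #include <stdio.h>

  int main(void)
  {
      int fd = serialOpen("/dev/ttyAMA0", 115200);   /* Pi's on-board UART */
      if (fd < 0) { perror("serialOpen"); return 1; }

      serialPuts(fd, "S+100\n");          /* send a command to the motor board */
      while (serialDataAvail(fd) > 0)     /* print any reply bytes */
          putchar(serialGetchar(fd));

      serialClose(fd);
      return 0;
  }

Build it with the WiringPi library linked in (e.g. -lwiringPi).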

Image processing

I have very little experience with image processing; this is the first time I'm using OpenCV and I'm still learning how to use it. My first example is object tracking (a ball) by color filtering, as in this tutorial.
It works well but is sensitive to lighting changes. At the moment I'm using the YCrCb color space instead of HSV, but the results are similar. With the object coordinates on the screen I control the servos to point the camera at the object, and I control the robot's direction based on the head angle.
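Below is a minimal sketch of this kind of color-filter tracking in OpenCV (C++). It uses a generic cv::VideoCapture and placeholder YCrCb thresholds, so it is not the exact code running on the robot (the Pi camera module also needs its own capture code):

  #include <opencv2/opencv.hpp>

  int main()
  {
      cv::VideoCapture cap(0);
      if (!cap.isOpened()) return 1;

      cv::Mat frame, ycrcb, mask;
      while (cap.read(frame)) {
          cv::cvtColor(frame, ycrcb, cv::COLOR_BGR2YCrCb);
          cv::inRange(ycrcb, cv::Scalar(0, 150, 90), cv::Scalar(255, 200, 130), mask);
          cv::erode(mask, mask, cv::Mat());               /* remove small noise */
          cv::dilate(mask, mask, cv::Mat());

          cv::Moments m = cv::moments(mask, true);
          if (m.m00 > 500) {                              /* enough pixels: object found */
              int x = (int)(m.m10 / m.m00);               /* object center in the image */
              int y = (int)(m.m01 / m.m00);
              /* the offset of (x, y) from the image center is what would drive
                 the pan/tilt servos toward the object */
              (void)x; (void)y;
          }
      }
      return 0;
  }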
Ball following was the first simple example that integrated all the parts of the robot; its behavior was funny and I decided to publish the video on YouTube.

Final remarks

This robot is an ongoing project; I'm continuously building new parts and modifying others. I don't have a defined goal for it, but I would like to give it some autonomous navigation capabilities. It has real potential; I just have to work on the image processing and learn some more techniques. I intend to add a speaker too.
The initial robot sketch had 2 arms. They would look cool, but they are a lot of work to build and I'm aware that it is hard to give them a useful function like grabbing objects. I could use the arms to get the robot back on balance after a fall, maybe in a future update.
I have implemented odometry in this robot, although at the moment I'm not using it. A 3-axis gyro would be very useful to correct odometry angle errors, a point to review in a future revision.
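For what it's worth, odometry with steppers is plain dead reckoning from the step counts; here is a minimal sketch of the idea, with illustrative constants that are not the robot's real dimensions:

  #include <math.h>

  #define STEPS_PER_REV  400.0f    /* e.g. 200 full steps with half stepping (assumed) */
  #define WHEEL_RADIUS   0.06f     /* meters (assumed) */
  #define WHEEL_BASE     0.20f     /* wheel-to-wheel distance, meters (assumed) */

  float poseX = 0, poseY = 0, poseHeading = 0;

  /* call periodically with the steps counted on each wheel since the last call */
  void updateOdometry(long stepsL, long stepsR)
  {
      float dL = 2.0f * 3.14159265f * WHEEL_RADIUS * stepsL / STEPS_PER_REV;
      float dR = 2.0f * 3.14159265f * WHEEL_RADIUS * stepsR / STEPS_PER_REV;
      float d  = 0.5f * (dL + dR);               /* distance traveled by the center */
      poseHeading += (dR - dL) / WHEEL_BASE;     /* heading change in radians */
      poseX += d * cosf(poseHeading);
      poseY += d * sinf(poseHeading);
  }

This is where the 3-axis gyro mentioned above would help, by correcting the drift of the heading estimate.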




88 comments:

  1. how do you measure the robot's speed?
    thanks

    ReplyDelete
  2. You don't need to measure speed with steppers. They just do the steps that you command; your input is the real speed, unless they slip.

    ReplyDelete
    Replies
    1. Hi! I'm not sure if this page is still active but are you saying you feed the output of the second loop back into the first loop?

      Delete
    2. Hello,
      It works like you see in the figure above. The first PID gives the set point to the second.

      Delete
  3. Hi!
    I wonder whether the 2 PID controllers, for angle and speed, are similar?
    I think output += pTerm + iTerm + dTerm for the speed PID controller
    and output = pTerm + iTerm + dTerm for the angle PID controller.
    What do you think about that?
    Thanks!

    ReplyDelete
    Replies
    1. Hello Kim

      Yes, you are right. Both controllers are the same in implementation but with different gains (Kp,Ki).

      Delete
  4. Hello!
    I've been trying to use OpenCV for a robot project as well, but it has proved to be a daunting task.
    How did you cross-compile for the RPi? Did you program and debug in VS?
    Thanks!

    ReplyDelete
  5. Hello André (are you Portuguese?)
    I'm not cross-compiling OpenCV; I'm programming directly on the Pi.
    Sometimes it takes a long time to compile.

    ReplyDelete
  6. Hello!
    Thanks for your reply. Yes, I am :P I'm in Póvoa de Varzim, what about you?
    I tried to do all the programming on the RPi with Geany, but I gave up on that idea because of the speed. I thought I could do a lot better with a familiar interface like VS along with the speed of a few GHz of CPU, but I failed to compile the OpenCV libraries for Windows, with errors I know nothing about and could not find solutions for on the web.
    So I tried Linux on a VM, and it worked, apart from some hardware problems related to passing the USB camera from the host OS to the VM OS.
    I finally decided to grab an old computer with Linux and compiled the libraries there. It is working fine (with some adjustments), but the best I can do is write and debug the code there, then grab that *.cpp, place it on the Pi and compile it there. So far all the code that worked on the computer also worked on the RPi, which shows good compatibility between OpenCV versions.
    Your work is very impressive and it's very nice to see that robot doing its thing.
    Please keep posting new stuff!
    Thanks.

    ReplyDelete
    Replies
    1. I asked because your name looked Portuguese. I'm living in Porto.
      I'm doing something similar to what you're doing: I'm using a Linux virtual machine with OpenCV on the Eclipse IDE. I do all my coding first on the computer and, when I get something I like, I compile it on the RPi. I've never had a problem with compatibility between OpenCV versions.

      From now on it will be hard to work on the robot; I have a new job and not much time for it. If you need something, just ask.

      Delete
    2. We're actually quite close to each other. I wanted to see if I could master OpenCV as fast as possible because I want the robot I'm working on to be ready by January, but I'm a bit lost with the OpenCV workflow. I'm really not very good with C++, so it's being difficult to write the code for what I want. The arrow functions that you developed are exactly one of the features I was thinking of putting in the robot. How were you able to master OpenCV? Did you follow tutorials or something? Sad to know you will have little time for the self-balancing robot, but I hope to see some more amazing projects from you.

      Delete
  7. This comment has been removed by the author.

    ReplyDelete
  8. Hi!
    I was wondering, is the Raspberry Pi self-contained? What I'm trying to say is: did you run OpenCV on a monitor and then disconnect the Pi, or is it communicating in some way, maybe wirelessly, with a monitor?

    ReplyDelete
    Replies
    1. Hello,

      I'm using an external monitor, mouse and keyboard. To test it on the floor I unplug everything (with the application running). You can see it in the photo.

      Delete
  9. Do you recommend a place to start with openCV for robotics ?

    ReplyDelete
    Replies
    1. Hello

      I didn't follow any particular tutorial to begin with OpenCV. What you can do is install it on your PC and start playing with it. There are many examples of object tracking on the web that you can try. Even on YouTube you can easily find some tutorials on that.

      Delete
  10. Hi, I have just got my first Raspberry Pi, saw your project and thought it was cool and something I would like to build. Could I use a USB webcam instead of the Pi camera? How are the head servos driven, as it wasn't clear from the info you have shown? Is it from a separate driver or directly from the Raspberry Pi? Thanks for any help you can give.

    ReplyDelete
    Replies
    1. Hello Paul
      Yes, you can use a USB camera, but it will be slower than the Pi camera board (see the thinkrpi website). The head servos are not controlled directly from the Pi; a microcontroller generates the signals. Good luck with your work :)

      Delete
  11. Hi Samuel. Appreciated! Cool job! :)
    I also tried to build a first self-balancing draft a week ago. The physics of the "Brushless Gimbal Controller" project are not really appropriate for a self-balancing robot, but anyway I got it standing :) http://www.youtube.com/watch?v=4eFs3Q4aCfk :) Next step: I'm going to adapt the MultiWii project (I'm an old fan and a former MultiWii developer) with software drivers for brushless gimbal motors...

    Also, is it possible to look through the code of your project? Is it open source?

    thx-
    Alex

    ReplyDelete
    Replies
    1. At the moment my code is not open source. I have no time for this project at the moment. I'll try to update the blog with more information soon.

      Delete
  12. Good morning Samuel, excellent work. We have a small project called Jabutino, using an Arduino; when you can, look it up on YouTube. I was working on a hexapod when I came across your project... I stopped everything... dusted off my Raspberry Pi... Fantastic. I would really like to learn from your project. I have some ideas for the social inclusion of people with disabilities... I imagine a mute person interacting with a robot like yours through gestures and sign language, or a child learning geometric shapes and colors with the robot... But here is what I would like to know, if you can tell me, please feel free: can you give a tip on how to make the robot chase the ball, or recognize an instruction from small signs? Many thanks, Luis Oliveira (Brazil, Salvador-Bahia)

    ReplyDelete
    Replies
    1. Hello Luís,

      There is some information here on the blog about how the computer vision strategies were implemented with OpenCV. I want to add more information here, including code that can be tested, but for now I haven't had much free time. When I can, I will update the blog with more information.

      Delete
  13. Hello Samuel, how have you been? Well, I hope!
    It was quite a lot of work to get the Raspberry's native camera working, even following the tutorial you recommended; fortunately, after a lot of struggling, everything worked out.
    I'm following the tutorial www.youtube.com/watch?v=EEajP-dGTLY and it is not easy. I'm having problems running the ObjectTrackingTut.cpp program. It compiles without any errors, but when I run it the following error appears: Assertion failed ((scn ==3 || scn == 4) && (depth ==CV_8U || depth == CV_32F)) in cvtColor, file color.cpp, line 2957. Sorry to abuse your patience, but do you have any idea what it could be? The camera is working perfectly with the test programs. I'm using OpenCV 2.3.8.

    ReplyDelete
  14. Hi, well done! I am doing something similar for a school project; we are tracking a color using the Raspberry Pi camera board. I find that when running OpenCV on the Raspberry Pi, my image tracking runs very, very slowly (about 2-5 fps), which is entirely too slow for real-time object tracking. Your robot seems to respond very fast. How many fps do you get? How did you achieve this? Any tips would be great!

    ReplyDelete
  15. Hello

    Maybe you are using a higher resolution. I'm using 320x240 and it works well. With color tracking I think I get around 15 fps.

    ReplyDelete
  16. Hi
    A big thank you for your explanations.
    I could not find how to limit the speed!
    Your method works too well.
    Sorry for my bad English...
    Regards
    François

    ReplyDelete
  17. Hi Samuel, compliments on your bot. Is the inner PID (angle) executed the same number of times per second as the outer PID (velocity), or is the inner PID executed more often than the outer?

    tks

    ReplyDelete
    Replies
    1. Hi :)

      Both are executed at the same time interval, every 10 ms.

      Delete
  18. Hi Samuel.
    Why did you decide to use a Kalman filter and not a complementary filter?

    What advantages does the Kalman filter have over the complementary filter? More accurate? Faster?

    Do you use I2C for the gyroscope + accelerometer? Bit-banged? Or SMBus?

    Thank you.

    ReplyDelete
    Replies
    1. Hello Juan,

      I have simulated both filters and the Kalman is a little better; it is also more complicated and needs more processing. You can try both and see what's best for you.

      I don't understand your last question...

      Delete
    2. Sorry for my english.

      How do you acquire the gyroscope data? Using I2C? Is the speed enough?

      What resolution do you use on the stepper motors? Does it change according to the state? For example: to go forward = full step;
      to stay balanced = 1/16?

      Thanks in advance

      Delete
    3. My gyro has an analog output, so I have to use an ADC to read it. I've used an I2C gyro in other projects and it is fast.

      I'm using half stepping on the motors; unfortunately the drivers don't allow better resolution. I have a redesigned motor board with better drivers, but it is only on paper...

      Delete
  19. This comment has been removed by a blog administrator.

    ReplyDelete
  20. How did you make it so that your robot doesn't return to its spot when you push it? I made a similar balancing robot with a cascade PID, but mine always comes back to the spot, and it is difficult to control its movement because after stopping it always turns back. In your video I can see that your robot always stays in the new place after being pushed.
    I used DC motors instead of steppers and I measure the speed with encoders, but I think that doesn't matter. The PID looks and works the same...

    ReplyDelete
    Replies
    1. Hi Hubert,

      In fact my robot has the same behavior that you describe. In the video where you see me pushing it with my hand, it is working with only one PID stage; this way it doesn't return to the starting point.
      I don't see why this is a problem for you, can you explain?

      Delete
    2. I wanted to make it move forward and backward (spins and steering I want to add later) with my TV remote control and an IR receiver. I did it this way:
      I use three buttons - "forward", "backward" and "stop".
      When I press the "forward" button I set the setpoint of the speed PID to a particular positive value. Then when I press the "stop" button I set this setpoint to 0, as it is by default when the robot doesn't move. "Backward" works the same as "forward" but I set a negative value as the setpoint.

      The speed PID calculates the leaning angle for the angle PID to start moving, but it doesn't move smoothly. It moves, for example, forward, but after a while it looks like it is trying to stop (it slows down) and then it accelerates again with a greater speed. After several such oscillations it falls down.

      Also, when I press "stop" while the robot moves forward or backward, it usually turns back to the place where it was before the movement. For instance, it goes 1 meter forward and after stopping it goes 1 meter backward, so effectively it stays in the same place.

      I don't know what I am doing wrong. I just change the setpoint of the speed PID, just like you wrote in your post. Maybe I should also add some stopping routine: not just immediately change the setpoint to 0, but for instance decrease it gradually until it reaches 0?

      Delete
    3. This comment has been removed by the author.

      Delete
    4. This video shows this situation.
      http://vimeo.com/102473404

      As you can see, it moves with oscillations and it also doesn't brake smoothly. It looks like my PID, especially the speed loop, isn't tuned well. When it tries to reach a setpoint speed it overshoots, so it slows down, then undershoots, and this process goes on forever. The same happens during braking: when I change the speed setpoint to 0 and the current speed is much greater than 0, it tries to reach 0 very quickly, so the robot brakes very sharply and becomes unstable.

      I have no idea how to tune it to work well. When it stands on the spot it looks good. It's very difficult to tune the PID while the robot moves... :(
      Did you have such problems while tuning your PID, including the speed outer loop? Maybe you did it in some special way...

      Delete
    5. It looks like a PID tuning problem; I've seen similar behavior in my robot, but with a lower oscillation frequency. I don't have any special way, just trial and error...
      In my case the speed set point is increased/decreased gradually, mainly because of the steppers: they slip if the speed increase/decrease is too big. Try it in your case.
      I'll try to make a video of the push behavior with the 2 PID stages for you to compare, it is very distinct.

      Delete
    6. You mean you don't just give the set point as a particular value, but you increase or decrease it gradually until this value is reached? I mean, it doesn't jump from 0 to 100 but is incremented in each loop as long as it isn't 100 (100 is only an example value).

      Delete
    7. Yes, just like that. See this video:
      https://www.youtube.com/watch?v=jL1LglQxQeQ

      See how the robot leans to counteract the disturbance, I don't see this behavior in yours.

      Delete
    8. This comment has been removed by the author.

      Delete
  21. Thank you very much for the video. Your robot returns to the spot so smoothly, without any oscillations. My robot also returns to the spot after a disturbance, but with oscillations. Now I know how it should look. Maybe I will tune this PID some day :) Thank you :)
    I would like to ask you only one more question. In the situation where you don't control your robot (like in the video) and it just stays on the spot, is the setpoint of the speed PID loop always 0?

    ReplyDelete
    Replies
    1. Yes, if it is stationary the speed setpoint is 0.
      You are going in the right direction; it just needs small adjustments to work properly.

      Delete
    2. Hi. I succeeded in making my robot move smoothly. It was just a matter of PID constants. Now I have my own PCB and a Raspberry Pi connected to it over serial, so I can tune it from my PC's keyboard (VNC server). I've spent some hours on it and achieved the following result:
      http://vimeo.com/113750618

      I think it works quite well. Now I can move on and add some vision algorithms with OpenCV. Thank you very much for your help :)

      Delete
  22. This comment has been removed by the author.

    ReplyDelete
  23. Hello Samuel, I'm building a robot and I've already managed to get the PID working with the MPU6050 and a Kalman filter.
    I'm having difficulty getting the stepper motors to move smoothly with that motion-control PID; I'm using two L298 H-bridges.
    What can I do on the Arduino to improve the motors' movement?
    Regards

    ReplyDelete
    Replies
    1. Hello Nuno,

      What I do to make the motor movement smoother is to vary the current according to how much motion is needed. A lower current makes the steps smooth but loses torque; I use it when the robot is stopped or moving at low speed. If high speed is needed, I increase the motor current so it can keep up. I hope that helps.

      Delete
  24. pls send program code and processing

    ReplyDelete
  25. Hi Samuel, I don't know how to change the resolution of the raspicam from Python, could you help me?

    ReplyDelete
  26. Hi,

    your project is very impressive!
    Where can I get help with the schematics and how to connect everything?
    For example, how do I wire the accelerometer and the gyro sensors to the Raspberry Pi?

    thanks!

    ReplyDelete
  27. Would it be a good idea to use your project, with the Pi cam, OpenCV and Python, for pan-tilt control of my drone?
    Could you help me?

    ReplyDelete
  29. Hello Samuel,
    my name is Safri, from Indonesia.
    I'm very interested in your project,
    but I have a question: can I use this method for number detection?
    Thank you for your answer.

    ReplyDelete
  30. Hello, my name is Henrique

    I'm doing a self-balancing robot project.
    I saw that you tuned the PID gains over Bluetooth.
    In a PID cascade you have 6 gains. How did you tune them? Which ones did you tune first (the angle gains or the speed gains)?
    Can you show me your technique? I'm having problems tuning my cascade PID.

    Thanks

    ReplyDelete
  31. Hello Samuel, I'm making a dedicated board to control the steppers and I'm using an ATmega328P, but I'm having difficulty connecting the L297s to the microcontroller. Is there any chance you could give me the schematics of your board?

    ReplyDelete
  32. can you please post a parts list?

    ReplyDelete
  33. Hello, can you post or send me (e-mail: fot.farmakis@gmail.com) the schematic for the balancing and motor control board, or an example that could help?

    ReplyDelete
  34. This comment has been removed by the author.

    ReplyDelete
  35. This comment has been removed by the author.

    ReplyDelete
    Replies
    1. Great project!!!!
      Which version of Python and OpenCV do you use? And is it possible to get the code for it?

      Delete
  36. Can you please elaborate on how you tuned the PID using Bluetooth?

    ReplyDelete
  37. Hey! Your earliest video of the balancing robot tracking and following the ball really impressed the heck out of me! Pretty soon afterwards I set about building a similar version of your bot, sans the self-balancing part. Although I seem to have hit a bit of a roadblock and was wondering if you could help me out. I've used an RPi 2 + OpenCV + Python to build the project, and have come to the stage where the pan-tilt servos can accurately track a color-filtered object. However, upon integrating the ultrasonic sensor for the corresponding distance measurement, the RPi simply cannot handle the processing requirements, and the setup lags very badly. Do you have any thoughts?

    ReplyDelete
  38. Hi Samuel,

    I find your blog very inspiring and I would really like to get more updates. What plate did you use for the motor board?

    ReplyDelete
    Replies
    1. Hello Janina,
      I'm planning to return to this project, so you can expect some updates in the next months.
      What do you mean by plate?

      Delete
    2. Cool! I'm very excited :) I mean the motor board, which you designed yourself. Did you buy a special board/plate (I don't know the English word for the green board) or how did you make it?

      Delete
    3. Now I get it. I didn't produce it myself (although it is possible). I designed it and sent it for production in China; I don't remember the website, but I can check it if you want.

      Delete
  39. This comment has been removed by the author.

    ReplyDelete
  40. Hi Samuel, where did you get the hex adapters? I can't find them (17 mm hex to 5 mm shaft).

    ReplyDelete
  41. Hello,
    Check this link
    http://www.ebay.com/itm/Summit-HEX-HUBS-5353-nuts-splined-17mm-E-revo-E-maxx-brushless-3-3-Traxxas-5607-/371429288353?epid=1737666796&hash=item567ae929a1:g:bfoAAOSwJSJXF9an

    ReplyDelete
    Replies
    1. Thanks, unfortunately they don't ship to my country (Chile). I found similar items but the cost of shipping is like 4 times the cost of the item.

      Delete
    2. I have two that I can give to you, I don't need them. I need to check the cost of the shipping to Chile. I'm from Portugal.

      Delete
    3. Many thanks Samuel,

      I found an alternative adapter
      https://es.aliexpress.com/item/Black-Metal-Wheel-Hex-12mm-Turn-17mm-4P-RC-For-1-10-1-8-Buggy-Truck/32569879582.html?spm=a2g0s.9042311.0.0.l3G0Hu

      Delete
  42. I really like the "head" of your robot, and would like to use it in mine. Could you upload the 3d model for it?

    ReplyDelete
  43. How did you make the robot follow the ball? Did you use the ultrasonic sensor to get the distance from the obstacle, or did you use OpenCV to calculate the distance from a known object?

    Basically, how did you calculate the distance between the ball and the robot?

    ReplyDelete
  44. Hello Samuel, a quick question about the torque of the NEMA 17.

    I am using a DRV8825, and when I have a 1/8 wheel on the motor's shaft I can quite easily stop the motor. Could you please tell me whether this is subjectively normal?

    Thanks,
    Pawel

    ReplyDelete
    Replies
    1. Hello Pawel,

      That kind of behavior means that your driving current is too low. If you use a higher current you'll have more torque. Be careful when adjusting it; the driver can become really hot or burn out.

      Best,
      Samuel

      Delete
  45. Hey, I'm working on my final project about image processing; could you help me with the code for "servo for tracking object by its sizes"?

    ReplyDelete
  46. Hello Samuel
    Why is a PIC24FJ64GA002 microcontroller used? And can we use any other microcontroller if we want?

    ReplyDelete
  47. Hello Samuel
    Why is a PIC24FJ64GA002 microcontroller used in this project? And can we use any other microcontroller?

    ReplyDelete
    Replies
    1. Hello Himanshu, you can use any microcontroller that you want. There was no special reason to use a PIC. I'm not using PIC anymore, now I'm using STM32.

      Delete
  48. Hello Samuel, thank you for sharing your solution for the robot. Can you tell me how to control the servos attached to the camera? Thanks again.

    ReplyDelete
  49. Can you please send me the full code and schematic to my mail ID: maheshjshetty@gmail.com

    ReplyDelete
  50. Hi Samuel
    I am starting on this project. Can you tell me: when processing the image I will have the coordinates of the ball, so how can I control the motors to move with it?
    Is the PIC microcontroller used to balance the robot, and do you use the Raspberry Pi to control its movement?

    ReplyDelete