15/9/2024
Spent the whole day adjusting the x, y, z values and integrating the object detection module with the arm movement, so the arm moves to the detected object.
During one of the experiments, the end effector twisted in an unexpected way. The camera cable got yanked out and the tip broke, ending my session prematurely.
Thought of ways to prevent this from happening again:
1. Unplug the camera cable while testing the robot's movement
2. Loosen the cable / give it more slack
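The camera-to-robot step mentioned above can be sketched as a coordinate transform. This is a minimal illustration, assuming a 4x4 homogeneous transform T_base_cam from a prior hand-eye calibration; the matrix values here are made up, not the real calibration.

```python
# Sketch: map a point detected in the camera frame to the robot base
# frame via a homogeneous transform. T_base_cam below is a made-up
# example (camera 0.5 m above the base, axes aligned), not real data.

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T to a 3D point p = (x, y, z)."""
    x, y, z = p
    return tuple(
        T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
        for i in range(3)
    )

T_base_cam = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
]

p_cam = (0.1, -0.2, 0.3)                    # object in camera frame (m)
p_base = transform_point(T_base_cam, p_cam)  # target for the arm (m)
```

The x, y, z values being tuned by hand effectively play the role of this transform's offsets.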
16/9/2024
Managed to complete the entire process of detecting and picking up a single object. Based on my observations, when the object is nearer to the camera, the UR10 is able to pick it up (1st position to 5th position), but the further the object is from the camera, the more the accuracy drops.
I had a discussion with my supervisor about testing and evaluation. After further deliberation, I have decided to test the accuracy as follows: compare the object position obtained from the camera against the object position obtained by the robot, and measure the difference between the actual (hand-measured) position and the position obtained by the robot.
At the moment, I am only working on the vertical position, not the horizontal position.
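The planned accuracy measurement could be computed like this. A minimal sketch; the positions below are placeholder numbers, not real measurements.

```python
# Sketch of the accuracy metric: per-axis signed error and Euclidean
# distance between the camera-estimated position and a hand-measured
# ground-truth position. All values here are illustrative.

def position_error(estimated, actual):
    """Return (per-axis signed errors, Euclidean distance) for 3D points."""
    diffs = tuple(e - a for e, a in zip(estimated, actual))
    dist = sum(d * d for d in diffs) ** 0.5
    return diffs, dist

est = (0.412, -0.130, 0.255)   # position derived from the camera (m)
act = (0.400, -0.125, 0.250)   # manually measured position (m)
diffs, dist = position_error(est, act)
```

Recording the signed per-axis errors (not just the distance) would show whether the drop in accuracy at larger camera distances is a systematic offset or random noise.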
18/9/2024
Worked on the grippers. The system can now move to the object's location, pick it up, and place it onto a person's palm. However, accuracy when placing on the palm seems to depend on whether the whole palm is in the camera frame. When it is not, the x and y values become completely inaccurate (even moving in the opposite direction of the palm).
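One possible guard against the out-of-frame palm problem is to only trust the detected palm coordinates when its bounding box lies fully inside the frame. A sketch under assumed conventions (box as pixel corners, a hypothetical frame size and margin):

```python
# Sketch: reject palm detections whose bounding box touches the frame
# edge, since partial palms produced wildly wrong x/y values. The box
# format (x_min, y_min, x_max, y_max) and margin are assumptions.

def box_fully_in_frame(box, frame_w, frame_h, margin=5):
    """True if the box lies inside the frame with the given pixel margin."""
    x0, y0, x1, y1 = box
    return (x0 >= margin and y0 >= margin
            and x1 <= frame_w - margin and y1 <= frame_h - margin)

# A palm box touching the right edge of a 640x480 frame would be
# rejected; a fully visible palm would be accepted.
edge_box = (500, 200, 640, 300)
inner_box = (200, 150, 420, 330)
```

The robot could simply hold position and wait for the next frame whenever the check fails, instead of acting on a bad detection.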
20/9/2024
Today, I worked on getting the gripper to rotate according to the orientation of the object. I tried getting the system to recognise the orientation of the object and then calculate rx, ry and rz from the camera coordinates, but the end effector ended up twisting away from the object.
Therefore, I came up with a simpler solution: for now, the system only determines whether the object was placed horizontally or vertically (based on the bounding box), and the gripper then rotates by a fixed value depending on that orientation.
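The simplified rule described above can be sketched as follows. The classification from the bounding box is the idea from the notebook; the specific rotation values and box format are placeholders, not the actual UR10 pose values.

```python
# Sketch: classify the object as horizontal or vertical from its
# bounding box aspect ratio, then apply a fixed wrist rotation per
# case. Rotation values below are illustrative placeholders.

ROTATION_FOR = {"horizontal": 0.0, "vertical": 1.5708}  # rz (rad), approx. 90 deg apart

def classify_orientation(box):
    """box = (x_min, y_min, x_max, y_max); wider than tall -> horizontal."""
    x0, y0, x1, y1 = box
    width, height = x1 - x0, y1 - y0
    return "horizontal" if width >= height else "vertical"

orientation = classify_orientation((100, 100, 300, 160))  # 200 px wide, 60 px tall
rz = ROTATION_FOR[orientation]
```

Fixing rx and ry and only switching rz between two values sidesteps the unstable dynamic-rotation behaviour seen earlier.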
21/9/2024
For the rotation of the object to be placed on the palm, I discovered that I could use the MediaPipe Hands library to estimate 3D hand poses.
(Ref: https://mediapipe.readthedocs.io/en/latest/solutions/hands.html)
The idea was to have the object rotate 90 degrees away from the middle finger. However, this meant having dynamic rx, ry and rz values for the end effector. The result was as below.
Since I am unable to rectify the issue of having dynamic rx, ry and rz without the robot flipping in the above manner, and recognising that there is not enough time to fully understand and explore the MediaPipe library, I decided to forgo using this method.
Instead, the same method used for object detection and orientation was applied to the palm: by recognising the orientation of the bounding box (rectangle), the end effector rotates by 90 degrees (either horizontal or vertical).
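For the record, the abandoned middle-finger idea reduces to simple vector math once landmarks are available. A sketch using MediaPipe Hands' 21-landmark indexing (wrist = landmark 0, middle-finger tip = landmark 12); the landmark coordinates here are made up rather than taken from a real `hands.process()` call.

```python
import math

# Sketch of the forgone middle-finger rule: take the in-plane angle of
# the middle finger (wrist -> middle-finger tip) and offset the gripper
# 90 degrees from it. Landmark indices follow MediaPipe Hands; the
# coordinates below are fake stand-ins for real detection output.

WRIST, MIDDLE_TIP = 0, 12

def gripper_angle(landmarks):
    """In-plane gripper angle (rad): middle-finger direction + 90 degrees."""
    wx, wy = landmarks[WRIST]
    tx, ty = landmarks[MIDDLE_TIP]
    finger_angle = math.atan2(ty - wy, tx - wx)
    return finger_angle + math.pi / 2

# Fake landmarks: finger pointing straight up in image coordinates
fake = {WRIST: (0.5, 0.8), MIDDLE_TIP: (0.5, 0.4)}
angle = gripper_angle(fake)
```

Even with this angle computed correctly, converting it into a stable rx, ry, rz pose was the part that caused the flipping, which is why the bounding-box rule was used instead.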
Started the data collection for analysis by testing the number of times the system is able to complete the entire process of picking up the object and placing it. 10 different coordinates and 2 types of orientation (horizontal and vertical) were tested for both the object and the palm:
The number of tests will be increased to 20 tomorrow to provide a better understanding of the system.
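The test matrix above can be enumerated and tallied as follows. A sketch only: the recorded outcomes below are placeholders, not the actual trial results.

```python
from itertools import product

# Sketch: enumerate the 10 coordinates x 2 orientations test matrix
# and compute a success rate over recorded outcomes. The outcomes in
# `results` are made up for illustration.

coordinates = range(1, 11)                   # 10 test coordinates
orientations = ("horizontal", "vertical")

trials = list(product(coordinates, orientations))
results = {trial: True for trial in trials}  # placeholder: all succeeded
results[(7, "horizontal")] = False           # pretend one trial failed

success_rate = sum(results.values()) / len(results)
```

Splitting the rate by coordinate would also show whether failures cluster at the positions furthest from the camera, matching the earlier observation about accuracy dropping with distance.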