Pull from Fork #1

Open · wants to merge 5 commits into base: config
44 changes: 44 additions & 0 deletions Apriltags_Notes.md
@@ -0,0 +1,44 @@
* Installed the [AprilTag ROS package](http://wiki.ros.org/apriltag_ros)
* Takes in a camera feed
* Publishes a TF location for each AprilTag, so it can be shown in RViz
    * (need to research TF more to understand how to parse this / transform it into IK; see the lookup sketch below)
* The AprilTag must be listed in `config/tags.yaml`
    * must list the tag's real-world 'size' and its ID
* Currently only the 36h11-family tag with ID 1 is loaded on the machine
* Run continuous (video-feed) AprilTag detection with `roslaunch apriltags2_ros continuous_detection.launch`
    * the camera & robot should be running first so the topics exist
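
As a starting point for the TF question above, here is a minimal sketch (not project code) that looks up the tag's transform with `tf2_ros`. The frame names `rgb_camera_link` and `tag_1` are assumptions based on the message output and tag ID noted here; check the actual TF tree (e.g. in RViz) before relying on them.
```
#!/usr/bin/env python
# Sketch: look up the camera -> tag transform once per second and print it.
import rospy
import tf2_ros

rospy.init_node('tag_tf_lookup')
tf_buffer = tf2_ros.Buffer()
tf_listener = tf2_ros.TransformListener(tf_buffer)

rate = rospy.Rate(1.0)
while not rospy.is_shutdown():
    try:
        # Frame names are assumptions; verify them against the TF tree.
        t = tf_buffer.lookup_transform('rgb_camera_link', 'tag_1',
                                       rospy.Time(0), rospy.Duration(1.0))
        tr = t.transform.translation
        rospy.loginfo('tag_1 at x=%.3f y=%.3f z=%.3f', tr.x, tr.y, tr.z)
    except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
            tf2_ros.ExtrapolationException):
        rospy.logwarn('tag_1 transform not available yet')
    rate.sleep()
```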


The AprilTag node also publishes to `/tag_detections`, which seems to contain a pose as an XYZ point plus an XYZW quaternion orientation. Possibly easier to use than TF. Messages look like:
```
header:
  seq: 2448
  stamp:
    secs: 1661795935
    nsecs: 923418178
  frame_id: "rgb_camera_link"
detections:
  -
    id: [1]
    size: [0.1125]
    pose:
      header:
        seq: 8106
        stamp:
          secs: 1661795935
          nsecs: 923418178
        frame_id: "rgb_camera_link"
      pose:
        pose:
          position:
            x: -0.4604387282265957
            y: -0.34909099407950955
            z: 1.0812755767473274
          orientation:
            x: -0.6874761962378958
            y: 0.7260056597955421
            z: 0.008088762109603964
            w: 0.015060991954323812
        covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```
Pose values appear stable to about two decimal places across messages while the tag is stationary.
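
A minimal parsing sketch (not project code), assuming the message type is `AprilTagDetectionArray` from `apriltag_ros.msg` (older installs name the module `apriltags2_ros.msg`, matching the launch command above). The nested `pose.pose.pose` access follows the `PoseWithCovarianceStamped` layout visible in the dump:
```
#!/usr/bin/env python
# Sketch: print the position of every tag reported on /tag_detections.
import rospy
from apriltag_ros.msg import AprilTagDetectionArray  # or: from apriltags2_ros.msg import AprilTagDetectionArray

def on_detections(msg):
    for det in msg.detections:
        # det.pose is a PoseWithCovarianceStamped, hence the double .pose below
        p = det.pose.pose.pose.position
        rospy.loginfo('tag %s at x=%.3f y=%.3f z=%.3f (frame %s)',
                      det.id, p.x, p.y, p.z, det.pose.header.frame_id)

rospy.init_node('tag_detection_printer')
rospy.Subscriber('/tag_detections', AprilTagDetectionArray, on_detections)
rospy.spin()
```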
52 changes: 46 additions & 6 deletions README.md
@@ -1,6 +1,42 @@
# Gentle_Pick_and_Place
Pick and place demo for gentle gripper.

## Physical setup
The robot currently hard-codes the bin locations at roughly 15 inches to the left and to the right of the robot's base. (That is, when facing away from the robot, there is a bin 15 inches to the left of the base and a bin 15 inches to the right, with both offsets measured from the left side of the base.) The bins have a 14-inch long side and a 9-inch short side; the snippet after the diagram gives these measurements in meters.

The bin on the left side of the robot is considered the 'bottles' bin, and must be placed with the *long* side facing the robot. The bin on the right, meanwhile, is considered the 'cans' bin, and should be placed with the *short* side facing the robot.

Bottles or cans should be placed within approximately a 2-foot-by-2-foot square directly in front of the robot.

diagram:
```
-----------------------------------------------------------------------------------
| [--------------]                     [XXXX]                      [-------]      |
| [              ]                     [XXXX]                      [       ]      |
| [      C       ]                     [XXXX]                      [       ]      |
| [              ]                       ^                         [   B   ]      |
| [______________] <--- ~15 inches --->  | <--- ~15 inches --->    [       ]      |
|                                        |                         [       ]      |
|                                        |                         [_______]      |
|                                        |                                        |
|                     <--- objs within ~2 feet from base --->                     |
|                                        |                                        |
|                ccc                     |       bbbbbb                           |
|                ccc                     |       bbbbbb                           |
|                ccc                     |                                        |
|                                        v                                        |
|                                                                                 |
|                                                                                 |
|                                                                                 |
-----------------------------------------------------------------------------------

c: can
C: can bin
b: bottle
B: bottle bin
X: robot base
```
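
For reference, the measurements above converted to meters, as a hypothetical snippet; the constant names and the sign convention are illustrative only and do not come from the pick_and_place code:
```
# Hypothetical constants mirroring the physical setup described above.
INCH = 0.0254  # meters per inch

BIN_OFFSET = 15 * INCH       # ~0.38 m from the left side of the base to each bin
BIN_LONG_SIDE = 14 * INCH    # ~0.36 m
BIN_SHORT_SIDE = 9 * INCH    # ~0.23 m
PICK_AREA_SIDE = 24 * INCH   # ~0.61 m square in front of the robot for bottles/cans

# Sign convention is a guess: +y toward the robot's left, -y toward its right.
BOTTLE_BIN_Y = +BIN_OFFSET   # 'bottles' bin on the left, long side facing the robot
CAN_BIN_Y = -BIN_OFFSET      # 'cans' bin on the right, short side facing the robot
```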

## Running instructions
Open four terminals and run the source command in each terminal:
```
@@ -11,19 +47,23 @@ Then run the following commands in each terminal, in the same order:
```
roslaunch pick_and_place ur3e.launch
```
2. After this command, you need to start the `gentle_pick` program on the robot teaching pendant.
- Start the robot teaching pendant by pressing the power button.
- Once the pendant is on, tap on "load program" in the middle left, and select `gentle_pick.urp` from the new window.
- In the bottom left corner, click the red "power off" button to open the initialization screen. Click "on" to go to idle mode, and click it again to go to operational mode.
- Exit the initialization screen and press the play button in the lower right.

3. Start the camera node
```
roslaunch pick_and_place kinect.launch
```
4. Start the object clustering server
```
rosrun pick_and_place object_clustering_server
```
5. Start the pick and place demo
```
rosrun pick_and_place pick_and_place.py
```
If there are no items left on the table, the Python program will stop automatically.
You need to manually add items back to the table and rerun the last command in the same window.