
manipulation_robot

ManipulationRobot

Bases: BaseRobot

Robot that is equipped with grasping (manipulation) capabilities. Provides a common interface for a wide variety of robots.

controller_config should, at the minimum, contain:

arm: controller specifications for the controller to control this robot's arm (manipulation). Should include:

- name: Controller to create
- <other kwargs> relevant to the controller being created. All values have defaults, but any kwargs
    set here will override them (see the example below).
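
For example, a minimal controller_config for a single-arm robot could look like the sketch below (assuming the default arm enumeration, where the single arm is named "0", so the controller keys are "arm_0" and "gripper_0"; any kwargs omitted fall back to the class defaults shown in the source below):

    controller_config = {
        "arm_0": {
            "name": "InverseKinematicsController",
            "mode": "pose_delta_ori",   # command end-effector pose deltas
        },
        "gripper_0": {
            "name": "MultiFingerGripperController",
            "mode": "binary",           # single open/close command per gripper
        },
    }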
Source code in omnigibson/robots/manipulation_robot.py
class ManipulationRobot(BaseRobot):
    """
    Robot that is equipped with grasping (manipulation) capabilities.
    Provides common interface for a wide variety of robots.

    NOTE: controller_config should, at the minimum, contain:
        arm: controller specifications for the controller to control this robot's arm (manipulation).
            Should include:

            - name: Controller to create
            - <other kwargs> relevant to the controller being created. Note that all values will have default
                values specified, but setting these individual kwargs will override them
    """

    def __init__(
        self,
        # Shared kwargs in hierarchy
        name,
        relative_prim_path=None,
        scale=None,
        visible=True,
        fixed_base=False,
        visual_only=False,
        self_collisions=False,
        load_config=None,
        # Unique to USDObject hierarchy
        abilities=None,
        # Unique to ControllableObject hierarchy
        control_freq=None,
        controller_config=None,
        action_type="continuous",
        action_normalize=True,
        reset_joint_pos=None,
        # Unique to BaseRobot
        obs_modalities=("rgb", "proprio"),
        proprio_obs="default",
        sensor_config=None,
        # Unique to ManipulationRobot
        grasping_mode="physical",
        grasping_direction="lower",
        disable_grasp_handling=False,
        **kwargs,
    ):
        """
        Args:
            name (str): Name for the object. Names need to be unique per scene
            relative_prim_path (str): Scene-local prim path of the Prim to encapsulate or create.
            scale (None or float or 3-array): if specified, sets either the uniform (float) or x,y,z (3-array) scale
                for this object. A single number corresponds to uniform scaling along the x,y,z axes, whereas a
                3-array specifies per-axis scaling.
            visible (bool): whether to render this object or not in the stage
            fixed_base (bool): whether to fix the base of this object or not
            visual_only (bool): Whether this object should be visual only (and not collide with any other objects)
            self_collisions (bool): Whether to enable self collisions for this object
            load_config (None or dict): If specified, should contain keyword-mapped values that are relevant for
                loading this prim at runtime.
            abilities (None or dict): If specified, manually adds specific object states to this object. It should be
                a dict in the form of {ability: {param: value}} containing object abilities and parameters to pass to
                the object state instance constructor.
            control_freq (float): control frequency (in Hz) at which to control the object. If set to be None,
                we will automatically set the control frequency to be at the render frequency by default.
            controller_config (None or dict): nested dictionary mapping controller name(s) to specific controller
                configurations for this object. This will override any default values specified by this class.
            action_type (str): one of {discrete, continuous} - what type of action space to use
            action_normalize (bool): whether to normalize inputted actions. This will override any default values
                specified by this class.
            reset_joint_pos (None or n-array): if specified, should be the joint positions that the object should
                be set to during a reset. If None (default), self._default_joint_pos will be used instead.
            obs_modalities (str or list of str): Observation modalities to use for this robot. Default is
                ("rgb", "proprio"). If "all", all available modalities are used. Otherwise, valid options
                should be part of omnigibson.sensors.ALL_SENSOR_MODALITIES.
                Note: If @sensor_config explicitly specifies `modalities` for a given sensor class, it will
                    override any values specified from @obs_modalities!
            proprio_obs (str or list of str): proprioception observation key(s) to use for generating proprioceptive
                observations. If str, should be exactly "default" -- this results in the default proprioception
                observations being used, as defined by self.default_proprio_obs. See self._get_proprioception_dict
                for valid key choices
            sensor_config (None or dict): nested dictionary mapping sensor class name(s) to specific sensor
                configurations for this object. This will override any default values specified by this class.
            grasping_mode (str): One of {"physical", "assisted", "sticky"}.
                If "physical", no assistive grasping will be applied (relies on contact friction + finger force).
                If "assisted", will magnetize any object touching and within the gripper's fingers. In this mode,
                    at least two "fingers" need to touch the object.
                If "sticky", will magnetize any object touching the gripper's fingers. In this mode, only one finger
                    needs to touch the object.
            grasping_direction (str): One of {"lower", "upper"}. If "lower", lower limit represents a closed grasp,
                otherwise upper limit represents a closed grasp.
            disable_grasp_handling (bool): If True, the robot will not automatically handle assisted or sticky grasps.
                Instead, you will need to call the grasp handling methods yourself.
            kwargs (dict): Additional keyword arguments that are used for other super() calls from subclasses, allowing
                for flexible compositions of various object subclasses (e.g.: Robot is USDObject + ControllableObject).
        """
        # Store relevant internal vars
        assert_valid_key(key=grasping_mode, valid_keys=AG_MODES, name="grasping_mode")
        assert_valid_key(key=grasping_direction, valid_keys=["lower", "upper"], name="grasping direction")
        self._grasping_mode = grasping_mode
        self._grasping_direction = grasping_direction
        self._disable_grasp_handling = disable_grasp_handling

        # Initialize other variables used for assistive grasping
        self._ag_freeze_joint_pos = {
            arm: {} for arm in self.arm_names
        }  # Frozen positions for keeping fingers held still
        self._ag_obj_in_hand = {arm: None for arm in self.arm_names}
        self._ag_obj_constraints = {arm: None for arm in self.arm_names}
        self._ag_obj_constraint_params = {arm: {} for arm in self.arm_names}
        self._ag_freeze_gripper = {arm: None for arm in self.arm_names}
        self._ag_release_counter = {arm: None for arm in self.arm_names}
        self._ag_check_in_volume = {arm: None for arm in self.arm_names}
        self._ag_calculate_volume = {arm: None for arm in self.arm_names}

        # Call super() method
        super().__init__(
            relative_prim_path=relative_prim_path,
            name=name,
            scale=scale,
            visible=visible,
            fixed_base=fixed_base,
            visual_only=visual_only,
            self_collisions=self_collisions,
            load_config=load_config,
            abilities=abilities,
            control_freq=control_freq,
            controller_config=controller_config,
            action_type=action_type,
            action_normalize=action_normalize,
            reset_joint_pos=reset_joint_pos,
            obs_modalities=obs_modalities,
            proprio_obs=proprio_obs,
            sensor_config=sensor_config,
            **kwargs,
        )

    def _validate_configuration(self):
        # Iterate over all arms
        for arm in self.arm_names:
            # We make sure that our arm controller exists and is a manipulation controller
            assert (
                "arm_{}".format(arm) in self._controllers
            ), "Controller 'arm_{}' must exist in controllers! Current controllers: {}".format(
                arm, list(self._controllers.keys())
            )
            assert isinstance(
                self._controllers["arm_{}".format(arm)], ManipulationController
            ), "Arm {} controller must be a ManipulationController!".format(arm)

            # We make sure that our gripper controller exists and is a gripper controller
            assert (
                "gripper_{}".format(arm) in self._controllers
            ), "Controller 'gripper_{}' must exist in controllers! Current controllers: {}".format(
                arm, list(self._controllers.keys())
            )
            assert isinstance(
                self._controllers["gripper_{}".format(arm)], GripperController
            ), "Gripper {} controller must be a GripperController!".format(arm)

        # run super
        super()._validate_configuration()

    def _initialize(self):
        super()._initialize()

        if gm.AG_CLOTH:
            for arm in self.arm_names:
                self._ag_check_in_volume[arm], self._ag_calculate_volume[arm] = (
                    generate_points_in_volume_checker_function(
                        obj=self, volume_link=self.eef_links[arm], mesh_name_prefixes="container"
                    )
                )

    def is_grasping(self, arm="default", candidate_obj=None):
        """
        Returns True if the robot is grasping the target object @candidate_obj or any object if @candidate_obj is None.

        Args:
            arm (str): specific arm to check for grasping. Default is "default" which corresponds to the first entry
                in self.arm_names
            candidate_obj (StatefulObject or None): object to check if this robot is currently grasping. If None, then
                will be a general (object-agnostic) check for grasping.
                Note: if self.grasping_mode is "physical", then @candidate_obj will be ignored completely

        Returns:
            IsGraspingState: For the specific manipulator appendage, returns IsGraspingState.TRUE if it is grasping
                (potentially @candidate_obj if specified), IsGraspingState.FALSE if it is not grasping,
                and IsGraspingState.UNKNOWN if unknown.
        """
        arm = self.default_arm if arm == "default" else arm
        if self.grasping_mode != "physical":
            is_grasping_obj = (
                self._ag_obj_in_hand[arm] is not None
                if candidate_obj is None
                else self._ag_obj_in_hand[arm] == candidate_obj
            )
            is_grasping = (
                IsGraspingState.TRUE
                if is_grasping_obj and self._ag_release_counter[arm] is None
                else IsGraspingState.FALSE
            )
        else:
            # Infer from the gripper controller the state
            is_grasping = self._controllers["gripper_{}".format(arm)].is_grasping()
            # If candidate obj is not None, we also check to see if our fingers are in contact with the object
            if is_grasping == IsGraspingState.TRUE and candidate_obj is not None:
                finger_links = {link for link in self.finger_links[arm]}
                is_grasping = len(candidate_obj.states[ContactBodies].get_value().intersection(finger_links)) > 0

        return is_grasping
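
    # Usage sketch (hypothetical; assumes a loaded `robot` instance and an `apple` object in the scene,
    # neither of which is defined in this file):
    #     state = robot.is_grasping(arm="default", candidate_obj=apple)
    #     if state == IsGraspingState.TRUE:
    #         ...  # the default arm is currently grasping the apple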

    def _find_gripper_contacts(self, arm="default", return_contact_positions=False):
        """
        For arm @arm, calculate any body prim paths, and the corresponding robot link prim paths, that are not
        part of the robot itself and are in contact with any of this arm's gripper's fingers
        Args:
            arm (str): specific arm whose gripper will be checked for contact. Default is "default" which
                corresponds to the first entry in self.arm_names
            return_contact_positions (bool): if True, will additionally return the contact (x,y,z) position
        Returns:
            2-tuple:
                - set: set of unique contact prim_paths that are not the robot self-collisions.
                    If @return_contact_positions is True, then returns (prim_path, pos), where pos is the contact
                    (x,y,z) position
                    Note: if no objects that are not the robot itself are intersecting, the set will be empty.
                - dict: dictionary mapping unique contact objects defined by the contact prim_path to
                    set of unique robot link prim_paths that it is in contact with
        """
        arm = self.default_arm if arm == "default" else arm

        # Get robot finger links
        finger_paths = set([link.prim_path for link in self.finger_links[arm]])

        # Get robot links
        link_paths = set(self.link_prim_paths)

        if not return_contact_positions:
            raw_contact_data = {
                (row, col)
                for row, col in GripperRigidContactAPI.get_contact_pairs(self.scene.idx, column_prim_paths=finger_paths)
                if row not in link_paths
            }
        else:
            raw_contact_data = {
                (row, col, point)
                for row, col, force, point, normal, sep in GripperRigidContactAPI.get_contact_data(
                    self.scene.idx, column_prim_paths=finger_paths
                )
                if row not in link_paths
            }

        # Translate to robot contact data
        robot_contact_links = dict()
        contact_data = set()
        for con_data in raw_contact_data:
            if not return_contact_positions:
                other_contact, link_contact = con_data
                contact_data.add(other_contact)
            else:
                other_contact, link_contact, point = con_data
                contact_data.add((other_contact, point))
            if other_contact not in robot_contact_links:
                robot_contact_links[other_contact] = set()
            robot_contact_links[other_contact].add(link_contact)

        return contact_data, robot_contact_links

    def set_position_orientation(
        self, position=None, orientation=None, frame: Literal["world", "parent", "scene"] = "world"
    ):
        """
        Sets manipulation robot's pose with respect to the specified frame

        Args:
            position (None or 3-array): if specified, (x,y,z) position in the world frame
                Default is None, which means left unchanged.
            orientation (None or 4-array): if specified, (x,y,z,w) quaternion orientation in the world frame.
                Default is None, which means left unchanged.
            frame (Literal): frame to set the pose with respect to. Defaults to "world". If "parent", the pose is
                set relative to the object's parent; if "scene", relative to the scene.
        """

        # Store the original EEF poses.
        original_poses = {}
        for arm in self.arm_names:
            original_poses[arm] = (self.get_eef_position(arm), self.get_eef_orientation(arm))

        # Run the super method
        super().set_position_orientation(position=position, orientation=orientation, frame=frame)

        # Now for each hand, if it was holding an AG object, teleport it.
        for arm in self.arm_names:
            if self._ag_obj_in_hand[arm] is not None:
                original_eef_pose = T.pose2mat(original_poses[arm])
                inv_original_eef_pose = T.pose_inv(pose_mat=original_eef_pose)
                original_obj_pose = T.pose2mat(self._ag_obj_in_hand[arm].get_position_orientation())
                new_eef_pose = T.pose2mat((self.get_eef_position(arm), self.get_eef_orientation(arm)))
                # New object pose is transform:
                # original --> "De"transform the original EEF pose --> "Re"transform the new EEF pose
                new_obj_pose = new_eef_pose @ inv_original_eef_pose @ original_obj_pose
                self._ag_obj_in_hand[arm].set_position_orientation(*T.mat2pose(hmat=new_obj_pose))

    def deploy_control(self, control, control_type):
        # We intercept the gripper control and replace it with the current joint position if we're freezing our gripper
        for arm in self.arm_names:
            if self._ag_freeze_gripper[arm]:
                control[self.gripper_control_idx[arm]] = (
                    self._ag_obj_constraint_params[arm]["gripper_pos"]
                    if self.controllers[f"gripper_{arm}"].control_type == ControlType.POSITION
                    else 0.0
                )

        super().deploy_control(control=control, control_type=control_type)

        # Then run assisted grasping
        if self.grasping_mode != "physical" and not self._disable_grasp_handling:
            self._handle_assisted_grasping()

        # Potentially freeze gripper joints
        for arm in self.arm_names:
            if self._ag_freeze_gripper[arm]:
                self._freeze_gripper(arm)

    def _release_grasp(self, arm="default"):
        """
        Magic action to release this robot's grasp on an object

        Args:
            arm (str): specific arm whose grasp will be released.
                Default is "default" which corresponds to the first entry in self.arm_names
        """
        arm = self.default_arm if arm == "default" else arm

        # Remove joint and filtered collision restraints
        og.sim.stage.RemovePrim(self._ag_obj_constraint_params[arm]["ag_joint_prim_path"])
        self._ag_obj_constraints[arm] = None
        self._ag_obj_constraint_params[arm] = {}
        self._ag_freeze_gripper[arm] = False
        self._ag_release_counter[arm] = 0

    def release_grasp_immediately(self):
        """
        Magic action to release this robot's grasp for all arms at once.
        Unlike @_release_grasp, this method bypasses the release window mechanism and releases immediately.
        """
        for arm in self.arm_names:
            if self._ag_obj_constraints[arm] is not None:
                self._release_grasp(arm=arm)
                self._ag_release_counter[arm] = int(math.ceil(m.RELEASE_WINDOW / og.sim.get_sim_step_dt()))
                self._handle_release_window(arm=arm)
                assert not self._ag_obj_in_hand[arm], "Object still in ag list after release!"
                # TODO: Verify not needed!
                # for finger_link in self.finger_links[arm]:
                #     finger_link.remove_filtered_collision_pair(prim=self._ag_obj_in_hand[arm])

    def get_control_dict(self):
        # In addition to super method, add in EEF states
        fcns = super().get_control_dict()

        for arm in self.arm_names:
            self._add_arm_control_dict(fcns=fcns, arm=arm)

        return fcns

    def _add_arm_control_dict(self, fcns, arm):
        """
        Internal helper function to generate per-arm control dictionary entries. Needed because functions
        generated directly inside a for loop would otherwise all close over the same arm variable!

        Args:
            fcns (CachedFunctions): Keyword-mapped control values for this object, mapping names to n-arrays.
            arm (str): specific arm to generate necessary control dict entries for
        """
        fcns[f"_eef_{arm}_pos_quat_relative"] = (
            lambda: ControllableObjectViewAPI.get_link_relative_position_orientation(
                self.articulation_root_path, self.eef_link_names[arm]
            )
        )
        fcns[f"eef_{arm}_pos_relative"] = lambda: fcns[f"_eef_{arm}_pos_quat_relative"][0]
        fcns[f"eef_{arm}_quat_relative"] = lambda: fcns[f"_eef_{arm}_pos_quat_relative"][1]
        fcns[f"eef_{arm}_lin_vel_relative"] = lambda: ControllableObjectViewAPI.get_link_relative_linear_velocity(
            self.articulation_root_path, self.eef_link_names[arm]
        )
        fcns[f"eef_{arm}_ang_vel_relative"] = lambda: ControllableObjectViewAPI.get_link_relative_angular_velocity(
            self.articulation_root_path, self.eef_link_names[arm]
        )
        # -n_joints because there may be an additional 6 entries at the beginning of the array, if this robot does
        # not have a fixed base (i.e.: the 6DOF --> "floating" joint)
        # see self.get_relative_jacobian() for more info
        # We also count backwards for the link frame because if the robot is fixed base, the jacobian returned has one
        # less index than the number of links. This is presumably because the 1st link of a fixed base robot will
        # always have a zero jacobian since it can't move. Counting backwards resolves this issue.
        start_idx = 0 if self.fixed_base else 6
        eef_link_idx = self._articulation_view.get_body_index(self.eef_links[arm].body_name)
        fcns[f"eef_{arm}_jacobian_relative"] = lambda: ControllableObjectViewAPI.get_relative_jacobian(
            self.articulation_root_path
        )[-(self.n_links - eef_link_idx), :, start_idx : start_idx + self.n_joints]

    def _get_proprioception_dict(self):
        dic = super()._get_proprioception_dict()

        # Loop over all arms to grab proprio info
        joint_positions = ControllableObjectViewAPI.get_joint_positions(self.articulation_root_path)
        joint_velocities = ControllableObjectViewAPI.get_joint_velocities(self.articulation_root_path)
        for arm in self.arm_names:
            # Add arm info
            dic["arm_{}_qpos".format(arm)] = joint_positions[self.arm_control_idx[arm]]
            dic["arm_{}_qpos_sin".format(arm)] = th.sin(joint_positions[self.arm_control_idx[arm]])
            dic["arm_{}_qpos_cos".format(arm)] = th.cos(joint_positions[self.arm_control_idx[arm]])
            dic["arm_{}_qvel".format(arm)] = joint_velocities[self.arm_control_idx[arm]]

            # Add eef and grasping info
            dic["eef_{}_pos".format(arm)], dic["eef_{}_quat".format(arm)] = (
                ControllableObjectViewAPI.get_link_relative_position_orientation(
                    self.articulation_root_path, self.eef_link_names[arm]
                )
            )
            dic["grasp_{}".format(arm)] = th.tensor([self.is_grasping(arm)])
            dic["gripper_{}_qpos".format(arm)] = joint_positions[self.gripper_control_idx[arm]]
            dic["gripper_{}_qvel".format(arm)] = joint_velocities[self.gripper_control_idx[arm]]

        return dic

    @property
    def default_proprio_obs(self):
        obs_keys = super().default_proprio_obs
        for arm in self.arm_names:
            obs_keys += [
                "arm_{}_qpos_sin".format(arm),
                "arm_{}_qpos_cos".format(arm),
                "eef_{}_pos".format(arm),
                "eef_{}_quat".format(arm),
                "gripper_{}_qpos".format(arm),
                "grasp_{}".format(arm),
            ]
        return obs_keys

    @property
    def grasping_mode(self):
        """
        Grasping mode of this robot. Is one of AG_MODES

        Returns:
            str: Grasping mode for this robot
        """
        return self._grasping_mode

    @property
    def controller_order(self):
        # Assumes we have arm(s) and corresponding gripper(s)
        controllers = []
        for arm in self.arm_names:
            controllers += ["arm_{}".format(arm), "gripper_{}".format(arm)]

        return controllers

    @property
    def _default_controllers(self):
        # Always call super first
        controllers = super()._default_controllers

        # For best generalizability, use joint controller as default
        for arm in self.arm_names:
            controllers["arm_{}".format(arm)] = "JointController"
            controllers["gripper_{}".format(arm)] = "JointController"

        return controllers

    @classproperty
    def n_arms(cls):
        """
        Returns:
            int: Number of arms this robot has. Returns 1 by default
        """
        return 1

    @classproperty
    def arm_names(cls):
        """
        Returns:
            list of str: List of arm names for this robot. Should correspond to the keys used to index into
                arm- and gripper-related dictionaries, e.g.: eef_link_names, finger_link_names, etc.
                Default is string enumeration based on @self.n_arms.
        """
        return [str(i) for i in range(cls.n_arms)]

    @property
    def default_arm(self):
        """
        Returns:
            str: Default arm name for this robot, corresponds to the first entry in @arm_names by default
        """
        return self.arm_names[0]

    @property
    def arm_action_idx(self):
        arm_action_idx = {}
        for arm_name in self.arm_names:
            controller_idx = self.controller_order.index(f"arm_{arm_name}")
            action_start_idx = sum(
                [self.controllers[self.controller_order[i]].command_dim for i in range(controller_idx)]
            )
            arm_action_idx[arm_name] = th.arange(
                action_start_idx, action_start_idx + self.controllers[f"arm_{arm_name}"].command_dim
            )
        return arm_action_idx

    @property
    def gripper_action_idx(self):
        gripper_action_idx = {}
        for arm_name in self.arm_names:
            controller_idx = self.controller_order.index(f"gripper_{arm_name}")
            action_start_idx = sum(
                [self.controllers[self.controller_order[i]].command_dim for i in range(controller_idx)]
            )
            gripper_action_idx[arm_name] = th.arange(
                action_start_idx, action_start_idx + self.controllers[f"gripper_{arm_name}"].command_dim
            )
        return gripper_action_idx
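
    # Usage sketch (hypothetical; not part of the original source): select only the gripper portion of a
    # full action vector for the default arm, assuming `action` is a 1D tensor sized to this robot's
    # action space and `robot` is a loaded instance:
    #     gripper_idx = robot.gripper_action_idx[robot.default_arm]
    #     action[gripper_idx] = 0.0   # e.g. overwrite just the gripper command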

    @property
    @abstractmethod
    def arm_link_names(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to corresponding arm link names,
                should correspond to specific link names in this robot's underlying model file

                Note: the ordering within the dictionary is assumed to be intentional, and is
                directly used to define the set of corresponding idxs.
        """
        raise NotImplementedError

    @property
    @abstractmethod
    def arm_joint_names(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to corresponding arm joint names,
                should correspond to specific joint names in this robot's underlying model file

                Note: the ordering within the dictionary is assumed to be intentional, and is
                directly used to define the set of corresponding control idxs.
        """
        raise NotImplementedError

    @property
    @abstractmethod
    def eef_link_names(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to corresponding name of the EEF link,
                should correspond to specific link name in this robot's underlying model file
        """
        raise NotImplementedError

    @property
    @abstractmethod
    def finger_link_names(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to array of link names corresponding to
                this robot's fingers

                Note: the ordering within the dictionary is assumed to be intentional, and is
                directly used to define the set of corresponding idxs.
        """
        raise NotImplementedError

    @property
    @abstractmethod
    def finger_joint_names(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to array of joint names corresponding to
                this robot's fingers.

                Note: the ordering within the dictionary is assumed to be intentional, and is
                directly used to define the set of corresponding control idxs.
        """
        raise NotImplementedError

    @property
    def arm_control_idx(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to indices in low-level control
                vector corresponding to arm joints.
        """
        return {
            arm: th.tensor([list(self.joints.keys()).index(name) for name in self.arm_joint_names[arm]])
            for arm in self.arm_names
        }

    @property
    def gripper_control_idx(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to indices in low-level control
                vector corresponding to gripper joints.
        """
        return {
            arm: th.tensor([list(self.joints.keys()).index(name) for name in self.finger_joint_names[arm]])
            for arm in self.arm_names
        }

    @property
    def arm_links(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to robot links corresponding to
                that arm's links
        """
        return {arm: [self._links[link] for link in self.arm_link_names[arm]] for arm in self.arm_names}

    @property
    def eef_links(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to robot link corresponding to that arm's
                eef link
        """
        return {arm: self._links[self.eef_link_names[arm]] for arm in self.arm_names}

    @property
    def finger_links(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to robot links corresponding to
                that arm's finger links
        """
        return {arm: [self._links[link] for link in self.finger_link_names[arm]] for arm in self.arm_names}

    @property
    def finger_joints(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to robot joints corresponding to
                that arm's finger joints
        """
        return {arm: [self._joints[joint] for joint in self.finger_joint_names[arm]] for arm in self.arm_names}

    @property
    def assisted_grasp_start_points(self):
        """
        Returns:
            dict: Dictionary mapping individual arm appendage names to array of GraspingPoint tuples,
                composed of (link_name, position) values specifying valid grasping start points located at
                cartesian (x,y,z) coordinates specified in link_name's local coordinate frame.
                These values will be used in conjunction with
                @self.assisted_grasp_end_points to trigger assisted grasps, where objects that intersect
                with any ray starting at any point in @self.assisted_grasp_start_points and terminating at any point in
                @self.assisted_grasp_end_points will trigger an assisted grasp (calculated individually for each gripper
                appendage). By default, each entry returns None, and must be implemented by any robot subclass that
                wishes to use assisted grasping.
        """
        return {arm: None for arm in self.arm_names}

    @property
    def assisted_grasp_end_points(self):
        """
        Returns:
            dict: Dictionary mapping individual arm appendage names to array of GraspingPoint tuples,
                composed of (link_name, position) values specifying valid grasping end points located at
                cartesian (x,y,z) coordinates specified in link_name's local coordinate frame.
                These values will be used in conjunction with
                @self.assisted_grasp_start_points to trigger assisted grasps, where objects that intersect
                with any ray starting at any point in @self.assisted_grasp_start_points and terminating at any point in
                @self.assisted_grasp_end_points will trigger an assisted grasp (calculated individually for each gripper
                appendage). By default, each entry returns None, and must be implemented by any robot subclass that
                wishes to use assisted grasping.
        """
        return {arm: None for arm in self.arm_names}
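
    # Override sketch for subclasses that want assisted grasping (hypothetical; the link name and offset are
    # placeholders, and GraspingPoint is the (link_name, position) tuple referenced in the docstrings above):
    #     @property
    #     def assisted_grasp_start_points(self):
    #         return {self.default_arm: [GraspingPoint(link_name="left_finger_link", position=th.tensor([0.0, 0.0, 0.02]))]}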

    @property
    def finger_lengths(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to corresponding length of the fingers in that
                hand defined from the palm (assuming all fingers in one hand are equally long)
        """
        raise NotImplementedError

    @property
    def arm_workspace_range(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to a tuple indicating the start and end of the
                angular range of the arm workspace around the Z axis of the robot, where 0 is facing
                forward.
        """
        raise NotImplementedError

    def get_eef_position(self, arm="default"):
        """
        Args:
            arm (str): specific arm to grab eef position. Default is "default" which corresponds to the first entry
                in self.arm_names

        Returns:
            3-array: (x,y,z) global end-effector Cartesian position for this robot's end-effector corresponding
                to arm @arm
        """
        arm = self.default_arm if arm == "default" else arm
        return self._links[self.eef_link_names[arm]].get_position_orientation()[0]

    def get_eef_orientation(self, arm="default"):
        """
        Args:
            arm (str): specific arm to grab eef orientation. Default is "default" which corresponds to the first entry
                in self.arm_names

        Returns:
            4-array: (x,y,z,w) global quaternion orientation for this robot's end-effector corresponding
                to arm @arm
        """
        arm = self.default_arm if arm == "default" else arm
        return self._links[self.eef_link_names[arm]].get_position_orientation()[1]

    def get_relative_eef_pose(self, arm="default", mat=False):
        """
        Args:
            arm (str): specific arm to grab eef pose. Default is "default" which corresponds to the first entry
                in self.arm_names
            mat (bool): whether to return pose in matrix form (mat=True) or (pos, quat) tuple (mat=False)

        Returns:
            2-tuple or (4, 4)-array: End-effector pose, either in 4x4 homogeneous
                matrix form (if @mat=True) or (pos, quat) tuple (if @mat=False), corresponding to arm @arm
        """
        arm = self.default_arm if arm == "default" else arm
        eef_link_pose = self.eef_links[arm].get_position_orientation()
        base_link_pose = self.get_position_orientation()
        pose = T.relative_pose_transform(*eef_link_pose, *base_link_pose)
        return T.pose2mat(pose) if mat else pose
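
    # Usage sketch (hypothetical; `robot` is assumed to be a loaded instance of a concrete subclass):
    #     eef_mat = robot.get_relative_eef_pose(arm="default", mat=True)  # 4x4 pose in the robot base frame
    #     pos, quat = T.mat2pose(eef_mat)                                 # back to (pos, quat) if needed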

    def get_relative_eef_position(self, arm="default"):
        """
        Args:
            arm (str): specific arm to grab relative eef pos.
                Default is "default" which corresponds to the first entry in self.arm_names


        Returns:
            3-array: (x,y,z) Cartesian position of end-effector relative to robot base frame
        """
        arm = self.default_arm if arm == "default" else arm
        return self.get_relative_eef_pose(arm=arm)[0]

    def get_relative_eef_orientation(self, arm="default"):
        """
        Args:
            arm (str): specific arm to grab relative eef orientation.
                Default is "default" which corresponds to the first entry in self.arm_names

        Returns:
            4-array: (x,y,z,w) quaternion orientation of end-effector relative to robot base frame
        """
        arm = self.default_arm if arm == "default" else arm
        return self.get_relative_eef_pose(arm=arm)[1]

    def get_relative_eef_lin_vel(self, arm="default"):
        """
        Args:
            arm (str): specific arm to grab relative eef linear velocity.
                Default is "default" which corresponds to the first entry in self.arm_names


        Returns:
            3-array: (x,y,z) Linear velocity of end-effector relative to robot base frame
        """
        arm = self.default_arm if arm == "default" else arm
        base_link_quat = self.get_position_orientation()[1]
        return T.quat2mat(base_link_quat).T @ self.eef_links[arm].get_linear_velocity()

    def get_relative_eef_ang_vel(self, arm="default"):
        """
        Args:
            arm (str): specific arm to grab relative eef angular velocity.
                Default is "default" which corresponds to the first entry in self.arm_names

        Returns:
            3-array: (ax,ay,az) angular velocity of end-effector relative to robot base frame
        """
        arm = self.default_arm if arm == "default" else arm
        base_link_quat = self.get_position_orientation()[1]
        return T.quat2mat(base_link_quat).T @ self.eef_links[arm].get_angular_velocity()

    def _calculate_in_hand_object_rigid(self, arm="default"):
        """
        Calculates which object to assisted-grasp for arm @arm. Returns an (object_id, link_id) tuple or None
        if no valid AG-enabled object can be found.

        Args:
            arm (str): specific arm to calculate in-hand object for.
                Default is "default" which corresponds to the first entry in self.arm_names

        Returns:
            None or 2-tuple: If a valid assisted-grasp object is found, returns the corresponding
                (object, object_link) (i.e.: (BaseObject, RigidPrim)) pair to the contacted in-hand object.
                Otherwise, returns None
        """
        arm = self.default_arm if arm == "default" else arm

        # If we're not using physical grasping, we check for gripper contact
        if self.grasping_mode != "physical":
            candidates_set, robot_contact_links = self._find_gripper_contacts(arm=arm)
            # If we're using assisted grasping, we further filter candidates via ray-casting
            if self.grasping_mode == "assisted":
                candidates_set_raycast = self._find_gripper_raycast_collisions(arm=arm)
                candidates_set = candidates_set.intersection(candidates_set_raycast)
        else:
            raise ValueError("Invalid grasping mode for calculating in hand object: {}".format(self.grasping_mode))

        # Immediately return if there are no valid candidates
        if len(candidates_set) == 0:
            return None

        # Find the closest object to the gripper center
        gripper_center_pos = self.eef_links[arm].get_position_orientation()[0]

        candidate_data = []
        for prim_path in candidates_set:
            # Calculate position of the object link. Only allow this for objects currently.
            obj_prim_path, link_name = prim_path.rsplit("/", 1)
            candidate_obj = self.scene.object_registry("prim_path", obj_prim_path, None)
            if candidate_obj is None or link_name not in candidate_obj.links:
                continue
            candidate_link = candidate_obj.links[link_name]
            dist = th.norm(candidate_link.get_position_orientation()[0] - gripper_center_pos)
            candidate_data.append((prim_path, dist))

        if not candidate_data:
            return None

        candidate_data = sorted(candidate_data, key=lambda x: x[-1])
        ag_prim_path, _ = candidate_data[0]

        # Make sure the ag_prim_path is not a self collision
        assert ag_prim_path not in self.link_prim_paths, "assisted grasp object cannot be the robot itself!"

        # Make sure at least two fingers are in contact with this object
        robot_contacts = robot_contact_links[ag_prim_path]
        touching_at_least_two_fingers = (
            True
            if self.grasping_mode == "sticky"
            else len({link.prim_path for link in self.finger_links[arm]}.intersection(robot_contacts)) >= 2
        )

        # TODO: Better heuristic, hacky, we assume the parent object prim path is the prim_path minus the last "/" item
        ag_obj_prim_path = "/".join(ag_prim_path.split("/")[:-1])
        ag_obj_link_name = ag_prim_path.split("/")[-1]
        ag_obj = self.scene.object_registry("prim_path", ag_obj_prim_path)

        # Return None if object cannot be assisted grasped or not touching at least two fingers
        if ag_obj is None or not touching_at_least_two_fingers:
            return None

        # Get object and its contacted link
        return ag_obj, ag_obj.links[ag_obj_link_name]

    def _find_gripper_raycast_collisions(self, arm="default"):
        """
        For arm @arm, calculate any prims that are not part of the robot
        itself that intersect with rays cast between any of the gripper's start and end points

        Args:
            arm (str): specific arm whose gripper will be checked for raycast collisions. Default is "default"
                which corresponds to the first entry in self.arm_names

        Returns:
            set[str]: set of prim paths of detected raycast intersections that
                are not the robot itself. Note: if no objects other than the robot itself are intersecting,
                the set will be empty.
        """
        arm = self.default_arm if arm == "default" else arm
        # First, make sure start and end grasp points exist (i.e.: aren't None)
        assert (
            self.assisted_grasp_start_points[arm] is not None
        ), "In order to use assisted grasping, assisted_grasp_start_points must not be None!"
        assert (
            self.assisted_grasp_end_points[arm] is not None
        ), "In order to use assisted grasping, assisted_grasp_end_points must not be None!"

        # Iterate over all start and end grasp points and calculate their x,y,z positions in the world frame
        # (per arm appendage)
        # Since we'll be casting rays between every (start, end) pair (the cartesian product), we tile the start
        # points by the number of end points and repeat each end point by the number of start points
        startpoints = []
        endpoints = []
        for grasp_start_point in self.assisted_grasp_start_points[arm]:
            # Get world coordinates of link base frame
            link_pos, link_orn = self.links[grasp_start_point.link_name].get_position_orientation()
            # Calculate grasp start point in world frame and add to startpoints
            start_point, _ = T.pose_transform(
                link_pos, link_orn, grasp_start_point.position, th.tensor([0, 0, 0, 1], dtype=th.float32)
            )
            startpoints.append(start_point)
        # Repeat for end points
        for grasp_end_point in self.assisted_grasp_end_points[arm]:
            # Get world coordinates of link base frame
            link_pos, link_orn = self.links[grasp_end_point.link_name].get_position_orientation()
            # Calculate grasp end point in world frame and add to endpoints
            end_point, _ = T.pose_transform(
                link_pos, link_orn, grasp_end_point.position, th.tensor([0, 0, 0, 1], dtype=th.float32)
            )
            endpoints.append(end_point)
        # Stack the start points and repeat the end points, and add these values to the raycast dicts
        n_startpoints, n_endpoints = len(startpoints), len(endpoints)
        raycast_startpoints = startpoints * n_endpoints
        raycast_endpoints = []
        for endpoint in endpoints:
            raycast_endpoints += [endpoint] * n_startpoints
        ray_data = set()
        # Calculate raycasts from each start point to end point -- this is n_startpoints * n_endpoints total rays
        for result in raytest_batch(raycast_startpoints, raycast_endpoints, only_closest=True):
            if result["hit"]:
                # filter out self body parts (we currently assume that the robot cannot grasp itself)
                if self.prim_path not in result["rigidBody"]:
                    ray_data.add(result["rigidBody"])
        return ray_data

    def _handle_release_window(self, arm="default"):
        """
        Handles releasing an object from arm @arm

        Args:
            arm (str): specific arm to handle release window.
                Default is "default" which corresponds to the first entry in self.arm_names
        """
        arm = self.default_arm if arm == "default" else arm
        self._ag_release_counter[arm] += 1
        time_since_release = self._ag_release_counter[arm] * og.sim.get_sim_step_dt()
        if time_since_release >= m.RELEASE_WINDOW:
            self._ag_obj_in_hand[arm] = None
            self._ag_release_counter[arm] = None

    def _freeze_gripper(self, arm="default"):
        """
        Freezes gripper finger joints - used in assisted grasping.

        Args:
            arm (str): specific arm to freeze gripper.
                Default is "default" which corresponds to the first entry in self.arm_names
        """
        arm = self.default_arm if arm == "default" else arm
        for joint_name, j_val in self._ag_freeze_joint_pos[arm].items():
            joint = self._joints[joint_name]
            joint.set_pos(pos=j_val)
            joint.set_vel(vel=0.0)

    @property
    def robot_arm_descriptor_yamls(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to the file path of the robot arm descriptor
                used by the IK Controller.
        """
        raise NotImplementedError

    @property
    def _default_arm_joint_controller_configs(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to default controller config to control that
                robot's arm. Uses velocity control by default.
        """
        dic = {}
        for arm in self.arm_names:
            dic[arm] = {
                "name": "JointController",
                "control_freq": self._control_freq,
                "control_limits": self.control_limits,
                "dof_idx": self.arm_control_idx[arm],
                "command_output_limits": None,
                "motor_type": "position",
                "use_delta_commands": True,
                "use_impedances": True,
            }
        return dic

    @property
    def _default_arm_ik_controller_configs(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to default controller config for an
                Inverse kinematics controller to control this robot's arm
        """
        dic = {}
        for arm in self.arm_names:
            dic[arm] = {
                "name": "InverseKinematicsController",
                "task_name": f"eef_{arm}",
                "robot_description_path": self.robot_arm_descriptor_yamls[arm],
                "robot_urdf_path": self.urdf_path,
                "eef_name": self.eef_link_names[arm],
                "control_freq": self._control_freq,
                "reset_joint_pos": self.reset_joint_pos,
                "control_limits": self.control_limits,
                "dof_idx": self.arm_control_idx[arm],
                "command_output_limits": (
                    th.tensor([-0.2, -0.2, -0.2, -0.5, -0.5, -0.5]),
                    th.tensor([0.2, 0.2, 0.2, 0.5, 0.5, 0.5]),
                ),
                "mode": "pose_delta_ori",
                "smoothing_filter_size": 2,
                "workspace_pose_limiter": None,
            }
        return dic

    @property
    def _default_arm_osc_controller_configs(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to default controller config for an
                operational space controller to control this robot's arm
        """
        dic = {}
        for arm in self.arm_names:
            dic[arm] = {
                "name": "OperationalSpaceController",
                "task_name": f"eef_{arm}",
                "control_freq": self._control_freq,
                "reset_joint_pos": self.reset_joint_pos,
                "control_limits": self.control_limits,
                "dof_idx": self.arm_control_idx[arm],
                "command_output_limits": (
                    th.tensor([-0.2, -0.2, -0.2, -0.5, -0.5, -0.5]),
                    th.tensor([0.2, 0.2, 0.2, 0.5, 0.5, 0.5]),
                ),
                "mode": "pose_delta_ori",
                "workspace_pose_limiter": None,
            }
        return dic

    @property
    def _default_arm_null_joint_controller_configs(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to default arm null controller config
                to control this robot's arm (i.e., a dummy controller)
        """
        dic = {}
        for arm in self.arm_names:
            dic[arm] = {
                "name": "NullJointController",
                "control_freq": self._control_freq,
                "motor_type": "position",
                "control_limits": self.control_limits,
                "dof_idx": self.arm_control_idx[arm],
                "default_command": self.reset_joint_pos[self.arm_control_idx[arm]],
                "use_impedances": False,
            }
        return dic

    @property
    def _default_gripper_multi_finger_controller_configs(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to default controller config to control
                this robot's multi finger gripper. Assumes robot gripper idx has exactly two elements
        """
        dic = {}
        for arm in self.arm_names:
            dic[arm] = {
                "name": "MultiFingerGripperController",
                "control_freq": self._control_freq,
                "motor_type": "position",
                "control_limits": self.control_limits,
                "dof_idx": self.gripper_control_idx[arm],
                "command_output_limits": "default",
                "mode": "binary",
                "limit_tolerance": 0.001,
                "inverted": self._grasping_direction == "upper",
            }
        return dic

    @property
    def _default_gripper_joint_controller_configs(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to default gripper joint controller config
                to control this robot's gripper
        """
        dic = {}
        for arm in self.arm_names:
            dic[arm] = {
                "name": "JointController",
                "control_freq": self._control_freq,
                "motor_type": "velocity",
                "control_limits": self.control_limits,
                "dof_idx": self.gripper_control_idx[arm],
                "command_output_limits": "default",
                "use_delta_commands": False,
                "use_impedances": False,
            }
        return dic

    @property
    def _default_gripper_null_controller_configs(self):
        """
        Returns:
            dict: Dictionary mapping arm appendage name to default gripper null controller config
                to control this robot's (non-prehensile) gripper, i.e., a dummy controller
        """
        dic = {}
        for arm in self.arm_names:
            dic[arm] = {
                "name": "NullJointController",
                "control_freq": self._control_freq,
                "motor_type": "velocity",
                "control_limits": self.control_limits,
                "dof_idx": self.gripper_control_idx[arm],
                "default_command": th.zeros(len(self.gripper_control_idx[arm])),
                "use_impedances": False,
            }
        return dic

    @property
    def _default_controller_config(self):
        # Always run super method first
        cfg = super()._default_controller_config

        arm_ik_configs = self._default_arm_ik_controller_configs
        arm_osc_configs = self._default_arm_osc_controller_configs
        arm_joint_configs = self._default_arm_joint_controller_configs
        arm_null_joint_configs = self._default_arm_null_joint_controller_configs
        gripper_pj_configs = self._default_gripper_multi_finger_controller_configs
        gripper_joint_configs = self._default_gripper_joint_controller_configs
        gripper_null_configs = self._default_gripper_null_controller_configs

        # Add arm and gripper defaults, per arm
        for arm in self.arm_names:
            cfg["arm_{}".format(arm)] = {
                arm_ik_configs[arm]["name"]: arm_ik_configs[arm],
                arm_osc_configs[arm]["name"]: arm_osc_configs[arm],
                arm_joint_configs[arm]["name"]: arm_joint_configs[arm],
                arm_null_joint_configs[arm]["name"]: arm_null_joint_configs[arm],
            }
            cfg["gripper_{}".format(arm)] = {
                gripper_pj_configs[arm]["name"]: gripper_pj_configs[arm],
                gripper_joint_configs[arm]["name"]: gripper_joint_configs[arm],
                gripper_null_configs[arm]["name"]: gripper_null_configs[arm],
            }

        return cfg

    def _get_assisted_grasp_joint_type(self, ag_obj, ag_link):
        """
        Check whether an object @ag_obj can be grasped. If so, return the joint type to use for assisted grasping.
        Otherwise, return None.

        Args:
            ag_obj (BaseObject): Object targeted for an assisted grasp
            ag_link (RigidPrim): Link of the object to be grasped

        Returns:
            (None or str): If obj can be grasped, returns the joint type to use for assisted grasping.
        """

        # Deny objects that are too heavy, unless the target link is a non-base link of a fixed-base object
        mass = ag_link.mass
        if mass > m.ASSIST_GRASP_MASS_THRESHOLD and not (ag_obj.fixed_base and ag_link != ag_obj.root_link):
            return None

        # Otherwise, compute the joint type. We use a fixed joint unless the link is a non-fixed link.
        # A link is non-fixed if it has any non-fixed parent joints.
        joint_type = "FixedJoint"
        for edge in nx.edge_dfs(ag_obj.articulation_tree, ag_link.body_name, orientation="reverse"):
            joint = ag_obj.articulation_tree.edges[edge[:2]]
            if joint["joint_type"] != JointType.JOINT_FIXED:
                joint_type = "SphericalJoint"
                break

        return joint_type

    def _establish_grasp_rigid(self, arm="default", ag_data=None, contact_pos=None):
        """
        Establishes an ag-assisted grasp, if enabled.

        Args:
            arm (str): specific arm to establish grasp.
                Default is "default" which corresponds to the first entry in self.arm_names
            ag_data (None or 2-tuple): if specified, assisted-grasp object, link tuple (i.e. :(BaseObject, RigidPrim)).
                Otherwise, does a no-op
            contact_pos (None or th.tensor): if specified, contact position to use for grasp.
        """
        arm = self.default_arm if arm == "default" else arm

        # Return immediately if ag_data is None
        if ag_data is None:
            return
        ag_obj, ag_link = ag_data
        # Get the appropriate joint type
        joint_type = self._get_assisted_grasp_joint_type(ag_obj, ag_link)
        if joint_type is None:
            return

        if contact_pos is None:
            force_data, _ = self._find_gripper_contacts(arm=arm, return_contact_positions=True)
            for c_link_prim_path, c_contact_pos in force_data:
                if c_link_prim_path == ag_link.prim_path:
                    contact_pos = c_contact_pos
                    break

        assert contact_pos is not None, (
            f"contact_pos in self._find_gripper_contacts(return_contact_positions=True) is not found in "
            f"self._find_gripper_contacts(return_contact_positions=False). This is likely because "
            f"GripperRigidContactAPI.get_contact_pairs and get_contact_data return inconsistent results."
        )

        # Joint frame set at the contact point
        # Need to find distance between robot and contact point in robot link's local frame and
        # ag link and contact point in ag link's local frame
        joint_frame_pos = contact_pos
        joint_frame_orn = th.tensor([0, 0, 0, 1.0])
        eef_link_pos, eef_link_orn = self.eef_links[arm].get_position_orientation()
        parent_frame_pos, parent_frame_orn = T.relative_pose_transform(
            joint_frame_pos, joint_frame_orn, eef_link_pos, eef_link_orn
        )
        obj_link_pos, obj_link_orn = ag_link.get_position_orientation()
        child_frame_pos, child_frame_orn = T.relative_pose_transform(
            joint_frame_pos, joint_frame_orn, obj_link_pos, obj_link_orn
        )

        # Create the joint
        joint_prim_path = f"{self.eef_links[arm].prim_path}/ag_constraint"
        joint_prim = create_joint(
            prim_path=joint_prim_path,
            joint_type=joint_type,
            body0=self.eef_links[arm].prim_path,
            body1=ag_link.prim_path,
            enabled=True,
            exclude_from_articulation=True,
            joint_frame_in_parent_frame_pos=parent_frame_pos / self.scale,
            joint_frame_in_parent_frame_quat=parent_frame_orn,
            joint_frame_in_child_frame_pos=child_frame_pos / ag_obj.scale,
            joint_frame_in_child_frame_quat=child_frame_orn,
        )

        # Save a reference to this joint prim
        self._ag_obj_constraints[arm] = joint_prim

        # Modify max force based on user-determined assist parameters
        # TODO
        assist_force = m.MIN_ASSIST_FORCE + (m.MAX_ASSIST_FORCE - m.MIN_ASSIST_FORCE) * m.ASSIST_FRACTION
        max_force = assist_force if joint_type == "FixedJoint" else assist_force * m.ARTICULATED_ASSIST_FRACTION
        # joint_prim.GetAttribute("physics:breakForce").Set(max_force)

        self._ag_obj_constraint_params[arm] = {
            "ag_obj_prim_path": ag_obj.prim_path,
            "ag_link_prim_path": ag_link.prim_path,
            "ag_joint_prim_path": joint_prim_path,
            "joint_type": joint_type,
            "gripper_pos": self.get_joint_positions()[self.gripper_control_idx[arm]],
            "max_force": max_force,
            "contact_pos": contact_pos,
        }
        self._ag_obj_in_hand[arm] = ag_obj
        self._ag_freeze_gripper[arm] = True
        for joint in self.finger_joints[arm]:
            j_val = joint.get_state()[0][0]
            self._ag_freeze_joint_pos[arm][joint.joint_name] = j_val

    def _handle_assisted_grasping(self):
        """
        Handles assisted grasping by creating or removing constraints.
        """
        # Loop over all arms
        for arm in self.arm_names:
            # We apply a threshold based on the control rather than the command here so that the behavior
            # stays the same across different controllers and control modes (absolute / delta). This way,
            # a zero action will actually keep the AG setting where it already is.
            controller = self._controllers[f"gripper_{arm}"]
            controlled_joints = controller.dof_idx
            threshold = th.mean(
                th.stack([self.joint_lower_limits[controlled_joints], self.joint_upper_limits[controlled_joints]]),
                dim=0,
            )
            if controller.control is None:
                applying_grasp = False
            elif self._grasping_direction == "lower":
                applying_grasp = th.any(controller.control < threshold)
            else:
                applying_grasp = th.any(controller.control > threshold)
            # Execute gradual release of object
            if self._ag_obj_in_hand[arm]:
                if self._ag_release_counter[arm] is not None:
                    self._handle_release_window(arm=arm)
                else:
                    if gm.AG_CLOTH:
                        self._update_constraint_cloth(arm=arm)

                    if not applying_grasp:
                        self._release_grasp(arm=arm)
            elif applying_grasp:
                self._establish_grasp(arm=arm, ag_data=self._calculate_in_hand_object(arm=arm))

    def _update_constraint_cloth(self, arm="default"):
        """
        Update the AG constraint for cloth: for the fixed joint between the attachment point and the world, we set
        the local pos to match the current eef link position plus the attachment_point_pos_local offset. As a result,
        the joint will drive the attachment point to the updated position, which will then drive the cloth.
        See _establish_grasp_cloth for more details.

        Args:
            arm (str): specific arm to establish grasp.
                Default is "default" which corresponds to the first entry in self.arm_names
        """
        attachment_point_pos_local = self._ag_obj_constraint_params[arm]["attachment_point_pos_local"]
        eef_link_pos, eef_link_orn = self.eef_links[arm].get_position_orientation()
        attachment_point_pos, _ = T.pose_transform(
            eef_link_pos, eef_link_orn, attachment_point_pos_local, th.tensor([0, 0, 0, 1], dtype=th.float32)
        )
        joint_prim = self._ag_obj_constraints[arm]
        joint_prim.GetAttribute("physics:localPos1").Set(lazy.pxr.Gf.Vec3f(*attachment_point_pos))

    def _calculate_in_hand_object(self, arm="default"):
        if gm.AG_CLOTH:
            return self._calculate_in_hand_object_cloth(arm)
        else:
            return self._calculate_in_hand_object_rigid(arm)

    def _establish_grasp(self, arm="default", ag_data=None, contact_pos=None):
        if gm.AG_CLOTH:
            return self._establish_grasp_cloth(arm, ag_data)
        else:
            return self._establish_grasp_rigid(arm, ag_data, contact_pos)

    def _calculate_in_hand_object_cloth(self, arm="default"):
        """
        Same as _calculate_in_hand_object_rigid, except for cloth. Only one should be used at any given time.

        Calculates which object to assisted-grasp for arm @arm. Returns a (BaseObject, RigidPrim, th.Tensor) tuple or
        None if no valid AG-enabled object can be found.

        1) Check if the gripper is closed enough
        2) Go through each cloth object and check whether its attachment point link position is within the "ghost"
        box volume of the gripper link.

        Only returns the first valid object and ignores the rest.

        Args:
            arm (str): specific arm to establish grasp.
                Default is "default" which corresponds to the first entry in self.arm_names

        Returns:
            None or 3-tuple: If a valid assisted-grasp object is found,
                returns the corresponding (object, object_link, attachment_point_position), i.e.
                ((BaseObject, RigidPrim, th.Tensor)) to the contacted in-hand object. Otherwise, returns None
        """
        # TODO (eric): Assume joint_pos = 0 means fully closed
        GRIPPER_FINGER_CLOSE_THRESHOLD = 0.03
        gripper_finger_pos = self.get_joint_positions()[self.gripper_control_idx[arm]]
        gripper_finger_close = th.sum(gripper_finger_pos) < GRIPPER_FINGER_CLOSE_THRESHOLD
        if not gripper_finger_close:
            return None

        cloth_objs = self.scene.object_registry("prim_type", PrimType.CLOTH)
        if cloth_objs is None:
            return None

        # TODO (eric): Only AG one cloth at any given moment.
        # Returns the first cloth that overlaps with the "ghost" box volume
        for cloth_obj in cloth_objs:
            attachment_point_pos = cloth_obj.links["attachment_point"].get_position_orientation()[0]
            particles_in_volume = self._ag_check_in_volume[arm]([attachment_point_pos])
            if particles_in_volume.sum() > 0:
                return cloth_obj, cloth_obj.links["attachment_point"], attachment_point_pos

        return None

    def _establish_grasp_cloth(self, arm="default", ag_data=None):
        """
        Same as _establish_grasp_rigid, except for cloth. Only one should be used at any given time.
        Establishes an ag-assisted grasp, if enabled.

        Create a fixed joint between the attachment point link of the cloth object and the world.
        In theory, we could have created a fixed joint to the eef link, but omni doesn't support this as the robot has
        an articulation root API attached to it, which is incompatible with the attachment API.

        We also store attachment_point_pos_local as the attachment point position in the eef link frame when the fixed
        joint is created. As the eef link frame changes its pose, we will use attachment_point_pos_local to figure out
        the new attachment_point_pos in the world frame and set the fixed joint to there. See _update_constraint_cloth
        for more details.

        Args:
            arm (str): specific arm to establish grasp.
                Default is "default" which corresponds to the first entry in self.arm_names
            ag_data (None or 3-tuple): If specified, should be the corresponding
                (object, object_link, attachment_point_position), i.e. (BaseObject, RigidPrim, th.Tensor), of the
                contacted in-hand object
        """
        arm = self.default_arm if arm == "default" else arm

        # Return immediately if ag_data is None
        if ag_data is None:
            return

        ag_obj, ag_link, attachment_point_pos = ag_data

        # Find the attachment point position in the eef frame
        eef_link_pos, eef_link_orn = self.eef_links[arm].get_position_orientation()
        attachment_point_pos_local, _ = T.relative_pose_transform(
            attachment_point_pos, th.tensor([0, 0, 0, 1], dtype=th.float32), eef_link_pos, eef_link_orn
        )

        # Create the joint
        joint_prim_path = f"{ag_link.prim_path}/ag_constraint"
        joint_type = "FixedJoint"
        joint_prim = create_joint(
            prim_path=joint_prim_path,
            joint_type=joint_type,
            body0=ag_link.prim_path,
            body1=None,
            enabled=False,
            exclude_from_articulation=True,
            joint_frame_in_child_frame_pos=attachment_point_pos,
        )

        # Save a reference to this joint prim
        self._ag_obj_constraints[arm] = joint_prim

        # Modify max force based on user-determined assist parameters
        # TODO
        max_force = m.MIN_ASSIST_FORCE + (m.MAX_ASSIST_FORCE - m.MIN_ASSIST_FORCE) * m.ASSIST_FRACTION
        # joint_prim.GetAttribute("physics:breakForce").Set(max_force)

        self._ag_obj_constraint_params[arm] = {
            "ag_obj_prim_path": ag_obj.prim_path,
            "ag_link_prim_path": ag_link.prim_path,
            "ag_joint_prim_path": joint_prim_path,
            "joint_type": joint_type,
            "gripper_pos": self.get_joint_positions()[self.gripper_control_idx[arm]],
            "max_force": max_force,
            "attachment_point_pos_local": attachment_point_pos_local,
            "contact_pos": attachment_point_pos,
        }
        self._ag_obj_in_hand[arm] = ag_obj
        self._ag_freeze_gripper[arm] = True
        for joint in self.finger_joints[arm]:
            j_val = joint.get_state()[0][0]
            self._ag_freeze_joint_pos[arm][joint.joint_name] = j_val

    def _dump_state(self):
        # Call super first
        state = super()._dump_state()

        # If we're using actual physical grasping, no extra state needed to save
        if self.grasping_mode == "physical":
            return state

        # Include AG_state
        ag_params = self._ag_obj_constraint_params.copy()
        for arm in ag_params.keys():
            if len(ag_params[arm]) > 0:
                assert self.scene is not None, "Cannot get position and orientation relative to scene without a scene"
                ag_params[arm]["contact_pos"], _ = self.scene.convert_world_pose_to_scene_relative(
                    ag_params[arm]["contact_pos"],
                    th.tensor([0, 0, 0, 1], dtype=th.float32),
                )
        state["ag_obj_constraint_params"] = ag_params
        return state

    def _load_state(self, state):
        # If there is an existing AG object, remove it
        self.release_grasp_immediately()

        super()._load_state(state=state)

        # No additional loading needed if we're using physical grasping
        if self.grasping_mode == "physical":
            return

        # Include AG_state
        # TODO: currently does not take care of cloth objects
        # TODO: add unit tests
        for arm in state["ag_obj_constraint_params"].keys():
            if len(state["ag_obj_constraint_params"][arm]) > 0:
                data = state["ag_obj_constraint_params"][arm]
                obj = self.scene.object_registry("prim_path", data["ag_obj_prim_path"])
                link = obj.links[data["ag_link_prim_path"].split("/")[-1]]
                contact_pos_global = data["contact_pos"]
                assert self.scene is not None, "Cannot set position and orientation relative to scene without a scene"
                contact_pos_global, _ = self.scene.convert_scene_relative_pose_to_world(
                    contact_pos_global,
                    th.tensor([0, 0, 0, 1], dtype=th.float32),
                )
                self._establish_grasp(arm=arm, ag_data=(obj, link), contact_pos=contact_pos_global)

    def serialize(self, state):
        # Call super first
        state_flat = super().serialize(state=state)

        # No additional serialization needed if we're using physical grasping
        if self.grasping_mode == "physical":
            return state_flat

        # TODO AG
        return state_flat

    def deserialize(self, state):
        # Call super first
        state_dict, idx = super().deserialize(state=state)

        # No additional deserialization needed if we're using physical grasping
        if self.grasping_mode == "physical":
            return state_dict, idx

        # TODO AG
        return state_dict, idx

    @classproperty
    def _do_not_register_classes(cls):
        # Don't register this class since it's an abstract template
        classes = super()._do_not_register_classes
        classes.add("ManipulationRobot")
        return classes

    @property
    def eef_usd_path(self):
        """
        Returns:
            dict(str, str): dict mapping arm name to the path to the eef usd file
        """
        raise NotImplementedError

    @property
    def teleop_rotation_offset(self):
        """
        Rotational offset that will be applied for teleoperation
        such that [0, 0, 0, 1] as action will keep the robot eef pointing at +x axis
        """
        return {arm: th.tensor([0, 0, 0, 1]) for arm in self.arm_names}

    def teleop_data_to_action(self, teleop_action) -> th.Tensor:
        """
        Generate action data from teleoperation action data
        NOTE: This implementation only supports IK/OSC controller for arm and MultiFingerGripperController for gripper.
        Overwrite this function if the robot is using a different controller.
        Args:
            teleop_action (TeleopAction): teleoperation action data
        Returns:
            th.tensor: array of action data for arm and gripper
        """
        action = super().teleop_data_to_action(teleop_action)
        hands = ["left", "right"] if self.n_arms == 2 else ["right"]
        for i, hand in enumerate(hands):
            arm_name = self.arm_names[i]
            arm_action = th.tensor(teleop_action[hand]).float()
            # arm action
            assert isinstance(self._controllers[f"arm_{arm_name}"], InverseKinematicsController) or isinstance(
                self._controllers[f"arm_{arm_name}"], OperationalSpaceController
            ), f"Only IK and OSC controllers are supported for arm {arm_name}!"
            target_pos, target_orn = arm_action[:3], T.quat2axisangle(T.euler2quat(arm_action[3:6]))
            action[self.arm_action_idx[arm_name]] = th.cat((target_pos, target_orn))
            # gripper action
            assert isinstance(
                self._controllers[f"gripper_{arm_name}"], MultiFingerGripperController
            ), f"Only MultiFingerGripperController is supported for gripper {arm_name}!"
            action[self.gripper_action_idx[arm_name]] = arm_action[6]
        return action

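The assisted-grasp trigger in _handle_assisted_grasping compares the gripper controller's low-level control values against the midpoint of each controlled joint's limits, which keeps the behavior consistent across controller types and command conventions. The following is a minimal standalone sketch of that check, using made-up joint limits and control values (the real method reads these from the robot and its gripper controller):

import torch as th

# Hypothetical joint limits and current low-level control values for a two-finger gripper
joint_lower_limits = th.tensor([0.0, 0.0])
joint_upper_limits = th.tensor([0.05, 0.05])
control = th.tensor([0.01, 0.012])
grasping_direction = "lower"  # "lower" => the lower joint limit corresponds to a closed grasp

# Per-joint midpoint of the joint range, as in _handle_assisted_grasping
threshold = th.mean(th.stack([joint_lower_limits, joint_upper_limits]), dim=0)

if grasping_direction == "lower":
    applying_grasp = th.any(control < threshold)
else:
    applying_grasp = th.any(control > threshold)

print(applying_grasp)  # tensor(True): the fingers are commanded past the midpoint, so AG may engage
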
arm_control_idx property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to indices in low-level control vector corresponding to arm joints.

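For example, these indices can be used to slice the robot's full joint-space vectors. A hedged snippet, assuming robot refers to an already-loaded ManipulationRobot instance:

arm = robot.default_arm
q = robot.get_joint_positions()

arm_q = q[robot.arm_control_idx[arm]]          # joint positions of this arm's joints
gripper_q = q[robot.gripper_control_idx[arm]]  # joint positions of this arm's gripper joints
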
arm_joint_names abstractmethod property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to corresponding arm joint names, should correspond to specific joint names in this robot's underlying model file

Note: the ordering within the dictionary is assumed to be intentional, and is directly used to define the set of corresponding control idxs.

arm_link_names abstractmethod property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to corresponding arm link names, should correspond to specific link names in this robot's underlying model file

Note: the ordering within the dictionary is assumed to be intentional, and is directly used to define the set of corresponding idxs.

arm_links property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to robot links corresponding to that arm's links

arm_workspace_range property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to a tuple indicating the start and end of the angular range of the arm workspace around the Z axis of the robot, where 0 is facing forward.

assisted_grasp_end_points property

Returns:

Type Description
dict

Dictionary mapping individual arm appendage names to array of GraspingPoint tuples, composed of (link_name, position) values specifying valid grasping end points located at cartesian (x,y,z) coordinates specified in link_name's local coordinate frame. These values will be used in conjunction with @self.assisted_grasp_start_points to trigger assisted grasps, where objects that intersect with any ray starting at any point in @self.assisted_grasp_start_points and terminating at any point in @self.assisted_grasp_end_points will trigger an assisted grasp (calculated individually for each gripper appendage). By default, each entry returns None, and must be implemented by any robot subclass that wishes to use assisted grasping.

assisted_grasp_start_points property

Returns:

Type Description
dict

Dictionary mapping individual arm appendage names to array of GraspingPoint tuples, composed of (link_name, position) values specifying valid grasping start points located at cartesian (x,y,z) coordinates specified in link_name's local coordinate frame. These values will be used in conjunction with @self.assisted_grasp_end_points to trigger assisted grasps, where objects that intersect with any ray starting at any point in @self.assisted_grasp_start_points and terminating at any point in @self.assisted_grasp_end_points will trigger an assisted grasp (calculated individually for each gripper appendage). By default, each entry returns None, and must be implemented by any robot subclass that wishes to use assisted grasping.

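Robots that opt into assisted grasping typically override both of the above properties with small per-arm lists of GraspingPoint entries. A hedged sketch of what such an override might look like, assuming GraspingPoint is the (link_name, position) namedtuple defined alongside ManipulationRobot; the link names and offsets below are invented placeholders rather than values from any real robot:

import torch as th

from omnigibson.robots.manipulation_robot import GraspingPoint


class ExampleGraspPointsMixin:
    """Illustrative mixin for a hypothetical ManipulationRobot subclass (other required overrides omitted)."""

    @property
    def assisted_grasp_start_points(self):
        # Rays are cast from every start point toward every end point (per gripper);
        # objects intersected by any such ray become candidates for assisted grasping.
        return {
            self.default_arm: [
                GraspingPoint(link_name="gripper_finger_right", position=th.tensor([0.0, 0.0, 0.02])),
            ]
        }

    @property
    def assisted_grasp_end_points(self):
        return {
            self.default_arm: [
                GraspingPoint(link_name="gripper_finger_left", position=th.tensor([0.0, 0.0, 0.02])),
            ]
        }
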
default_arm property

Returns:

Type Description
str

Default arm name for this robot, corresponds to the first entry in @arm_names by default

eef_link_names abstractmethod property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to corresponding name of the EEF link, should correspond to specific link name in this robot's underlying model file

eef_links property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to robot link corresponding to that arm's eef link

eef_usd_path property

Returns:

Type Description
dict(str, str)

dict mapping arm name to the path to the eef usd file

finger_joint_names abstractmethod property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to array of joint names corresponding to this robot's fingers.

Note: the ordering within the dictionary is assumed to be intentional, and is directly used to define the set of corresponding control idxs.

finger_joints property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to robot joints corresponding to that arm's finger joints

finger_lengths property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to corresponding length of the fingers in that hand defined from the palm (assuming all fingers in one hand are equally long)

finger_link_names abstractmethod property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to array of link names corresponding to this robot's fingers

Note: the ordering within the dictionary is assumed to be intentional, and is directly used to define the set of corresponding idxs.

finger_links property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to robot links corresponding to that arm's finger links

grasping_mode property

Grasping mode of this robot. Is one of AG_MODES

Returns:

Type Description
str

Grasping mode for this robot

gripper_control_idx property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to indices in low-level control vector corresponding to gripper joints.

robot_arm_descriptor_yamls property

Returns:

Type Description
dict

Dictionary mapping arm appendage name to the file path of the descriptor of the robot for the IK Controller.

teleop_rotation_offset property

Rotational offset that will be applied for teleoperation such that [0, 0, 0, 1] as action will keep the robot eef pointing at +x axis

__init__(name, relative_prim_path=None, scale=None, visible=True, fixed_base=False, visual_only=False, self_collisions=False, load_config=None, abilities=None, control_freq=None, controller_config=None, action_type='continuous', action_normalize=True, reset_joint_pos=None, obs_modalities=('rgb', 'proprio'), proprio_obs='default', sensor_config=None, grasping_mode='physical', grasping_direction='lower', disable_grasp_handling=False, **kwargs)

Parameters:

Name Type Description Default
name str

Name for the object. Names need to be unique per scene

required
relative_prim_path str

Scene-local prim path of the Prim to encapsulate or create.

None
scale None or float or 3-array

if specified, sets either the uniform (float) or x,y,z (3-array) scale for this object. A single number corresponds to uniform scaling along the x,y,z axes, whereas a 3-array specifies per-axis scaling.

None
visible bool

whether to render this object or not in the stage

True
fixed_base bool

whether to fix the base of this object or not

False
visual_only bool

Whether this object should be visual only (and not collide with any other objects)

False
self_collisions bool

Whether to enable self collisions for this object

False
load_config None or dict

If specified, should contain keyword-mapped values that are relevant for loading this prim at runtime.

None
abilities None or dict

If specified, manually adds specific object states to this object. It should be a dict in the form of {ability: {param: value}} containing object abilities and parameters to pass to the object state instance constructor.

None
control_freq float

control frequency (in Hz) at which to control the object. If set to be None, we will automatically set the control frequency to be at the render frequency by default.

None
controller_config None or dict

nested dictionary mapping controller name(s) to specific controller configurations for this object. This will override any default values specified by this class.

None
action_type str

one of {discrete, continuous} - what type of action space to use

'continuous'
action_normalize bool

whether to normalize inputted actions. This will override any default values specified by this class.

True
reset_joint_pos None or n-array

if specified, should be the joint positions that the object should be set to during a reset. If None (default), self._default_joint_pos will be used instead.

None
obs_modalities str or list of str

Observation modalities to use for this robot. Default is "all", which corresponds to all modalities being used. Otherwise, valid options should be part of omnigibson.sensors.ALL_SENSOR_MODALITIES. Note: If @sensor_config explicitly specifies modalities for a given sensor class, it will override any values specified from @obs_modalities!

('rgb', 'proprio')
proprio_obs str or list of str

proprioception observation key(s) to use for generating proprioceptive observations. If str, should be exactly "default" -- this results in the default proprioception observations being used, as defined by self.default_proprio_obs. See self._get_proprioception_dict for valid key choices

'default'
sensor_config None or dict

nested dictionary mapping sensor class name(s) to specific sensor configurations for this object. This will override any default values specified by this class.

None
grasping_mode str

One of {"physical", "assisted", "sticky"}. If "physical", no assistive grasping will be applied (relies on contact friction + finger force). If "assisted", will magnetize any object touching and within the gripper's fingers. In this mode, at least two "fingers" need to touch the object. If "sticky", will magnetize any object touching the gripper's fingers. In this mode, only one finger needs to touch the object.

'physical'
grasping_direction str

One of {"lower", "upper"}. If "lower", lower limit represents a closed grasp, otherwise upper limit represents a closed grasp.

'lower'
disable_grasp_handling bool

If True, the robot will not automatically handle assisted or sticky grasps. Instead, you will need to call the grasp handling methods yourself.

False
kwargs dict

Additional keyword arguments that are used for other super() calls from subclasses, allowing for flexible compositions of various object subclasses (e.g.: Robot is USDObject + ControllableObject).

{}
Source code in omnigibson/robots/manipulation_robot.py
def __init__(
    self,
    # Shared kwargs in hierarchy
    name,
    relative_prim_path=None,
    scale=None,
    visible=True,
    fixed_base=False,
    visual_only=False,
    self_collisions=False,
    load_config=None,
    # Unique to USDObject hierarchy
    abilities=None,
    # Unique to ControllableObject hierarchy
    control_freq=None,
    controller_config=None,
    action_type="continuous",
    action_normalize=True,
    reset_joint_pos=None,
    # Unique to BaseRobot
    obs_modalities=("rgb", "proprio"),
    proprio_obs="default",
    sensor_config=None,
    # Unique to ManipulationRobot
    grasping_mode="physical",
    grasping_direction="lower",
    disable_grasp_handling=False,
    **kwargs,
):
    """
    Args:
        name (str): Name for the object. Names need to be unique per scene
        relative_prim_path (str): Scene-local prim path of the Prim to encapsulate or create.
        scale (None or float or 3-array): if specified, sets either the uniform (float) or x,y,z (3-array) scale
            for this object. A single number corresponds to uniform scaling along the x,y,z axes, whereas a
            3-array specifies per-axis scaling.
        visible (bool): whether to render this object or not in the stage
        fixed_base (bool): whether to fix the base of this object or not
        visual_only (bool): Whether this object should be visual only (and not collide with any other objects)
        self_collisions (bool): Whether to enable self collisions for this object
        load_config (None or dict): If specified, should contain keyword-mapped values that are relevant for
            loading this prim at runtime.
        abilities (None or dict): If specified, manually adds specific object states to this object. It should be
            a dict in the form of {ability: {param: value}} containing object abilities and parameters to pass to
            the object state instance constructor.
        control_freq (float): control frequency (in Hz) at which to control the object. If set to be None,
            we will automatically set the control frequency to be at the render frequency by default.
        controller_config (None or dict): nested dictionary mapping controller name(s) to specific controller
            configurations for this object. This will override any default values specified by this class.
        action_type (str): one of {discrete, continuous} - what type of action space to use
        action_normalize (bool): whether to normalize inputted actions. This will override any default values
            specified by this class.
        reset_joint_pos (None or n-array): if specified, should be the joint positions that the object should
            be set to during a reset. If None (default), self._default_joint_pos will be used instead.
        obs_modalities (str or list of str): Observation modalities to use for this robot. Default is "all", which
            corresponds to all modalities being used.
            Otherwise, valid options should be part of omnigibson.sensors.ALL_SENSOR_MODALITIES.
            Note: If @sensor_config explicitly specifies `modalities` for a given sensor class, it will
                override any values specified from @obs_modalities!
        proprio_obs (str or list of str): proprioception observation key(s) to use for generating proprioceptive
            observations. If str, should be exactly "default" -- this results in the default proprioception
            observations being used, as defined by self.default_proprio_obs. See self._get_proprioception_dict
            for valid key choices
        sensor_config (None or dict): nested dictionary mapping sensor class name(s) to specific sensor
            configurations for this object. This will override any default values specified by this class.
        grasping_mode (str): One of {"physical", "assisted", "sticky"}.
            If "physical", no assistive grasping will be applied (relies on contact friction + finger force).
            If "assisted", will magnetize any object touching and within the gripper's fingers. In this mode,
                at least two "fingers" need to touch the object.
            If "sticky", will magnetize any object touching the gripper's fingers. In this mode, only one finger
                needs to touch the object.
        grasping_direction (str): One of {"lower", "upper"}. If "lower", lower limit represents a closed grasp,
            otherwise upper limit represents a closed grasp.
        disable_grasp_handling (bool): If True, the robot will not automatically handle assisted or sticky grasps.
            Instead, you will need to call the grasp handling methods yourself.
        kwargs (dict): Additional keyword arguments that are used for other super() calls from subclasses, allowing
            for flexible compositions of various object subclasses (e.g.: Robot is USDObject + ControllableObject).
    """
    # Store relevant internal vars
    assert_valid_key(key=grasping_mode, valid_keys=AG_MODES, name="grasping_mode")
    assert_valid_key(key=grasping_direction, valid_keys=["lower", "upper"], name="grasping direction")
    self._grasping_mode = grasping_mode
    self._grasping_direction = grasping_direction
    self._disable_grasp_handling = disable_grasp_handling

    # Initialize other variables used for assistive grasping
    self._ag_freeze_joint_pos = {
        arm: {} for arm in self.arm_names
    }  # Frozen positions for keeping fingers held still
    self._ag_obj_in_hand = {arm: None for arm in self.arm_names}
    self._ag_obj_constraints = {arm: None for arm in self.arm_names}
    self._ag_obj_constraint_params = {arm: {} for arm in self.arm_names}
    self._ag_freeze_gripper = {arm: None for arm in self.arm_names}
    self._ag_release_counter = {arm: None for arm in self.arm_names}
    self._ag_check_in_volume = {arm: None for arm in self.arm_names}
    self._ag_calculate_volume = {arm: None for arm in self.arm_names}

    # Call super() method
    super().__init__(
        relative_prim_path=relative_prim_path,
        name=name,
        scale=scale,
        visible=visible,
        fixed_base=fixed_base,
        visual_only=visual_only,
        self_collisions=self_collisions,
        load_config=load_config,
        abilities=abilities,
        control_freq=control_freq,
        controller_config=controller_config,
        action_type=action_type,
        action_normalize=action_normalize,
        reset_joint_pos=reset_joint_pos,
        obs_modalities=obs_modalities,
        proprio_obs=proprio_obs,
        sensor_config=sensor_config,
        **kwargs,
    )

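In practice these constructor arguments are usually supplied through an OmniGibson environment config rather than by instantiating the class directly. A hedged sketch, assuming a Fetch robot (whose single arm is named "0", so its controller groups are arm_0 and gripper_0) loaded into an empty scene; the exact config keys may differ slightly across OmniGibson versions:

import omnigibson as og

cfg = {
    "scene": {"type": "Scene"},  # empty scene
    "robots": [
        {
            "type": "Fetch",
            "obs_modalities": ["rgb", "proprio"],
            "grasping_mode": "sticky",      # one of {"physical", "assisted", "sticky"}
            "grasping_direction": "lower",  # lower joint limit == closed grasp
            "controller_config": {
                "arm_0": {"name": "InverseKinematicsController", "mode": "pose_delta_ori"},
                "gripper_0": {"name": "MultiFingerGripperController", "mode": "binary"},
            },
        }
    ],
}

env = og.Environment(configs=cfg)
robot = env.robots[0]
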
arm_names()

Returns:

Type Description
list of str

List of arm names for this robot. Should correspond to the keys used to index into arm- and gripper-related dictionaries, e.g.: eef_link_names, finger_link_names, etc. Default is string enumeration based on @self.n_arms.

Source code in omnigibson/robots/manipulation_robot.py
@classproperty
def arm_names(cls):
    """
    Returns:
        list of str: List of arm names for this robot. Should correspond to the keys used to index into
            arm- and gripper-related dictionaries, e.g.: eef_link_names, finger_link_names, etc.
            Default is string enumeration based on @self.n_arms.
    """
    return [str(i) for i in range(cls.n_arms)]

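Because the default arm names are derived from n_arms, a bimanual subclass mainly needs to override that count; every arm- and gripper-keyed dictionary (and the arm_0/arm_1, gripper_0/gripper_1 controller groups) then follows automatically. A heavily abbreviated, hedged sketch: a usable robot must also implement the abstract properties (arm_joint_names, finger_joint_names, etc.) and provide a model, and the classproperty import path is assumed from OmniGibson's utilities:

from omnigibson.robots.manipulation_robot import ManipulationRobot
from omnigibson.utils.python_utils import classproperty


class ExampleBimanualRobot(ManipulationRobot):
    @classproperty
    def n_arms(cls):
        return 2


print(ExampleBimanualRobot.arm_names)  # ['0', '1'] via the default enumeration above
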
get_eef_orientation(arm='default')

Parameters:

Name Type Description Default
arm str

specific arm to grab eef orientation. Default is "default" which corresponds to the first entry in self.arm_names

'default'

Returns:

Type Description
3-array

(x,y,z,w) global quaternion orientation for this robot's end-effector corresponding to arm @arm

Source code in omnigibson/robots/manipulation_robot.py
def get_eef_orientation(self, arm="default"):
    """
    Args:
        arm (str): specific arm to grab eef orientation. Default is "default" which corresponds to the first entry
            in self.arm_names

    Returns:
        3-array: (x,y,z,w) global quaternion orientation for this robot's end-effector corresponding
            to arm @arm
    """
    arm = self.default_arm if arm == "default" else arm
    return self._links[self.eef_link_names[arm]].get_position_orientation()[1]

get_eef_position(arm='default')

Parameters:

Name Type Description Default
arm str

specific arm to grab eef position. Default is "default" which corresponds to the first entry in self.arm_names

'default'

Returns:

Type Description
3-array

(x,y,z) global end-effector Cartesian position for this robot's end-effector corresponding to arm @arm

Source code in omnigibson/robots/manipulation_robot.py
def get_eef_position(self, arm="default"):
    """
    Args:
        arm (str): specific arm to grab eef position. Default is "default" which corresponds to the first entry
            in self.arm_names

    Returns:
        3-array: (x,y,z) global end-effector Cartesian position for this robot's end-effector corresponding
            to arm @arm
    """
    arm = self.default_arm if arm == "default" else arm
    return self._links[self.eef_link_names[arm]].get_position_orientation()[0]

get_relative_eef_ang_vel(arm='default')

Parameters:

Name Type Description Default
arm str

specific arm to grab relative eef angular velocity. Default is "default" which corresponds to the first entry in self.arm_names

'default'

Returns:

Type Description
3-array

(ax,ay,az) angular velocity of end-effector relative to robot base frame

Source code in omnigibson/robots/manipulation_robot.py
def get_relative_eef_ang_vel(self, arm="default"):
    """
    Args:
        arm (str): specific arm to grab relative eef angular velocity.
            Default is "default" which corresponds to the first entry in self.arm_names

    Returns:
        3-array: (ax,ay,az) angular velocity of end-effector relative to robot base frame
    """
    arm = self.default_arm if arm == "default" else arm
    base_link_quat = self.get_position_orientation()[1]
    return T.quat2mat(base_link_quat).T @ self.eef_links[arm].get_angular_velocity()

get_relative_eef_lin_vel(arm='default')

Parameters:

Name Type Description Default
arm str

specific arm to grab relative eef linear velocity. Default is "default" which corresponds to the first entry in self.arm_names

'default'

Returns:

Type Description
3-array

(x,y,z) Linear velocity of end-effector relative to robot base frame

Source code in omnigibson/robots/manipulation_robot.py
def get_relative_eef_lin_vel(self, arm="default"):
    """
    Args:
        arm (str): specific arm to grab relative eef linear velocity.
            Default is "default" which corresponds to the first entry in self.arm_names


    Returns:
        3-array: (x,y,z) Linear velocity of end-effector relative to robot base frame
    """
    arm = self.default_arm if arm == "default" else arm
    base_link_quat = self.get_position_orientation()[1]
    return T.quat2mat(base_link_quat).T @ self.eef_links[arm].get_linear_velocity()

get_relative_eef_orientation(arm='default')

Parameters:

Name Type Description Default
arm str

specific arm to grab relative eef orientation. Default is "default" which corresponds to the first entry in self.arm_names

'default'

Returns:

Type Description
4-array

(x,y,z,w) quaternion orientation of end-effector relative to robot base frame

Source code in omnigibson/robots/manipulation_robot.py
def get_relative_eef_orientation(self, arm="default"):
    """
    Args:
        arm (str): specific arm to grab relative eef orientation.
            Default is "default" which corresponds to the first entry in self.arm_names

    Returns:
        4-array: (x,y,z,w) quaternion orientation of end-effector relative to robot base frame
    """
    arm = self.default_arm if arm == "default" else arm
    return self.get_relative_eef_pose(arm=arm)[1]

get_relative_eef_pose(arm='default', mat=False)

Parameters:

Name Type Description Default
arm str

specific arm to grab eef pose. Default is "default" which corresponds to the first entry in self.arm_names

'default'
mat bool

whether to return pose in matrix form (mat=True) or (pos, quat) tuple (mat=False)

False

Returns:

Type Description
2-tuple or (4, 4)-array

End-effector pose, either in 4x4 homogeneous matrix form (if @mat=True) or (pos, quat) tuple (if @mat=False), corresponding to arm @arm

Source code in omnigibson/robots/manipulation_robot.py
def get_relative_eef_pose(self, arm="default", mat=False):
    """
    Args:
        arm (str): specific arm to grab eef pose. Default is "default" which corresponds to the first entry
            in self.arm_names
        mat (bool): whether to return pose in matrix form (mat=True) or (pos, quat) tuple (mat=False)

    Returns:
        2-tuple or (4, 4)-array: End-effector pose, either in 4x4 homogeneous
            matrix form (if @mat=True) or (pos, quat) tuple (if @mat=False), corresponding to arm @arm
    """
    arm = self.default_arm if arm == "default" else arm
    eef_link_pose = self.eef_links[arm].get_position_orientation()
    base_link_pose = self.get_position_orientation()
    pose = T.relative_pose_transform(*eef_link_pose, *base_link_pose)
    return T.pose2mat(pose) if mat else pose

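A hedged usage sketch, assuming robot is a loaded ManipulationRobot in a running environment, contrasting the world-frame and base-relative end-effector queries:

arm = robot.default_arm

# World-frame EEF pose
eef_pos = robot.get_eef_position(arm)        # (x, y, z)
eef_quat = robot.get_eef_orientation(arm)    # (x, y, z, w)

# The same pose expressed in the robot base frame
rel_pos, rel_quat = robot.get_relative_eef_pose(arm)

# Or as a 4x4 homogeneous matrix
rel_mat = robot.get_relative_eef_pose(arm, mat=True)
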
get_relative_eef_position(arm='default')

Parameters:

Name Type Description Default
arm str

specific arm to grab relative eef pos. Default is "default" which corresponds to the first entry in self.arm_names

'default'

Returns:

Type Description
3-array

(x,y,z) Cartesian position of end-effector relative to robot base frame

Source code in omnigibson/robots/manipulation_robot.py
def get_relative_eef_position(self, arm="default"):
    """
    Args:
        arm (str): specific arm to grab relative eef pos.
            Default is "default" which corresponds to the first entry in self.arm_names


    Returns:
        3-array: (x,y,z) Cartesian position of end-effector relative to robot base frame
    """
    arm = self.default_arm if arm == "default" else arm
    return self.get_relative_eef_pose(arm=arm)[0]

is_grasping(arm='default', candidate_obj=None)

Returns True if the robot is grasping the target object @candidate_obj, or any object if @candidate_obj is None.

Parameters:

Name Type Description Default
arm str

specific arm to check for grasping. Default is "default" which corresponds to the first entry in self.arm_names

'default'
candidate_obj StatefulObject or None

object to check if this robot is currently grasping. If None, then will be a general (object-agnostic) check for grasping. Note: if self.grasping_mode is "physical", then @candidate_obj will be ignored completely

None

Returns:

Type Description
IsGraspingState

For the specific manipulator appendage, returns IsGraspingState.TRUE if it is grasping (potentially @candidate_obj if specified), IsGraspingState.FALSE if it is not grasping, and IsGraspingState.UNKNOWN if unknown.

Source code in omnigibson/robots/manipulation_robot.py
def is_grasping(self, arm="default", candidate_obj=None):
    """
    Returns True if the robot is grasping the target object @candidate_obj, or any object if @candidate_obj is None.

    Args:
        arm (str): specific arm to check for grasping. Default is "default" which corresponds to the first entry
            in self.arm_names
        candidate_obj (StatefulObject or None): object to check if this robot is currently grasping. If None, then
            will be a general (object-agnostic) check for grasping.
            Note: if self.grasping_mode is "physical", then @candidate_obj will be ignored completely

    Returns:
        IsGraspingState: For the specific manipulator appendage, returns IsGraspingState.TRUE if it is grasping
            (potentially @candidate_obj if specified), IsGraspingState.FALSE if it is not grasping,
            and IsGraspingState.UNKNOWN if unknown.
    """
    arm = self.default_arm if arm == "default" else arm
    if self.grasping_mode != "physical":
        is_grasping_obj = (
            self._ag_obj_in_hand[arm] is not None
            if candidate_obj is None
            else self._ag_obj_in_hand[arm] == candidate_obj
        )
        is_grasping = (
            IsGraspingState.TRUE
            if is_grasping_obj and self._ag_release_counter[arm] is None
            else IsGraspingState.FALSE
        )
    else:
        # Infer from the gripper controller the state
        is_grasping = self._controllers["gripper_{}".format(arm)].is_grasping()
        # If candidate obj is not None, we also check to see if our fingers are in contact with the object
        if is_grasping == IsGraspingState.TRUE and candidate_obj is not None:
            finger_links = {link for link in self.finger_links[arm]}
            is_grasping = len(candidate_obj.states[ContactBodies].get_value().intersection(finger_links)) > 0

    return is_grasping

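A hedged usage sketch: checking whether the default arm currently holds a particular object. Here apple is a hypothetical object handle looked up from the scene's object registry; in non-physical grasping modes the check is based on the AG bookkeeping shown above:

# Hypothetical object named "apple" registered in the scene
apple = env.scene.object_registry("name", "apple")

state = robot.is_grasping(arm=robot.default_arm, candidate_obj=apple)
print(state)  # IsGraspingState.TRUE / FALSE / UNKNOWN (a plain bool on some physical-mode paths)
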
n_arms()

Returns:

Type Description
int

Number of arms this robot has. Returns 1 by default

Source code in omnigibson/robots/manipulation_robot.py
@classproperty
def n_arms(cls):
    """
    Returns:
        int: Number of arms this robot has. Returns 1 by default
    """
    return 1

release_grasp_immediately()

Magic action to release this robot's grasp for all arms at once. As opposed to @_release_grasp, this method would bypass the release window mechanism and immediately release.

Source code in omnigibson/robots/manipulation_robot.py
def release_grasp_immediately(self):
    """
    Magic action to release this robot's grasp for all arms at once.
    As opposed to @_release_grasp, this method would bypass the release window mechanism and immediately release.
    """
    for arm in self.arm_names:
        if self._ag_obj_constraints[arm] is not None:
            self._release_grasp(arm=arm)
            self._ag_release_counter[arm] = int(math.ceil(m.RELEASE_WINDOW / og.sim.get_sim_step_dt()))
            self._handle_release_window(arm=arm)
            assert not self._ag_obj_in_hand[arm], "Object still in ag list after release!"

set_position_orientation(position=None, orientation=None, frame='world')

Sets manipulation robot's pose with respect to the specified frame

Parameters:

Name Type Description Default
position None or 3-array

if specified, (x,y,z) position in the world frame Default is None, which means left unchanged.

None
orientation None or 4-array

if specified, (x,y,z,w) quaternion orientation in the world frame. Default is None, which means left unchanged.

None
frame Literal

frame to set the pose with respect to, defaults to "world". "parent" sets the pose relative to the object's parent, and "scene" sets the pose relative to the scene.

'world'
Source code in omnigibson/robots/manipulation_robot.py
def set_position_orientation(
    self, position=None, orientation=None, frame: Literal["world", "parent", "scene"] = "world"
):
    """
    Sets manipulation robot's pose with respect to the specified frame

    Args:
        position (None or 3-array): if specified, (x,y,z) position in the world frame
            Default is None, which means left unchanged.
        orientation (None or 4-array): if specified, (x,y,z,w) quaternion orientation in the world frame.
            Default is None, which means left unchanged.
        frame (Literal): frame to set the pose with respect to. Defaults to "world". "parent" sets the pose
            relative to the object's parent prim, and "scene" sets the pose relative to the scene.
    """

    # Store the original EEF poses.
    original_poses = {}
    for arm in self.arm_names:
        original_poses[arm] = (self.get_eef_position(arm), self.get_eef_orientation(arm))

    # Run the super method
    super().set_position_orientation(position=position, orientation=orientation, frame=frame)

    # Now for each hand, if it was holding an AG object, teleport it.
    for arm in self.arm_names:
        if self._ag_obj_in_hand[arm] is not None:
            original_eef_pose = T.pose2mat(original_poses[arm])
            inv_original_eef_pose = T.pose_inv(pose_mat=original_eef_pose)
            original_obj_pose = T.pose2mat(self._ag_obj_in_hand[arm].get_position_orientation())
            new_eef_pose = T.pose2mat((self.get_eef_position(arm), self.get_eef_orientation(arm)))
            # New object pose is transform:
            # original --> "De"transform the original EEF pose --> "Re"transform the new EEF pose
            new_obj_pose = new_eef_pose @ inv_original_eef_pose @ original_obj_pose
            self._ag_obj_in_hand[arm].set_position_orientation(*T.mat2pose(hmat=new_obj_pose))

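The grasped-object teleport above is plain rigid-transform bookkeeping: the object's pose relative to the end-effector is preserved across the move. A self-contained sketch of the same matrix composition with made-up poses, using torch only rather than OmniGibson's transform utilities:

import torch as th


def make_pose(x, y, z):
    # 4x4 homogeneous transform with identity rotation and the given translation
    pose = th.eye(4)
    pose[:3, 3] = th.tensor([x, y, z])
    return pose


original_eef_pose = make_pose(1.0, 0.0, 0.50)   # EEF before the robot is moved
original_obj_pose = make_pose(1.0, 0.0, 0.45)   # grasped object 5 cm below the EEF
new_eef_pose = make_pose(2.0, 1.0, 0.50)        # EEF after set_position_orientation

# Same composition as in set_position_orientation:
# new_obj = new_eef @ inv(original_eef) @ original_obj
new_obj_pose = new_eef_pose @ th.linalg.inv(original_eef_pose) @ original_obj_pose

print(new_obj_pose[:3, 3])  # tensor([2.0000, 1.0000, 0.4500]), the 5 cm offset is preserved
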
teleop_data_to_action(teleop_action)

Generate action data from teleoperation action data.

NOTE: This implementation only supports IK/OSC controllers for the arm and MultiFingerGripperController for the gripper. Overwrite this function if the robot is using a different controller.

Parameters: teleop_action (TeleopAction): teleoperation action data

Returns: th.tensor: array of action data for arm and gripper

Source code in omnigibson/robots/manipulation_robot.py
def teleop_data_to_action(self, teleop_action) -> th.Tensor:
    """
    Generate action data from teleoperation action data
    NOTE: This implementation only supports IK/OSC controller for arm and MultiFingerGripperController for gripper.
    Overwrite this function if the robot is using a different controller.
    Args:
        teleop_action (TeleopAction): teleoperation action data
    Returns:
        th.tensor: array of action data for arm and gripper
    """
    action = super().teleop_data_to_action(teleop_action)
    hands = ["left", "right"] if self.n_arms == 2 else ["right"]
    for i, hand in enumerate(hands):
        arm_name = self.arm_names[i]
        arm_action = th.tensor(teleop_action[hand]).float()
        # arm action
        assert isinstance(self._controllers[f"arm_{arm_name}"], InverseKinematicsController) or isinstance(
            self._controllers[f"arm_{arm_name}"], OperationalSpaceController
        ), f"Only IK and OSC controllers are supported for arm {arm_name}!"
        target_pos, target_orn = arm_action[:3], T.quat2axisangle(T.euler2quat(arm_action[3:6]))
        action[self.arm_action_idx[arm_name]] = th.cat((target_pos, target_orn))
        # gripper action
        assert isinstance(
            self._controllers[f"gripper_{arm_name}"], MultiFingerGripperController
        ), f"Only MultiFingerGripperController is supported for gripper {arm_name}!"
        action[self.gripper_action_idx[arm_name]] = arm_action[6]
    return action
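
When not driving the robot from a teleoperation device, the same per-arm action layout can be filled in by hand. A hedged sketch, assuming a single-arm robot whose arm controller is an IK/OSC controller in pose_delta_ori mode and whose gripper uses a binary MultiFingerGripperController, with robot and env as in the earlier construction example:

import torch as th

arm = robot.default_arm
action = th.zeros(robot.action_dim)

# 6-DOF delta-pose command: (dx, dy, dz) translation followed by an (ax, ay, az) axis-angle rotation.
# With action_normalize=True (the default), commands are interpreted in the normalized range and
# rescaled to the controller's command output limits.
action[robot.arm_action_idx[arm]] = th.tensor([0.1, 0.0, 0.0, 0.0, 0.0, 0.0])

# Binary gripper command: with grasping_direction="lower", a negative value closes the gripper
action[robot.gripper_action_idx[arm]] = -1.0

env.step(action)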