
sensor parsing #82

Closed
rhaschke opened this issue Mar 9, 2016 · 8 comments
@rhaschke
Contributor

rhaschke commented Mar 9, 2016

urdfdom has code to parse <sensor> tags according to this spec.

However, this code doesn't seem to be used anywhere. It is compiled into liburdfdom_sensor, but as symbols are hidden by default, it cannot be used externally.

We are currently in the process of proposing a new Tactile sensor and would like to add the list of parsed sensors to the ModelInterface (see here for our proposal). Any objections to that strategy? Should the robot model be kept as lean as possible instead, leaving the sensor parsing to some separate function returning a list of sensors?

Please comment on the chosen approach and we will adjust accordingly and eventually file a PR.
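To illustrate the second alternative, a standalone parser returning a list of sensors might look roughly like the sketch below. This is Python pseudocode for discussion only: the function name, the returned tuples, and the set of recognized sensor types are all hypothetical, and the actual urdfdom parser is C++.

```python
import xml.etree.ElementTree as ET

def parse_sensors(urdf_xml):
    """Return (name, parent link, sensor type) for every <sensor> element.

    Hypothetical sketch: the robot model stays lean, and sensors are
    collected by a separate function instead of being attached to it.
    """
    root = ET.fromstring(urdf_xml)
    sensors = []
    for elem in root.iter("sensor"):
        parent = elem.find("parent")
        # The sensor type is encoded as a child element (e.g. <camera>, <ray>).
        known_types = {"camera", "ray", "imu", "tactile"}
        stype = next((c.tag for c in elem if c.tag in known_types), None)
        sensors.append((elem.get("name"),
                        parent.get("link") if parent is not None else None,
                        stype))
    return sensors

urdf = """
<robot name="demo">
  <link name="base_link"/>
  <sensor name="laser">
    <parent link="base_link"/>
    <ray/>
  </sensor>
</robot>
"""
print(parse_sensors(urdf))  # [('laser', 'base_link', 'ray')]
```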

@rhaschke
Contributor Author

Thinking deeper on the topic and considering the discussion in #28, I now believe that we need a plugin mechanism to handle the open-ended number of available sensors out there. I will think further about this and make a proposal.
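To make the plugin idea concrete, one possible shape is a registry mapping sensor child tags to parser callables, so each sensor type lives in its own plugin. This is a minimal hypothetical sketch in Python, not a concrete API proposal for urdfdom:

```python
import xml.etree.ElementTree as ET

SENSOR_PARSERS = {}  # child-tag name -> parser callable

def sensor_parser(tag):
    """Decorator registering a parser plugin for one sensor child tag."""
    def register(fn):
        SENSOR_PARSERS[tag] = fn
        return fn
    return register

@sensor_parser("tactile")
def parse_tactile(elem):
    # Only this plugin needs to know tactile-specific attributes
    # (all names here are invented for the example).
    return {"type": "tactile", "channels": int(elem.get("channels", "1"))}

def parse_sensor(elem):
    """Dispatch a <sensor> element to whichever registered plugin matches."""
    for tag, parser in SENSOR_PARSERS.items():
        child = elem.find(tag)
        if child is not None:
            return parser(child)
    return None  # unknown sensor type: ignore instead of failing

elem = ET.fromstring('<sensor name="s1"><tactile channels="16"/></sensor>')
print(parse_sensor(elem))  # {'type': 'tactile', 'channels': 16}
```

The key property is that the core parser never enumerates sensor types itself; unknown sensors degrade gracefully instead of breaking the model.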

@jacquelinekay
Contributor

Thanks @rhaschke, I was composing a response pointing to that discussion but you beat me to it. :)

The discussion for adding sensors to URDF has surfaced and resurfaced a few times. As you suggested, the difficulty of supporting sensors is choosing the right abstractions.

SDF supports sensors because Gazebo can simulate a specific set of sensors (laser range finders, cameras, etc.).

One question I have is how sensors in URDF will be integrated with the rest of the ROS toolchain. For example, when a robot with a LaserScan tag in its URDF is visualized in Rviz, will Rviz know to add a LaserScan visualization with the correct transform extracted from the URDF? These kinds of questions might help motivate the proposal and make it general enough to benefit the whole community.

I'm open to reviewing new proposals for sensors in URDF. However we should consider the development overhead of the initial implementation as well as the integration with our tooling. A bit of time and effort investment in the design may help reduce that overhead.

@rhaschke
Contributor Author

> One question I have is how sensors in URDF will be integrated with the rest of the ROS toolchain. For example, when a robot with a LaserScan tag in its URDF is visualized in Rviz, will Rviz know to add a LaserScan visualization with the correct transform extracted from the URDF? These kinds of questions might help motivate the proposal and make it general enough to benefit the whole community.

Currently, we are not (yet) thinking about automatically creating additional displays in rviz. But our work points in a similar direction: given you add a sensor stream visualization to rviz (in our current case tactile data), the associated display plugin will inspect the URDF to learn the sensor transform and other sensor parameters, which are fixed and therefore not published in the sensor stream.
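For illustration, such a URDF lookup could be as simple as the following sketch. The helper name is hypothetical, and it assumes the sensor pose uses the standard URDF `<origin xyz rpy>` convention:

```python
import xml.etree.ElementTree as ET

def sensor_origin(urdf_xml, sensor_name):
    """Look up the fixed <origin> transform of a named <sensor> element.

    Hypothetical helper for a display plugin; returns ((x, y, z), (r, p, y)),
    defaulting to the identity pose when <origin> is absent.
    """
    root = ET.fromstring(urdf_xml)
    for sensor in root.iter("sensor"):
        if sensor.get("name") == sensor_name:
            origin = sensor.find("origin")
            xyz = origin.get("xyz", "0 0 0") if origin is not None else "0 0 0"
            rpy = origin.get("rpy", "0 0 0") if origin is not None else "0 0 0"
            return (tuple(map(float, xyz.split())),
                    tuple(map(float, rpy.split())))
    raise KeyError("no sensor named %r" % sensor_name)

urdf = ('<robot name="demo">'
        '  <sensor name="tactile0">'
        '    <origin xyz="0 0 0.05" rpy="0 0 1.57"/>'
        '  </sensor>'
        '</robot>')
print(sensor_origin(urdf, "tactile0"))  # ((0.0, 0.0, 0.05), (0.0, 0.0, 1.57))
```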

> I'm open to reviewing new proposals for sensors in URDF. However we should consider the development overhead of the initial implementation as well as the integration with our tooling. A bit of time and effort investment in the design may help reduce that overhead.

Thanks, I will keep you updated ;-)

@kavonszadkowski

I'm less of a ROS insider and come from a simulation-biased perspective, but I second Robert's notion of introducing a plugin mechanism. In MARS we annotate URDF with information about sensors using SMURF and use factories to create them in simulation.

But maybe that's something more fundamental that may apply to other elements beyond sensors and thus should be discussed along with other improvements for "URDF 2.0"?

@jacquelinekay
Contributor

I have a lot of ideas about extensibility/a plugin interface for the "next generation" of RDFs. I am writing them up and will post them on a public forum (and notify you both when the post is finished).

@traversaro
Contributor

@rhaschke, some time ago we set up a ROS SIG on Robotic Skins [1].

While activity in the SIG has been relatively low, I think you could find useful feedback there on the semantics of your proposal for describing tactile sensor information, even before the discussion on how to include sensor information in the "next generation of *RDFs" is complete.

[1] : http://wiki.ros.org/sig/RobotSkin

@kavonszadkowski

@sachinchitta has put together some notes on the topic of future RDFs and has pulled in a number of my suggestions, again coming from SMURF. I haven't yet found time to work in the parallels between SDF and SMURF, but a number of ideas are in there.

It's on a github page rendered from this repository.

Perhaps we could move that repo into ros and use it as a starting point for some kind of white paper or alternatively use a github wiki to put our notes together?

@jacquelinekay
Contributor

Some thoughts here:
http://discourse.ros.org/t/urdf-ng-parser-tools-wishlist/55

We would need @sachinchitta's permission to move the repo into the ROS org and I wouldn't want to fork it into the organization without asking him first :) Sachin, do you currently have any interest in helping to push major changes to URDF/whatever robot description format is used in ROS? It would be great to have your input.
