
Function

The Hole Fusion node is responsible for turning information about candidate holes into certainties about the presence and whereabouts of distinct holes in space.

Information aggregation

The Hole Fusion node, before being able to start its function, must wait for the Synchronizer, Depth and RGB nodes to publish their pieces of information to it. Upon receiving the point cloud from the Synchronizer node, the Hole Fusion node denoises the depth portion of the point cloud, so as to facilitate the application of validators, in other words filters, that indicate how valid a given candidate hole is.

Upon receiving the candidate holes detected by the Depth and Rgb nodes, along with the denoised depth image and the input RGB image, the Hole Fusion node stores the images, transforms the received candidate-hole information into a form meaningful to it, and stores that too.

When all three nodes have published what they should to the Hole Fusion node, it unlocks the Synchronizer node, so as to begin a new cycle of candidate hole detection, and then carries on with its own work.
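
To make the aggregation pattern concrete, here is a minimal, hypothetical sketch in ROS C++: store each incoming piece of information and, once all three have arrived, unlock the Synchronizer and proceed. The class, topic names and message types below are illustrative assumptions, not the package's actual ones.

```cpp
// Hypothetical sketch of the aggregation / unlock pattern; names and message
// types are illustrative, not the actual ones used by the Hole Fusion node.
#include <ros/ros.h>
#include <std_msgs/Empty.h>
#include <sensor_msgs/PointCloud2.h>

class HoleFusionSketch
{
 public:
  explicit HoleFusionSketch(ros::NodeHandle& nh)
    : haveCloud_(false), haveDepthHoles_(false), haveRgbHoles_(false)
  {
    unlockPub_ = nh.advertise<std_msgs::Empty>("synchronizer/unlock", 1);
    cloudSub_ = nh.subscribe("synchronizer/point_cloud", 1,
                             &HoleFusionSketch::cloudCallback, this);
    // ... subscriptions to the Depth and Rgb candidate-holes topics go here,
    // setting haveDepthHoles_ / haveRgbHoles_ in their callbacks.
  }

 private:
  void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg)
  {
    // Denoising of the depth part of the point cloud would happen here.
    haveCloud_ = true;
    processIfComplete();
  }

  void processIfComplete()
  {
    // Only when the point cloud and both sets of candidate holes are in
    // does the node unlock the Synchronizer and carry on with its work.
    if (haveCloud_ && haveDepthHoles_ && haveRgbHoles_)
    {
      unlockPub_.publish(std_msgs::Empty());
      haveCloud_ = haveDepthHoles_ = haveRgbHoles_ = false;
      // ... hole merging, sifting and validation follow.
    }
  }

  ros::Publisher unlockPub_;
  ros::Subscriber cloudSub_;
  bool haveCloud_, haveDepthHoles_, haveRgbHoles_;
};
```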

Hole Merging

Let's consider a physical hole, detected by both the Depth and Rgb nodes. Due to the imperfect nature of the depth sensor and imperfections in the analyses undertaken, the candidate holes produced by either node may be fragmented or distorted to an uncertain degree. Thus, in order to obtain the most accurate hole outlines, the holes produced by each analysis need to be merged.

Three merging operations can be identified between two such candidate holes:

  • Assimilation, where a hole's outline is completely enveloped by the other's outline,

  • Amalgamation, where the two holes' outlines overlap each other, and

  • Connection, where the two holes' outlines do not meet each other.

Each operation is performed sequentially, until all possible merges occur. If depth analysis is possible, the result of each merger is checked against credible depth filters, in order to evaluate the merger's validity. If depth analysis is not possible, each merger is confirmed unconditionally.
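
As a rough illustration of how two candidate-hole outlines could be classified into one of these three operations, here is a sketch based on OpenCV's pointPolygonTest; the types and the containment criterion are simplifying assumptions, not the package's actual implementation.

```cpp
#include <opencv2/imgproc/imgproc.hpp>
#include <vector>

enum MergeOperation { ASSIMILATION, AMALGAMATION, CONNECTION };

// Classifies the relation between two candidate-hole outlines by counting how
// many of B's outline points fall inside A's outline.
MergeOperation classifyMerge(const std::vector<cv::Point2f>& outlineA,
                             const std::vector<cv::Point2f>& outlineB)
{
  int pointsInside = 0;
  for (size_t i = 0; i < outlineB.size(); ++i)
  {
    if (cv::pointPolygonTest(outlineA, outlineB[i], false) >= 0)
    {
      ++pointsInside;
    }
  }

  if (pointsInside == static_cast<int>(outlineB.size()))
  {
    return ASSIMILATION;   // B's outline is completely enveloped by A's
  }
  if (pointsInside > 0)
  {
    return AMALGAMATION;   // the two outlines overlap each other
  }
  return CONNECTION;       // the two outlines do not meet
}
```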

Hole Sifting

Each hole at hand, so far, is a candidate hole. In order to ascertain a candidate hole's validity, a series of Depth-based and/or RGB-based filters has to be applied (Depth-based filters can only be applied when depth analysis is possible).

Succinctly, nine filters have been implemented:

  • Depth / area
  • Depth difference
  • Depth homogeneity
  • Bounding rectangle plane constitution
  • Intermediate points plane constitution
  • Luminosity difference
  • Colour homogeneity
  • Texture difference
  • Texture backprojection

Each filter produces a validity probability per hole. Note that each probability is on a scale of its own, meaning that, empirically and experimentally, a probability of 0.2 may be a low one for a particular filter and a high one for another.

Let's run through them.

Depth-based filters

  • Depth / area

A mathematical function f(mean / area), relating the mean depth inside a candidate hole to its area, was crafted specifically for the holes found in the arenas of the RoboCup competition. A particular candidate hole is considered valid, and given a probability of 1.0, if

min_area < f(mean / area) < max_area
  • Depth difference

The depth difference filter checks the difference in depth between a candidate hole's keypoint d_k and the mean depth of the vertices of its bounding box d_v. A candidate hole is considered valid if

min_depth_cutoff < d_k - d_v < max_depth_cutoff

Reasonably, min_depth_cutoff is set to a small positive floating point value and max_depth_cutoff to the maximum aforementioned depth difference witnessed in the valid holes of the RoboCup competition arenas.

The validity probability of a candidate hole that passes the above check can be assigned either by a normalized gaussian distribution, for given values of mean and standard deviation, or by a binary ascertainer.
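
A minimal sketch of how this filter's probability could be assigned is given below; the function name and the default mean and standard deviation are placeholder assumptions, not the tuned values.

```cpp
#include <cmath>

double depthDifferenceProbability(double keypointDepth,    // depth at the hole's keypoint
                                  double meanVertexDepth,  // mean depth of the bounding box vertices
                                  double minCutoff, double maxCutoff,
                                  bool useGaussian,
                                  double mean = 0.15, double stdDev = 0.2)
{
  const double diff = keypointDepth - meanVertexDepth;

  // Outside the acceptable range: the candidate hole fails the check.
  if (diff < minCutoff || diff > maxCutoff)
  {
    return 0.0;
  }

  // Binary ascertainer: pass / fail only.
  if (!useGaussian)
  {
    return 1.0;
  }

  // Normalized gaussian: equals 1.0 when diff == mean and decays smoothly.
  return std::exp(-0.5 * std::pow((diff - mean) / stdDev, 2.0));
}
```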

  • Depth homogeneity

A valid hole is expected to have objects in it. If edges are detected inside the outline of a candidate hole, the ratio of non-zero points inside the candidate hole's outline to the total number of points in it is returned as a validity probability.
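
Assuming an edge map of the depth image has already been computed, the ratio could be obtained along the following lines; the function and argument names are illustrative.

```cpp
#include <opencv2/imgproc/imgproc.hpp>
#include <vector>

double depthHomogeneity(const cv::Mat& edges,                  // CV_8UC1 edge map of the depth image
                        const std::vector<cv::Point>& outline) // the candidate hole's outline
{
  // Rasterize the outline into a filled mask (thickness -1 fills the contour).
  cv::Mat mask = cv::Mat::zeros(edges.size(), CV_8UC1);
  std::vector<std::vector<cv::Point> > contours(1, outline);
  cv::drawContours(mask, contours, 0, cv::Scalar(255), -1);

  const int totalPoints = cv::countNonZero(mask);
  if (totalPoints == 0)
  {
    return 0.0;
  }

  // Keep only the edge pixels that fall inside the outline.
  cv::Mat edgesInside;
  edges.copyTo(edgesInside, mask);

  return static_cast<double>(cv::countNonZero(edgesInside)) / totalPoints;
}
```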

  • Bounding rectangle plane constitution

A valid hole is expected to lie on a planar surface. This filter returns as a validity probability the ratio of points of a candidate hole's bounding rectangle that lie on the same plane, to the total number of its points.

  • Intermediate points plane constitution

This is a variation of the above filter, one that is more thorough and thus more strict. Intermediate points are the points that are internal to a hole's bounding rectangle but external to its outline. The validity probability of a candidate hole is the ratio of the number of intermediate points that lie on the same plane, to the total number of intermediate points.

  • Luminosity difference

A valid hole is expected to be one in which the brightness of the points inside its outline is significantly lower than that of the points outside it. The validity probability of a candidate hole is the difference between the mean luminosity of the points inside its bounding rectangle but outside its outline, and the mean luminosity of the points inside its outline.
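
Assuming masks for the interior and for the intermediate region are available, the measure could look roughly like this; the names and the scaling of the result into [0, 1] are assumptions of the sketch.

```cpp
#include <opencv2/core/core.hpp>
#include <algorithm>

double luminosityDifference(const cv::Mat& grayscale,        // CV_8UC1 brightness image
                            const cv::Mat& insideMask,        // points inside the outline
                            const cv::Mat& intermediateMask)  // inside the rectangle, outside the outline
{
  // Mean brightness of each region, scaled to [0, 1].
  const double meanInside = cv::mean(grayscale, insideMask)[0] / 255.0;
  const double meanIntermediate = cv::mean(grayscale, intermediateMask)[0] / 255.0;

  // A dark interior against a brighter surround yields a high probability.
  return std::max(0.0, meanIntermediate - meanInside);
}
```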

  • Colour homogeneity

A valid hole is expected to feature a variety of colours inside its outline. In contrast, an invalid one is either a through-and-through opening, in which case what is on the other side is most likely another piece of plain wooden wall, or no opening at all, in which case the RGB sensor sees a colour-wise homogeneous region. The colours inside a candidate hole's outline are quantized, and the number of different colour values is counted per colour component. The validity probability of a candidate hole is the ratio of the number of different colours seen to the total number of quantized colours that can be seen.
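
A simplified sketch of such a measure follows; the number of quantization levels and the way quantized colours are keyed are assumptions, not the package's actual parameters.

```cpp
#include <opencv2/core/core.hpp>
#include <set>

double colourHomogeneity(const cv::Mat& rgb,      // CV_8UC3 input image
                         const cv::Mat& mask,     // CV_8UC1, non-zero inside the outline
                         int levelsPerChannel = 8)
{
  const int step = 256 / levelsPerChannel;
  std::set<int> seen;

  for (int r = 0; r < rgb.rows; ++r)
  {
    for (int c = 0; c < rgb.cols; ++c)
    {
      if (mask.at<uchar>(r, c) == 0) continue;

      // Quantize each channel and key the resulting colour triplet.
      const cv::Vec3b& px = rgb.at<cv::Vec3b>(r, c);
      const int key = (px[0] / step) * levelsPerChannel * levelsPerChannel
                    + (px[1] / step) * levelsPerChannel
                    + (px[2] / step);
      seen.insert(key);
    }
  }

  // Ratio of quantized colours seen to all quantized colours possible.
  const double total = levelsPerChannel * levelsPerChannel * levelsPerChannel;
  return seen.size() / total;
}
```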

  • Texture difference

The texture of a valid hole's points beyond its outline is expected to be wooden, while that of the points inside it is not. The histograms of the image of the points inside the outline of a hole and of the image of the points outside it but inside its bounding box are compared against a model histogram built from images of walls. A candidate hole is considered valid if the correlation of the former histogram to the model histogram is looser than the correlation of the latter to the model histogram. The validity probability of a candidate hole is then the difference between the two correlations.
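
Assuming the three histograms have already been computed elsewhere, the comparison could look like this; clamping the difference to a non-negative value is an added assumption.

```cpp
#include <opencv2/imgproc/imgproc.hpp>
#include <algorithm>

double textureDifference(const cv::Mat& insideHist,        // histogram inside the outline
                         const cv::Mat& intermediateHist,  // inside the rectangle, outside the outline
                         const cv::Mat& wallModelHist)     // model histogram of wall images
{
  const double corrInside =
      cv::compareHist(insideHist, wallModelHist, CV_COMP_CORREL);
  const double corrIntermediate =
      cv::compareHist(intermediateHist, wallModelHist, CV_COMP_CORREL);

  // For a valid hole the surround should correlate with the wall model more
  // strongly than the interior does.
  return std::max(0.0, corrIntermediate - corrInside);
}
```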

  • Texture backprojection

The texture of a valid hole's points beyond its outline is expected to be wooden, while that of the points inside it is not. In order to ascertain the validity of candidate holes, the backprojection of the input RGB image against the model histogram of wall images is obtained first. Then, the backprojection values of the points inside the candidate hole's outline and of the points outside it but inside its bounding box are summed and compared. If the latter sum is greater than the former, the validity probability of the candidate hole is assigned their difference.
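
A sketch of this measure with OpenCV's calcBackProject follows; treating the wall model as a hue-saturation histogram and normalizing the difference of the sums into [0, 1] are assumptions of this sketch.

```cpp
#include <opencv2/imgproc/imgproc.hpp>

double textureBackprojection(const cv::Mat& rgb,             // CV_8UC3 input image
                             const cv::Mat& wallModelHist,   // 2D hue-saturation model histogram
                             const cv::Mat& insideMask,
                             const cv::Mat& intermediateMask)
{
  cv::Mat hsv;
  cv::cvtColor(rgb, hsv, CV_BGR2HSV);

  // Backproject the wall model histogram over the hue and saturation channels.
  int channels[] = {0, 1};
  float hueRange[] = {0, 180};
  float satRange[] = {0, 256};
  const float* ranges[] = {hueRange, satRange};
  cv::Mat backprojection;
  cv::calcBackProject(&hsv, 1, channels, wallModelHist, backprojection, ranges);

  // Accumulate the backprojection values inside and around the hole.
  cv::Mat insideValues, intermediateValues;
  backprojection.copyTo(insideValues, insideMask);
  backprojection.copyTo(intermediateValues, intermediateMask);

  const double sumInside = cv::sum(insideValues)[0];
  const double sumIntermediate = cv::sum(intermediateValues)[0];

  // The surround of a valid hole should accumulate more wall-like responses
  // than its interior; the normalization below is an assumption of the sketch.
  return sumIntermediate > sumInside
      ? (sumIntermediate - sumInside) / (sumIntermediate + sumInside + 1e-9)
      : 0.0;
}
```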


Essentially, the sifting process produces a 2D array of probabilities: each row holds the validity probabilities per candidate hole, for one particular filter.

|      | H0 | H1 | ... | Hm-1 |
|------|----|----|-----|------|
| F0   |    |    |     |      |
| F1   |    |    |     |      |
| ...  |    |    |     |      |
| Fn-1 |    |    |     |      |

The 'x' in 'Fx' represents the order of execution of filter F.

Filters' resources

Above, you might have observed that there are resources that the filters need to use, like the intermediate points, or the histogram of the image of points inside holes' outlines. Because the relation between resources and filters is not one-to-one, in order to save time, resources are centrally created once per processing cycle, rather than being generated by each filter and used exclusively by it.

Hole Validation

Now that the validity probabilities of all candidate holes have been obtained, the only thing left is to find out which ones are actually valid. Two distinct validation processes have been implemented, plus a third that combines them to produce a stricter validator.

Validation via thresholding

The simplest, most reasonable and most effective validation process is to check each validity probability of each candidate hole against a per-filter threshold. If all the probabilities referring to a candidate hole are found to exceed their respective threshold, the candidate hole is deemed valid. The overall probability of the, now valid, hole is set to the mean of the per-filter probabilities.

As stated above, each validity probability is on a scale of its own. So, the thresholds are set via appropriate experimental procedures, covering all foreseen conditions that are likely to distort the validity outcome, like trembling of the depth sensor, variations in lighting etc.
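
A minimal sketch of this check over the probabilities array described earlier (rows are filters, columns are candidate holes; names are illustrative):

```cpp
#include <vector>

// probabilities[f][h]: validity probability of hole h according to filter f.
// thresholds[f]: the per-filter threshold.
// Returns the overall (mean) probability of hole h, or -1.0 if it is rejected.
double validateByThresholding(const std::vector<std::vector<double> >& probabilities,
                              const std::vector<double>& thresholds,
                              size_t h)
{
  double sum = 0.0;
  for (size_t f = 0; f < probabilities.size(); ++f)
  {
    if (probabilities[f][h] < thresholds[f])
    {
      return -1.0;  // one filter below its threshold: the candidate hole is rejected
    }
    sum += probabilities[f][h];
  }
  return sum / probabilities.size();
}
```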

Validation via weighting

In order to produce a meaningful overall validity probability, each validity probability obtained from each filter can be weighted. A reasonable way to justify weighting is that each filter has a different credibility, due to imperfections in the detection of holes' outlines, the quality of the RGB image, the denoising of the depth image etc. Also, for example, empirically, the depth difference filter is more credible than, say, the texture difference filter, as it gives a more concrete answer about the validity of a candidate hole.

Let's consider that N filters are enabled. A candidate hole's overall probability can be extracted by the formula:

p(h) = Σ_{i=1}^{N} w_i · p_i(h)

where p(h) is the overall validity probability of a candidate hole, and p_i(h) the validity probability of a candidate hole referring to filter i.

So, each weight is

w_i = 2^{e_i} / (2^N − 1)

where e_i is the exponent assigned to filter i.

So the question is how to assign exponents to filters. This is the interesting bit.

First, a dataset of validity probabilities mapped to actual responses about the validity of each candidate hole is constructed. That means that each row of the dataset features N probabilities and a class value of 1 or 0, depending on whether that particular candidate hole is valid or not, respectively. The arrangement of the filters' order is set beforehand to a default one. The task is to find out which arrangement is best suited, in order for the above formula to decisively point to the validity of each candidate hole and separate valid holes from invalid ones. The value of the exponent for each filter in the arrangement is the position of the filter in the arrangement, minus one (a position of 0 means that the filter is not enabled).

Once the dataset is constructed, it should look something like this:

| f0  | f1  | ... | fN-1   | C   |
|-----|-----|-----|--------|-----|
| p00 | p01 | ... | p0,N-1 | C0  |
| p10 | p11 | ... | p1,N-1 | C1  |
| ... | ... | ... | ...    | ... |
| pm0 | pm1 | ... | pm,N-1 | Cm  |

where the 'x' in 'fx' is the exponent assigned to filter f. For example, under this default arrangement, the overall validity probability of hole '0' is

p(h_0) = (2^0 · p00 + 2^1 · p01 + … + 2^{N−1} · p0,N-1) / (2^N − 1)

Hence, assigning exponents to filters amounts to finding the optimal filtering order, that is, the optimal arrangement. For N active filters there are N! permutations, and therefore N! arrangements of filters. The optimal permutation is the one with the least mean squared error, for a given dataset.

The permutation with the least mean squared error over the given dataset will direct the order of the filters to their optimal arrangement. The final stage is to apply a threshold to the overall probabilities, which must be set by hand.
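
A brute-force sketch of this search follows: it tries every permutation of the exponents 0..N−1 and keeps the one whose weighted probabilities have the least mean squared error against the class labels. The variable names and dataset layout are assumptions that follow the table above; with at most nine filters, the N! permutations remain tractable.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

std::vector<int> bestExponentAssignment(
    const std::vector<std::vector<double> >& probs,  // probs[row][filter], as in the dataset table
    const std::vector<int>& labels)                  // class values: 1 = valid, 0 = invalid
{
  const size_t n = probs[0].size();

  std::vector<int> exponents(n), best(n);
  for (size_t i = 0; i < n; ++i) exponents[i] = static_cast<int>(i);

  const double denom = std::pow(2.0, static_cast<double>(n)) - 1.0;
  double bestMse = 1e9;

  do
  {
    // Mean squared error of the weighted overall probabilities for this assignment.
    double mse = 0.0;
    for (size_t row = 0; row < probs.size(); ++row)
    {
      double p = 0.0;
      for (size_t f = 0; f < n; ++f)
      {
        p += std::pow(2.0, exponents[f]) * probs[row][f] / denom;
      }
      mse += std::pow(p - labels[row], 2.0);
    }
    mse /= probs.size();

    if (mse < bestMse)
    {
      bestMse = mse;
      best = exponents;
    }
  } while (std::next_permutation(exponents.begin(), exponents.end()));

  return best;  // best[f] is the exponent assigned to filter f
}
```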

Validation via thresholded weighting

The third and final validation process involves both of the above validation processes. A candidate hole is considered valid if and only if all of its validity probabilities exceed their respective per-filter thresholds, and its overall validity probability, given by the formula in the weighted validation process, exceeds an overall validity threshold.

Making valid holes unique

Although, at this point, the valid holes have been found, candidate holes are detected from separate sources (the Depth and Rgb nodes), so there may exist valid holes that refer to the same physical hole in space. Therefore, among the valid holes that refer to the same physical hole, the one with the greatest overall validity probability is chosen. After that, information about the unique valid holes is published to the alert handler, for higher level processing.
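
A rough sketch of this uniqueness step, where "referring to the same physical hole" is approximated by one hole's keypoint lying inside another's outline (an assumption of the sketch, not the package's actual criterion):

```cpp
#include <opencv2/imgproc/imgproc.hpp>
#include <vector>

struct ValidHole
{
  cv::Point2f keypoint;
  std::vector<cv::Point2f> outline;
  double probability;  // overall validity probability
};

std::vector<ValidHole> makeUnique(const std::vector<ValidHole>& holes)
{
  std::vector<ValidHole> unique;
  for (size_t i = 0; i < holes.size(); ++i)
  {
    bool dominated = false;
    for (size_t j = 0; j < holes.size(); ++j)
    {
      if (i == j) continue;

      // Two valid holes are taken to refer to the same physical hole when
      // one hole's keypoint lies inside the other's outline.
      const bool samePhysicalHole =
          cv::pointPolygonTest(holes[j].outline, holes[i].keypoint, false) >= 0;

      if (samePhysicalHole && holes[j].probability > holes[i].probability)
      {
        dominated = true;  // another hole covers the same spot with a higher probability
        break;
      }
    }
    if (!dominated)
    {
      unique.push_back(holes[i]);
    }
  }
  return unique;
}
```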

Interfaces

Input

Under

hole_fusion_node_topics.yaml:subscribed_topics
  • depth_candidate_holes_topic is the name of the topic to which the Hole Fusion node is subscribed, and where the Depth node publishes information about the candidate holes it has found. It must comply with
depth_node_topics.yaml:published_topics/candidate_holes_topic
  • rgb_candidate_holes_topic is the name of the topic to which the Hole Fusion node is subscribed, and where the Rgb node publishes information about the candidate holes it has found. It must comply with
rgb_node_topics:published_topics/candidate_holes_topic
  • point_cloud_internal_topic is the name of the topic to which the Hole Fusion node is subscribed, and where the Synchronizer node directly publishes its input point cloud. It must comply with
synchronizer_node_topics:published_topics/point_cloud_internal_topic

Output

Under

hole_fusion_node_topics:published_topics
  • synchronizer_unlock_topic is the name of the topic via which the Hole Fusion node directs the Synchronizer node to obtain a new point cloud from the depth sensor. It must comply with
synchronizer_node_topics::subscribed_topics/unlock_topic
  • make_synchronizer_subscribe_to_input is the name of the topic where the Hole Fusion node publishes messages to the Synchronizer node so that it subscribes to its input topic, where point clouds are published. Used for transitioning from an OFF state to an ON state. It must comply with
synchronizer_node_topics::subscribed_topics/subscribe_to_input
  • make_sunchronizer_leave_subscription_to_input is the name of the topic where the Hole Fusion node publishes messages to the Synchronizer node so that it can unsubscribe from the topic where the depth sensor publishes point clouds. This happens when the Hole Detector transitions from an ON state to an OFF state, so that it does not consume any processing resources. It must comply with
synchronizer_node_topics::subscribed_topics/leave_subscription_to_input
  • hole_detector_output_topic is the name of the topic where the Hole Fusion node publishes information about the valid holes found to the alert handler.

  • enhanced_holes_topic is the name of the topic where the Hole Fusion node publishes additional information about the valid holes found to the Victim node.

  • debug_valid_holes_image is the name of the topic where the Hole Fusion node publishes images of valid holes found, on top of the input RGB image. Used as such:

rosrun image_view image_view image:=$NAMESPACE/hole_detector/debug_valid_holes_image _image_transport:=compressed

$NAMESPACE=pandora_vision for standalone and $NAMESPACE=vision for robot mode.

  • debug_respective_holes_image is the name of the topic where the Hole Fusion node publishes images of candidate holes found, on top of the input Depth and RGB images. Used as such:
rosrun image_view image_view image:=$NAMESPACE/hole_detector/debug_respective_holes_image _image_transport:=compressed

$NAMESPACE=pandora_vision for standalone and $NAMESPACE=vision for robot mode.
