---
title: "Lecture: Edge Detection"
categories: lecture
---

# Differentiation - The Canny edge detector

+ $\nabla I = [ I_x, I_y ]$ is the gradient vector.
+ It has a length and a direction at each pixel in the image.
+ The squared length is $||\nabla I(x,y)||^2= \nabla I^T\nabla I$.
+ Select points which satisfy two criteria:
    + Local maximum of the gradient magnitude *along the direction of the gradient*.
    + Gradient magnitude larger than a chosen threshold $\tau$.
    + Sometimes we use a soft and a hard threshold (hysteresis), where
      points between the two thresholds are selected only if they
      are adjacent to other selected points.
+ We can calculate $\nabla I$ with either the Sobel operator or the derivative
  of a Gaussian, as sketched below.
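
As a rough illustration (the function name, the SciPy-based derivative-of-Gaussian
filters and the parameter `sigma` are my own choices, not taken from the lecture),
the gradient, its length and its direction could be computed like this:

```python
import numpy as np
from scipy import ndimage

def gradient_magnitude_direction(image, sigma=1.0):
    """Derivative-of-Gaussian gradient: length and direction at each pixel."""
    # I_x: differentiate along the x (column) axis, smoothing with a Gaussian
    ix = ndimage.gaussian_filter(image, sigma, order=(0, 1))
    # I_y: differentiate along the y (row) axis
    iy = ndimage.gaussian_filter(image, sigma, order=(1, 0))
    magnitude = np.sqrt(ix**2 + iy**2)   # length of the gradient per pixel
    direction = np.arctan2(iy, ix)       # direction of the gradient per pixel
    return magnitude, direction
```

Non-maximum suppression along `direction` and the two thresholds then give the
edge map; in OpenCV, `cv2.Canny(img, low, high)` bundles these steps, with
`low` and `high` playing the role of the soft and hard thresholds.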

# Connected Components

1.  Start with a singleton set (single pixel).
2.  Use a mask, typically a $3\times3$ patch, and centre it at
    each pixel already selected.
3.  Pixels that lie inside the mask *and* are set in the original
    (binary edge) image are added to the set.
4.  Iterate until no pixels are added.
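
A minimal sketch of this growing procedure, assuming a boolean edge image
`edges` and a seed pixel `seed` (both names are illustrative, not from the
lecture):

```python
import numpy as np

def connected_component(edges, seed):
    """Grow a connected component from a seed pixel in a binary edge image."""
    h, w = edges.shape
    selected = np.zeros_like(edges, dtype=bool)
    selected[seed] = True
    stack = [seed]
    while stack:                              # iterate until no pixels are added
        r, c = stack.pop()
        # 3x3 mask centred at the current pixel
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < h and 0 <= cc < w:
                    # add pixels inside the mask *and* set in the original image
                    if edges[rr, cc] and not selected[rr, cc]:
                        selected[rr, cc] = True
                        stack.append((rr, cc))
    return selected
```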

# Line Fitting

1.  Consider each connected component by itself;
    each one is a set of pixels with $(x,y)$ co-ordinates.
2.  Calculate the centre, that is the mean $(\bar x,\bar y)$.
2.  Calculate pixel positions relative to the centre, i.e.
     $(\tilde x_i,\tilde y_i)$ where $\tilde x_i=x_i-\bar x$ and
     $\tilde y_i=y_i-\bar y$.
3.  Consider the matrix
    $$D=
      \begin{bmatrix}
         \sum_i \tilde x_i^2 & \sum_i \tilde x_i\tilde y_i \\
         \sum_i \tilde x_i\tilde y_i & \sum_i \tilde y_i^2
      \end{bmatrix}
    $$
4.  Suppose as an example that the $\tilde y_i$ are (approximately)
    zero and that there are many large $\tilde x_i$.
    + the matrix has one zero eigenvalue (with eigenvector in the $y$
      direction) and one large eigenvalue with eigenvector in the
      direction of $x$.
    + the points form an edge in the $x$ direction
5.  If the edge is rotated, both $\tilde x_i$ and $\tilde y_i$ are
    non-zero, but the eigenvalues and eigenvectors behave in the same way:
    + One eigenvalue is (approximately) zero.
    + The eigenvector of the large eigenvalue points in the direction of the edge.
6.  If the points do not lie on a perfectly straight line, this is not
    exact and the smaller eigenvalue is also non-zero.
7.  The line is described by $x\cos\theta + y\sin\theta=\rho$, where
    $\rho$ is the distance between the line and the origin.
    See the drawing in the [Hough Tutorial](https://docs.opencv.org/3.4/d9/db0/tutorial_hough_lines.html)
    and the sketch after this list.
    + Note the two points where the line crosses the axes,
      $$\big(0,\frac{\rho}{\sin\theta}\big) \quad\text{and}\quad \big(\frac{\rho}{\cos\theta},0\big)$$
    + Writing $y=\alpha x + \beta$,  we obviously have
      $$\beta=\frac{\rho}{\sin\theta}$$
    + We can solve for $\alpha$ to get
      $$\alpha=-\frac{\cos\theta}{\sin\theta}$$
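
A small sketch of the whole fitting step under the assumptions above (the
function name `fit_line` and the use of `numpy.linalg.eigh` are my own
choices): it forms $D$, takes the eigenvector of the smallest eigenvalue as
the line normal, and converts to the $(\rho,\theta)$ form.

```python
import numpy as np

def fit_line(xs, ys):
    """Fit a line to one connected component via the eigenvectors of D.

    Returns (rho, theta) for the normal form x*cos(theta) + y*sin(theta) = rho.
    """
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    xbar, ybar = xs.mean(), ys.mean()          # centre of the component
    xt, yt = xs - xbar, ys - ybar              # positions relative to the centre
    # The matrix D from the lecture
    D = np.array([[np.sum(xt**2), np.sum(xt*yt)],
                  [np.sum(xt*yt), np.sum(yt**2)]])
    eigvals, eigvecs = np.linalg.eigh(D)       # eigenvalues in ascending order
    normal = eigvecs[:, 0]                     # eigenvector of the smallest eigenvalue
    # normal = (cos(theta), sin(theta)); the line passes through the centre,
    # so rho is the projection of the centre onto the normal
    theta = np.arctan2(normal[1], normal[0])
    rho = xbar * np.cos(theta) + ybar * np.sin(theta)
    if rho < 0:                                # keep rho non-negative by flipping the normal
        rho, theta = -rho, theta + np.pi
    return rho, theta
```

The eigenvector of the larger eigenvalue points along the edge; the one with
the smaller eigenvalue is the line's normal, which is what makes the
conversion to $(\rho,\theta)$ immediate.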

# Hough