---
title: Corner Detection
categories: session
---


**Reading** Ma 4.3 and 4.A

**Warning** The textbook starts Chapter 4 by discussing *tracking*,
which means that motion as a function of time is considered as well
as the image as a function of spatial co-ordinates.
This is a lot of concepts and quantities to process at the same time.
We will instead start by discussing features in a still image.
When we have a good idea of what features are and how they behave,
we shall introduce motion.

**Briefing** [Corner Lecture]()

# Exercises

## Learning Objectives

1.  What makes a feature in visual terms?
2.  What makes a feature in mathematical terms?
3.  How do we differentiate a sampled signal?
4.  How does the Harris corner detector work?

## Setup

We will use both OpenCV and SciPy today.
If you have not installed them already, please do so now:

```
pip install opencv-python
pip install scipy
```

We will also be working on an image, e.g. the
[valve image from wikimedia commons](https://upload.wikimedia.org/wikipedia/commons/f/f0/Valve_original_%281%29.PNG).
Feel free to use your own images, and to test different images.
Load the image and convert it to grayscale with:

```python
import cv2 as cv

# Load the image; replace "path" with the path to your image.
# cv.imread returns the image with BGR channel order.
img = cv.imread("path", 1)
# Convert the BGR image to grayscale
img_gray = cv.cvtColor(img, cv.COLOR_BGR2GRAY)
```

## Exercise 1

Learning goal: 1D derivatives and 1D convolutions

**Part 1**<br>
Extract one row from the grayscale image and visualize it (e.g. with `matplotlib.pyplot`).<br>
Do the values correspond to what you would expect from the row?
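For reference, a minimal sketch of one way to do this (the row index `200` and the use of `matplotlib` are arbitrary choices; `img_gray` comes from the setup above):

```python
import matplotlib.pyplot as plt

# Pick one row of grey values (index 200 is arbitrary)
row = img_gray[200, :].astype(float)

plt.plot(row)
plt.xlabel("column")
plt.ylabel("intensity")
plt.show()
```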

**Part 2**<br>
Convolve the row with a $[1/2,-1/2]$ kernel, using numpy, and visualize.<br>
Do the values make sense? (A worked sketch follows the hints below.)

<details>
  <summary>Hint 1 (Click to expand)</summary>
  </br>
  Create a new 1D array with `np.zeros(<shape>)` and iterate over the row and the kernel.
  Remember that the resulting array should be smaller than the original.
  </details>
&nbsp;
<details>
  <summary>Hint 2 (Click to expand)</summary>
  </br>
  If you are not able to get a result using numpy, use scipy.signal (can also be used to compare your result):
  ```python
  from scipy import signal
  row_d = signal.convolve(row, kernel)
  # or, equivalently, cross-correlate with the flipped kernel:
  # row_d = signal.correlate(row, kernel[::-1])
  ```
</details>
&nbsp;
<details>
  <summary>Hint 3 (Click to expand)</summary>
  </br>
  If you use cross-correlation instead of convolution, flip the kernel.
</details>
&nbsp;
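If you want something to check your own loop against, here is one possible implementation of the approach described in Hint 1 (a sketch only; it assumes the `row` array from Part 1):

```python
import numpy as np
import matplotlib.pyplot as plt

kernel = np.array([0.5, -0.5])

# "Valid" convolution: the result is shorter than the input row
row_d = np.zeros(len(row) - len(kernel) + 1)
for i in range(len(row_d)):
    for j in range(len(kernel)):
        # Convolution flips the kernel before the sliding inner product
        row_d[i] += kernel[len(kernel) - 1 - j] * row[i + j]

plt.plot(row_d)
plt.show()
```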

**(Optional, if you have time) Part 3**<br>
Repeat part 2 with an image column (instead of row) and the transpose of the kernel.

## Exercise 2

Learning goals: Blur filters

**Part 1**<br>

$$1/16 \begin{bmatrix}
        1 & 2 & 1 \\
        2 & 4 & 2 \\
        1 & 2 & 1 \\
      \end{bmatrix}
$$

This works as an approximation of a $3\times 3$ Gaussian blur filter.<br>
Using `scipy.signal.convolve2d`, `scipy.signal.correlate2d` or `cv.filter2D`, apply the filter
to the entire grayscale image and either show the image or write it to file with `cv.imwrite`.

How does the filter affect the image?
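A minimal sketch using `scipy.signal.convolve2d` (the output filename is arbitrary; `img_gray` and `cv` come from the setup above):

```python
import numpy as np
from scipy import signal

# The 3x3 Gaussian approximation from above
g = np.array([[1, 2, 1],
              [2, 4, 2],
              [1, 2, 1]], dtype=float) / 16

# The kernel is symmetric, so convolution and correlation give the same result
img_blur = signal.convolve2d(img_gray.astype(float), g, mode="same")

cv.imwrite("blur.png", img_blur.astype(np.uint8))
```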

**Part 2**<br>
Compare the above result with the result from `cv.GaussianBlur(img_gray, (3, 3), -1)`.
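One way to compare the two results numerically (a sketch; it assumes `img_blur` from the sketch in Part 1):

```python
blur_cv = cv.GaussianBlur(img_gray, (3, 3), -1)

# Per-pixel absolute difference between the two blurred images
diff = cv.absdiff(blur_cv, img_blur.astype("uint8"))
print(diff.max(), diff.mean())
```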

**(Optional) Part 3**<br>
Using the function from part 2, test out different kernel sizes and compare the results.

## Exercise 3

Learning goals: 2D Derivatives

**Part 1**<br>
Apply the Sobel operator
$$G_x = 1/8 \begin{bmatrix}
        1 & 0 & -1 \\
        2 & 0 & -2 \\
        1 & 0 & -1 \\
      \end{bmatrix}
$$

as you did in exercise 2.1.
This should give you the derivative $I_x$ of the image $I$ with
respect to $x$.

+ How does this relate to the 1D derivative you computed in exercise 1.2?
+ What are the minimum and maximum values of the $I_x$ matrix?
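A possible way to apply $G_x$ and inspect the value range, reusing `scipy.signal.convolve2d` as in exercise 2.1 (a sketch; `img_gray` comes from the setup above):

```python
import numpy as np
from scipy import signal

Gx = np.array([[1, 0, -1],
               [2, 0, -2],
               [1, 0, -1]], dtype=float) / 8

# Horizontal derivative of the grayscale image
Ix = signal.convolve2d(img_gray.astype(float), Gx, mode="same")
print(Ix.min(), Ix.max())  # the derivative can be negative
```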

**Part 2**<br>
Show $I_x$ as an image.
You probably have negative pixel values, so you may have to
scale the image.

+ Try taking the absolute values of the luminance values.
+ Try scaling the luminances into the $0\ldots255$ range,
  e.g. by adding $255$ and dividing by two.
+ What do the different visualisations tell you?
+ You may scale further to use the full $0\ldots255$ range and thus
  increase contrast.
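Two of the visualisations above, sketched under the assumption that `Ix` is the array from Part 1 (filenames are arbitrary):

```python
import numpy as np
import cv2 as cv

# Absolute value, stretched to the full 0...255 range
abs_vis = np.abs(Ix) / np.abs(Ix).max() * 255
cv.imwrite("Ix_abs.png", abs_vis.astype(np.uint8))

# Shift and halve, so that zero derivative maps to mid-grey
mid_vis = (Ix + 255) / 2
cv.imwrite("Ix_mid.png", mid_vis.astype(np.uint8))
```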

**Part 3**<br>
Repeat Parts 1 and 2 with the vertical derivative, i.e. use $G_y$ instead
of $G_x$.

$$G_y = 1/8 \begin{bmatrix}
        1 & 2 & 1 \\
        0 & 0 & 0 \\
        -1 & -2 & -1 \\
      \end{bmatrix}
$$

+ Compare the images.  What differences can you make out?

**Part 4**<br>
Compute and visualize the gradient magnitude of the image, using the results from Parts 1 and 3.
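A minimal sketch, assuming `Ix` and `Iy` are the derivative images from Parts 1 and 3 (the filename is arbitrary):

```python
import numpy as np
import cv2 as cv

# Per-pixel gradient magnitude
mag = np.sqrt(Ix**2 + Iy**2)

# Stretch to 0...255 for display
cv.imwrite("gradient_magnitude.png", (mag / mag.max() * 255).astype(np.uint8))
```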

**(Optional, only if you have time) Part 5**<br>
Using the same method as in exercises 1.2 and 1.3, compute the gradient of all
rows and columns, compute the magnitude (as in Part 4), and compare with the
magnitude from Part 4.

## Exercise 4

Learning goals: Introduction to the Harris corner detector

**Part 1**<br>

Consider the grayscale image we have been working with, and the gradient magnitude from 3.4.<br>
Where do you expect to find corners?

Apply OpenCV's built-in Harris detector,<br>
e.g. `cv.cornerHarris(img_gray, block_size, kernel_size, k)` with `block_size = 2`, `kernel_size = 5` and `k = 0.06`.<br>
Here, `block_size` is the size of the neighbourhood considered for corner detection, `kernel_size` is the size of the Sobel derivative kernel, and `k` is the Harris free parameter.

Make a copy of the original image (with colors) and draw circles around any corners found by the Harris detector.<br>
Example code for drawing circles is added below.

<details>
  <summary>Code</summary>
  </br>
  ```python
  block_size, kernel_size, k = 2, 5, 0.06
  cx = cv.cornerHarris(img_gray, block_size, kernel_size, k)

  T = 0.1 * cx.max()      # threshold relative to the strongest response
  c_image = img.copy()    # draw on a copy of the original colour image

  for i in range(c_image.shape[0]):
      for j in range(c_image.shape[1]):
          if cx[i, j] > T:
              cv.circle(c_image, (j, i), 2, (0, 0, 255), 2)
  ```
</details>
&nbsp;

Save or visualize the result. How does it compare with your expectations?

**Part 2**<br>
Adjust the threshold `T` used when drawing circles. What does this do?

**(Optional) Part 3**<br>
Adjust `kernel_size` (which must be positive and odd), `block_size` and/or `k`, and observe how they change the result.
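One way to explore the parameters is a small sweep; the values below are arbitrary examples, and the relative threshold is the one from the Part 1 code:

```python
for kernel_size in (3, 5, 7):        # Sobel aperture: positive and odd
    for k in (0.04, 0.06, 0.1):      # Harris free parameter
        cx = cv.cornerHarris(img_gray, 2, kernel_size, k)
        n_corners = (cx > 0.1 * cx.max()).sum()
        print(f"kernel_size={kernel_size}, k={k}: {n_corners} pixels above threshold")
```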
