Weird edge detection behavior when the blurred image border pixel extrapolation method changes #1035

@Rektino

Description

Expected behaviour

The Canny edge detector should yield very similar results for the two blurred images, since the border pixel extrapolation method only affects pixels near the image border.
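
One way to sanity-check this expectation is to compare the interiors of the two blurred images directly. A minimal sketch, reusing the same image path and GaussianBlur calls as the reproduction code below:

import cv2 as cv
import numpy as np

img = cv.imread('../Photos/park.jpg')
blur1 = cv.GaussianBlur(img, (7, 7), cv.BORDER_REPLICATE)
blur2 = cv.GaussianBlur(img, (7, 7), cv.BORDER_DEFAULT)

# With a 7x7 kernel, pixels more than 3 rows/columns away from the border
# never read extrapolated values, so if only the border handling differs
# the interiors of the two results should be identical.
print('interiors identical:', np.array_equal(blur1[3:-3, 3:-3], blur2[3:-3, 3:-3]))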

Actual behaviour

Edge detection is clearly affected by the selected method: with cv.BORDER_REPLICATE I get many more edges than with cv.BORDER_DEFAULT. This does not make sense.
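
The difference can be quantified by counting edge pixels in each result; a minimal sketch along the same lines as the reproduction code below:

import cv2 as cv

img = cv.imread('../Photos/park.jpg')
blur1 = cv.GaussianBlur(img, (7, 7), cv.BORDER_REPLICATE)
blur2 = cv.GaussianBlur(img, (7, 7), cv.BORDER_DEFAULT)

# Count edge pixels produced by Canny with the same thresholds on each image
n1 = cv.countNonZero(cv.Canny(blur1, 150, 210))
n2 = cv.countNonZero(cv.Canny(blur2, 150, 210))
print('edge pixels with cv.BORDER_REPLICATE as 3rd argument:', n1)
print('edge pixels with cv.BORDER_DEFAULT as 3rd argument:', n2)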

Steps to reproduce

Code running on Windows 11
OpenCV version: 4.10.0
Python version: 3.11.4

Code:

import cv2 as cv

img = cv.imread('../Photos/park.jpg')

# Gaussian blur, comparing cv.BORDER_REPLICATE vs cv.BORDER_DEFAULT
blur1 = cv.GaussianBlur(img, (7, 7), cv.BORDER_REPLICATE)
blur2 = cv.GaussianBlur(img, (7, 7), cv.BORDER_DEFAULT)
cv.imshow('Original', img)
cv.imshow('Blurred 1', blur1)
cv.imshow('Blurred 2', blur2)

# Canny edge detector with the same thresholds on both blurred images
canny1 = cv.Canny(blur1, 150, 210)
cv.imshow('Canny (1) edges w/ 150-210', canny1)

canny2 = cv.Canny(blur2, 150, 210)
cv.imshow('Canny (2) edges w/ 150-210', canny2)

cv.waitKey(0)
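
For reference, a variant of the same comparison where the border mode is passed explicitly through GaussianBlur's borderType keyword parameter (a minimal sketch, with sigmaX set to 0 so it is derived from the 7x7 kernel size):

import cv2 as cv

img = cv.imread('../Photos/park.jpg')

# Same blur and thresholds, but the border mode is given via borderType
blur_rep = cv.GaussianBlur(img, (7, 7), 0, borderType=cv.BORDER_REPLICATE)
blur_def = cv.GaussianBlur(img, (7, 7), 0, borderType=cv.BORDER_DEFAULT)

canny_rep = cv.Canny(blur_rep, 150, 210)
canny_def = cv.Canny(blur_def, 150, 210)

cv.imshow('Canny, BORDER_REPLICATE via borderType', canny_rep)
cv.imshow('Canny, BORDER_DEFAULT via borderType', canny_def)
cv.waitKey(0)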
