
      (6) OpenCV-Python Learning — Edge Detection 2

         The previous section extracted edges with first-order differences (derivatives). Edges can also be extracted with second-order differences, for example the Laplacian operator, Laplacian of Gaussian (LoG) edge detection, Difference of Gaussians (DoG) edge detection, and Marr-Hildreth edge detection. These edge-extraction algorithms are described in detail below:

       

      1. The Laplacian operator

        The Laplacian operator uses second-order derivatives: it takes the second derivative in the x direction and the second derivative in the y direction and sums them, ∇²f = ∂²f/∂x² + ∂²f/∂y².

         The corresponding discrete Laplacian kernel (the 3*3 kernel OpenCV uses when ksize = 1) is:

             0   1   0
             1  -4   1
             0   1   0

         Its derivation from second-order central differences is sketched below:
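         On a unit pixel grid the second derivatives can be approximated by central differences (a standard finite-difference sketch):

             ∂²f/∂x² ≈ f(x+1, y) + f(x-1, y) - 2f(x, y)
             ∂²f/∂y² ≈ f(x, y+1) + f(x, y-1) - 2f(x, y)

             ∇²f ≈ f(x+1, y) + f(x-1, y) + f(x, y+1) + f(x, y-1) - 4f(x, y)

         Writing the last expression as weights over the 3*3 neighbourhood gives exactly the kernel above.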

        OpenCV provides the Laplacian() function for this computation; its parameters are as follows:

      dst = cv2.Laplacian(src, ddepth, ksize, scale, delta, borderType)
          src: input image (single- or multi-channel matrix)
          ddepth: depth of the output image; best set to cv2.CV_32F or cv2.CV_64F so that negative responses are not truncated
          ksize: aperture size of the Laplacian kernel, default 1, which uses the 3*3 kernel above
          scale: optional scale factor applied to the result
          delta: optional offset added to the result
          borderType: border-padding method

         Example code and the corresponding results:

      #coding:utf-8
      
      import cv2
      img_path= r"C:\Users\silence_cho\Desktop\Messi.jpg"
      img = cv2.imread(img_path)
      img_gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
      
      dst_img = cv2.Laplacian(img, cv2.CV_32F)
      laplacian_edge = cv2.convertScaleAbs(dst_img)  # take the absolute value and saturate to uint8
      
      dst_img_gray = cv2.Laplacian(img_gray, cv2.CV_32F)
      laplacian_edge_gray = cv2.convertScaleAbs(dst_img_gray)  # take the absolute value and saturate to uint8
      
      cv2.imshow("img", img)
      cv2.imshow("laplacian_edge", laplacian_edge)
      cv2.imshow("img_gray", img_gray)
      cv2.imshow("laplacian_edge_gray ", laplacian_edge_gray)
      cv2.waitKey(0)
      cv2.destroyAllWindows()
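
        A minimal sketch (reusing the same image path as above; not part of the original code) of why ddepth is best set to cv2.CV_32F or cv2.CV_64F: with an 8-bit output depth, the negative half of the Laplacian response is clipped to zero before convertScaleAbs can recover its magnitude.

      #coding:utf-8
      import cv2
      
      img_gray = cv2.imread(r"C:\Users\silence_cho\Desktop\Messi.jpg", 0)
      
      lap_u8 = cv2.Laplacian(img_gray, cv2.CV_8U)                          # negative responses already lost
      lap_f32 = cv2.convertScaleAbs(cv2.Laplacian(img_gray, cv2.CV_32F))   # |response|, saturated to uint8
      
      cv2.imshow("clipped_CV_8U", lap_u8)
      cv2.imshow("abs_CV_32F", lap_f32)
      cv2.waitKey(0)
      cv2.destroyAllWindows()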

       

        After edge extraction with the Laplacian operator, different post-processing methods can be applied; the code and corresponding results are as follows:

      #coding:utf-8
      import cv2
      import numpy as np
      
      img_path= r"C:\Users\silence_cho\Desktop\Messi.jpg"
      img = cv2.imread(img_path)
      img_gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
      dst_img_gray = cv2.Laplacian(img_gray, cv2.CV_32F)
      
      # Post-processing method 1: convertScaleAbs takes the absolute value and saturates to uint8
      laplacian_edge = cv2.convertScaleAbs(dst_img_gray)
      # roughly equivalent to:
      # laplacian_edge = np.clip(np.abs(dst_img_gray), 0, 255).astype(np.uint8)
      
      # Post-processing method 2: clip the raw response to [0, 255] (negative responses are discarded)
      laplacian_edge2 = np.copy(dst_img_gray)
      # laplacian_edge2[laplacian_edge2 > 0] = 255   # alternative: binarize every positive response
      laplacian_edge2[laplacian_edge2 > 255] = 255
      laplacian_edge2[laplacian_edge2 <= 0] = 0
      laplacian_edge2 = laplacian_edge2.astype(np.uint8)
      
      
      # Post-processing method 3: Gaussian-smooth the Laplacian response, then take the absolute value
      gaussian_img_gray = cv2.GaussianBlur(dst_img_gray, (3, 3), 1)
      laplacian_edge3 = cv2.convertScaleAbs(gaussian_img_gray)  # absolute value, saturated to uint8
      
      cv2.imshow("img_gray", img_gray)
      cv2.imshow("laplacian_edge", laplacian_edge)
      cv2.imshow("laplacian_edge2", laplacian_edge2)
      cv2.imshow("laplacian_edge3", laplacian_edge3)
      cv2.waitKey(0)
      cv2.destroyAllWindows()

       

       

      2. Laplacian of Gaussian (LoG) edge detection

        The Laplacian operator does no smoothing, so it responds strongly to noise. For this reason the image is usually smoothed with a Gaussian filter first and then processed with the Laplacian operator, but that requires two convolutions. Laplacian of Gaussian (LoG) edge detection combines the two operations into a single kernel, so only one convolution is needed.
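        This works because differentiation commutes with convolution: smoothing with a Gaussian G and then applying the Laplacian is the same as convolving once with the Laplacian of the Gaussian,

            ∇²(G * f) = (∇²G) * f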

      The Laplacian of Gaussian (LoG) is the Laplacian of the two-dimensional Gaussian function, derived as follows:
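        A sketch of that derivation, starting from the 2-D Gaussian (the constant 1/(2πσ⁴) is the factor omitted in the code below):

            G(x, y) = 1/(2πσ²) · exp(-(x² + y²)/(2σ²))

            ∂²G/∂x² = (x²/σ⁴ - 1/σ²) · G(x, y)
            ∂²G/∂y² = (y²/σ⁴ - 1/σ²) · G(x, y)

            LoG(x, y) = ∇²G = 1/(2πσ⁴) · ((x² + y²)/σ² - 2) · exp(-(x² + y²)/(2σ²))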

        Below is an example of a 3*3 LoG kernel with standard deviation 1:
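        (These are the unnormalized values returned by the createLoGKernel function below for sigma=1 and size=(3, 3), i.e. without the 1/(2πσ⁴) factor.)

             0.0000   -0.6065    0.0000
            -0.6065   -2.0000   -0.6065
             0.0000   -0.6065    0.0000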

        Implementing LoG in Python; the code and corresponding results are as follows:

      #coding:utf-8
      import numpy as np
      from scipy import signal
      import cv2
      
      
      def createLoGKernel(sigma, size):
          H, W = size
          r, c = np.mgrid[0:H:1.0, 0:W:1.0]
          r -= (H-1)/2
          c -= (W-1)/2
          sigma2 = np.power(sigma, 2.0)
          norm2 = np.power(r, 2.0) + np.power(c, 2.0)
          LoGKernel = (norm2/sigma2 -2)*np.exp(-norm2/(2*sigma2))  # the constant factor 1/(2πσ⁴) is omitted
      
          print(LoGKernel)
          return LoGKernel
      
      def LoG(image, sigma, size, _boundary='symm'):
          LoGKernel = createLoGKernel(sigma, size)
          edge = signal.convolve2d(image, LoGKernel, 'same', boundary=_boundary)
          return edge
      
      
      if __name__ == "__main__":
          img_path= r"C:\Users\silence_cho\Desktop\Messi.jpg"
          img = cv2.imread(img_path, 0)
          LoG_edge = LoG(img, 1, (11, 11))
          LoG_edge[LoG_edge>255] = 255
          # LoG_edge[LoG_edge>255] = 0
          LoG_edge[LoG_edge<0] = 0
          LoG_edge = LoG_edge.astype(np.uint8)
      
          LoG_edge1 = LoG(img, 1, (37, 37))
          LoG_edge1[LoG_edge1 > 255] = 255
          LoG_edge1[LoG_edge1 < 0] = 0
          LoG_edge1 = LoG_edge1.astype(np.uint8)
      
          LoG_edge2 = LoG(img, 2, (11, 11))
          LoG_edge2[LoG_edge2 > 255] = 255
          LoG_edge2[LoG_edge2 < 0] = 0
          LoG_edge2 = LoG_edge2.astype(np.uint8)
      
          cv2.imshow("img", img)
          cv2.imshow("LoG_edge", LoG_edge)
          cv2.imshow("LoG_edge1", LoG_edge1)
          cv2.imshow("LoG_edge2", LoG_edge2)
          cv2.waitKey(0)
          cv2.destroyAllWindows()

       

      3. Difference of Gaussians (DoG) edge detection

        The Difference of Gaussians (DoG) is an approximation of the Laplacian of Gaussian (LoG); the relationship between the two is derived as follows:
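        A sketch of the standard derivation (consistent with the normalization doG /= σ²(k-1) used in the code below): the Gaussian satisfies the diffusion equation, and its σ-derivative can be approximated by a finite difference between the scales σ and kσ:

            ∂G/∂σ = σ · ∇²G

            ∂G/∂σ ≈ (G(x, y, kσ) - G(x, y, σ)) / (kσ - σ)

            ⇒ ∇²G ≈ (G(x, y, kσ) - G(x, y, σ)) / ((k - 1)σ²)

        so the difference of the two Gaussians, divided by (k - 1)σ², approximates the LoG.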

        The DoG edge-detection algorithm proceeds in the following steps:

      • Build two Gaussian kernels of window size HxW (H and W are usually odd and equal) with standard deviations σ and kσ; their difference forms the DoG kernel

      • Convolve the image with the two Gaussian kernels and take the difference of the two results

      • Post-process the edge response

         Python implementation of the DoG edge-extraction algorithm; the code and results are as follows:

      #coding:utf-8
      
      import cv2
      import numpy as np
      from scipy import signal
      
      # Split the 2-D Gaussian kernel into horizontal and vertical 1-D kernels and convolve with each in turn
      def gaussConv(image, size, sigma):
          H, W = size
          # horizontal 1-D Gaussian convolution first
          xr, xc = np.mgrid[0:1, 0:W]
          xc = xc.astype(np.float32)
          xc -= (W-1.0)/2.0
          xk = np.exp(-np.power(xc, 2.0)/(2*sigma*sigma))
          image_xk = signal.convolve2d(image, xk, 'same', 'symm')
      
          # then vertical 1-D Gaussian convolution
          yr, yc = np.mgrid[0:H, 0:1]
          yr = yr.astype(np.float32)
          yr -= (H-1.0)/2.0
          yk = np.exp(-np.power(yr, 2.0)/(2*sigma*sigma))
          image_yk = signal.convolve2d(image_xk, yk, 'same','symm')
          image_conv = image_yk/(2*np.pi*np.power(sigma, 2.0))
      
          return image_conv
      
      # Convolve directly with the full 2-D Gaussian kernel
      def gaussConv2(image, size, sigma):
          H, W = size
          r, c = np.mgrid[0:H:1.0, 0:W:1.0]
          c -= (W - 1.0) / 2.0
          r -= (H - 1.0) / 2.0
          sigma2 = np.power(sigma, 2.0)
          norm2 = np.power(r, 2.0) + np.power(c, 2.0)
          gaussKernel = (1 / (2*np.pi*sigma2)) * np.exp(-norm2 / (2 * sigma2))
          image_conv = signal.convolve2d(image, gaussKernel, 'same','symm')
      
          return image_conv
      
      def DoG(image, size, sigma, k=1.1):
          Is = gaussConv(image, size, sigma)
          Isk = gaussConv(image, size, sigma*k)
      
          # Is = gaussConv2(image, size, sigma)
          # Isk = gaussConv2(image, size, sigma * k)
      
          doG = Isk - Is
          doG /= (np.power(sigma, 2.0)*(k-1))
          return doG
      
      if __name__ == "__main__":
          img_path= r"C:\Users\silence_cho\Desktop\Messi.jpg"
          img = cv2.imread(img_path, 0)
          sigma = 1
          k = 1.1
          size = (7, 7)
          DoG_edge = DoG(img, size, sigma, k)
          DoG_edge[DoG_edge>255] = 255
          DoG_edge[DoG_edge<0] = 0
          DoG_edge = DoG_edge / np.max(DoG_edge)
          DoG_edge = DoG_edge * 255
          DoG_edge = DoG_edge.astype(np.uint8)
      
          cv2.imshow("img", img)
          cv2.imshow("DoG_edge", DoG_edge)
          cv2.waitKey(0)
          cv2.destroyAllWindows()

       

       

       

      4. Marr-Hildreth edge detection

        LoG and DoG edge detection only apply a simple threshold to the edge response. Marr-Hildreth refines the edges further, making them more precise and thinner, much as Canny refines the Sobel response.

      Marr-Hildreth edge detection can be broken into three steps:

      • Build an H*W Laplacian of Gaussian (LoG) kernel or Difference of Gaussians (DoG) kernel

      • Convolve the image matrix with the LoG or DoG kernel

      • In the result of step two, find the zero-crossing positions; these are the edge positions

        Step three can be understood as follows: the LoG or DoG convolution approximates the second derivative, a zero of the second derivative corresponds to an extremum of the first derivative, and an extremum of the first derivative marks the place of fastest change. In edge extraction, therefore, a zero crossing of the second derivative marks the pixel where the intensity change is sharpest, i.e. the most likely edge location.
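
        A tiny 1-D numeric sketch of this (an illustrative example, not part of the original post): for a smoothed step edge, the first difference peaks and the second difference crosses zero at the same position.

      import numpy as np
      
      f = np.array([0, 0, 1, 3, 5, 6, 6], dtype=float)  # a smoothed step edge
      d1 = np.diff(f)        # [0. 1. 2. 2. 1. 0.]   first derivative peaks at the edge centre
      d2 = np.diff(f, n=2)   # [ 1.  1.  0. -1. -1.]  second derivative crosses zero at the same place
      print(d1, d2)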

        For a continuous function g(x), if g(x1)*g(x2) < 0, i.e. g(x1) and g(x2) have opposite signs, then there must be some x between x1 and x2 with g(x) = 0, and that x is a zero crossing of g(x). In an image, Marr-Hildreth checks the following four pairs of opposite neighbours around each pixel (left/right, above/below, and the two diagonals) and marks the pixel as an edge if any pair has opposite signs:

       

         Python implementation of the Marr-Hildreth edge-detection algorithm; the code and results are shown below:

      #coding:utf-8
      
      import cv2
      import numpy as np
      from scipy import signal
      
      # Split the 2-D Gaussian kernel into horizontal and vertical 1-D kernels and convolve with each in turn
      def gaussConv(image, size, sigma):
          H, W = size
          # horizontal 1-D Gaussian convolution first
          xr, xc = np.mgrid[0:1, 0:W]
          xc = xc.astype(np.float32)
          xc -= (W-1.0)/2.0
          xk = np.exp(-np.power(xc, 2.0)/(2*sigma*sigma))
          image_xk = signal.convolve2d(image, xk, 'same', 'symm')
      
          # then vertical 1-D Gaussian convolution
          yr, yc = np.mgrid[0:H, 0:1]
          yr = yr.astype(np.float32)
          yr -= (H-1.0)/2.0
          yk = np.exp(-np.power(yr, 2.0)/(2*sigma*sigma))
          image_yk = signal.convolve2d(image_xk, yk, 'same','symm')
          image_conv = image_yk/(2*np.pi*np.power(sigma, 2.0))
      
          return image_conv
      
      def DoG(image, size, sigma, k=1.1):
          Is = gaussConv(image, size, sigma)
          Isk = gaussConv(image, size, sigma*k)
          doG = Isk - Is
          doG /= (np.power(sigma, 2.0)*(k-1))
          return doG
      
      def zero_cross_default(doG):
          zero_cross = np.zeros(doG.shape, np.uint8)
          rows, cols = doG.shape
          for r in range(1, rows-1):
              for c in range(1, cols-1):
                  if doG[r][c-1]*doG[r][c+1] < 0:        # sign change between left and right neighbours
                      zero_cross[r][c] = 255
                      continue
                  if doG[r-1][c] * doG[r+1][c] < 0:      # sign change between upper and lower neighbours
                      zero_cross[r][c] = 255
                      continue
                  if doG[r-1][c-1] * doG[r+1][c+1] < 0:  # sign change along the main diagonal
                      zero_cross[r][c] = 255
                      continue
                  if doG[r-1][c+1] * doG[r+1][c-1] < 0:  # sign change along the anti-diagonal
                      zero_cross[r][c] = 255
                      continue
          return zero_cross
      
      def Marr_Hildreth(image, size, sigma, k=1.1):
          doG = DoG(image, size, sigma, k)
          zero_cross = zero_cross_default(doG)
      
          return zero_cross
      
      if __name__ == "__main__":
          img_path= r"C:\Users\silence_cho\Desktop\Messi.jpg"
          img = cv2.imread(img_path, 0)
          k = 1.1
          marri_edge = Marr_Hildreth(img, (11, 11), 1, k)
          marri_edge2 = Marr_Hildreth(img, (11, 11), 2, k)
          marri_edge3 = Marr_Hildreth(img, (7, 7), 1, k)
      
          cv2.imshow("img", img)
          cv2.imshow("marri_edge", marri_edge)
          cv2.imshow("marri_edge2", marri_edge2)
          cv2.imshow("marri_edge3", marri_edge3)
          cv2.waitKey(0)
          cv2.destroyAllWindows()

      posted @ 2020-09-07 23:27  silence_cho