Fix Python – Scraping: SSL: CERTIFICATE_VERIFY_FAILED error for


Asked By – Catherine4j

I’m practicing the code from ‘Web Scraping with Python’, and I keep having this certificate problem:

from urllib.request import urlopen
from bs4 import BeautifulSoup
import re

pages = set()
def getLinks(pageUrl):
    global pages
    html = urlopen("" + pageUrl)  # base URL was omitted in the original post
    bsObj = BeautifulSoup(html, "html.parser")  # name the parser explicitly
    for link in bsObj.findAll("a", href=re.compile("^(/wiki/)")):
        if 'href' in link.attrs:
            if link.attrs['href'] not in pages:
                # We have encountered a new page
                newPage = link.attrs['href']
                pages.add(newPage)
                getLinks(newPage)

The error is:

  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/urllib/", line 1319, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1049)>

Btw, I was also practicing Scrapy, but kept getting the error: command not found: scrapy (I tried all sorts of solutions online but none of them worked… really frustrating).
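On the side question: "command not found: scrapy" usually means the scripts directory of the Python installation that holds Scrapy is not on your shell's PATH. One workaround sketch, assuming `python3` is the interpreter you want to use, is to invoke everything through `python3 -m` so the shell never has to find the `scrapy` script itself:

```shell
# Install into, and run from, the same interpreter, bypassing PATH lookup.
# Assumption: "python3" is the interpreter you intend to use for Scrapy.
python3 -m pip install scrapy
python3 -m scrapy version   # runs Scrapy as a module; no PATH entry needed
```

If `python3 -m scrapy version` works while plain `scrapy` does not, the fix is to add that interpreter's scripts directory to PATH (or keep using the `-m` form).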

Now let's look at the solution for the issue: Scraping: SSL: CERTIFICATE_VERIFY_FAILED error for


I stumbled over this issue once myself. If you're using macOS, go to Macintosh HD > Applications > the Python 3.6 folder (or whichever version of Python you're using) and double-click the "Install Certificates.command" file. 😀
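As an alternative not covered by the original answer: `urlopen` accepts an explicit `SSLContext`, so you can point the verification step at a CA bundle yourself instead of relying on the interpreter's default store. The sketch below uses the standard-library `ssl` module; the `certifi` reference in the comments is an assumption about your environment (it is a third-party package you would have to install separately):

```python
import ssl
from urllib.request import urlopen

# Build a context that verifies server certificates (the default policy).
ctx = ssl.create_default_context()

# If the interpreter has no usable CA store (the symptom in the traceback
# above), the third-party "certifi" package ships a portable CA bundle.
# Assumption: certifi is installed (pip install certifi). Uncomment to use:
#   import certifi
#   ctx = ssl.create_default_context(cafile=certifi.where())

# Then pass the context to urlopen explicitly, e.g.:
#   html = urlopen(pageUrl, context=ctx)
```

This keeps certificate verification enabled; disabling verification (e.g. with an unverified context) silences the error but defeats the purpose of TLS, so it is best avoided for anything beyond a quick local experiment.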

This question is answered By – Jey Miranda

This answer is collected from Stack Overflow, reviewed by FixPython community admins, and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.