I am putting this here mostly as a reference, since this error has appeared a couple of times when using the urllib library.
For example, there is the classic case of
from urllib.request import urlopen
from bs4 import BeautifulSoup
html = urlopen('http://en.wikipedia.org/wiki/Kevin_Bacon')
bs = BeautifulSoup(html, 'html.parser')
for link in bs.find_all('a'):
    if 'href' in link.attrs:
        print(link.attrs['href'])
And if you run it for the first time in your Jupyter kernel, it will return
URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)>
The easiest way to fix it is to add two extra lines to the code:
from urllib.request import urlopen
import ssl
from bs4 import BeautifulSoup
ssl._create_default_https_context = ssl._create_unverified_context
html = urlopen('http://en.wikipedia.org/wiki/Kevin_Bacon')
bs = BeautifulSoup(html, 'html.parser')
for link in bs.find_all('a'):
    if 'href' in link.attrs:
        print(link.attrs['href'])
The first extra line imports the ssl library, and the second replaces the default HTTPS context with an unverified one, so certificate verification is skipped. Keep in mind that this disables verification for every HTTPS request in the process, so it is best reserved for quick experiments rather than production code.
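If you would rather not change the default for the whole process, a lighter-touch variant (a sketch using only the standard library) is to pass an unverified context to the single urlopen call:

```python
import ssl
from urllib.request import urlopen

# Build an unverified SSL context for this one request only,
# leaving the process-wide default context untouched.
context = ssl._create_unverified_context()

# This context skips hostname checks and certificate verification:
# context.check_hostname is False, context.verify_mode is ssl.CERT_NONE.
# html = urlopen('http://en.wikipedia.org/wiki/Kevin_Bacon', context=context)
```

The proper long-term fix is to make the missing root certificates available to Python, for example by running the "Install Certificates.command" script shipped with the python.org installer on macOS, or by pointing ssl.create_default_context(cafile=...) at a CA bundle such as the one provided by the third-party certifi package.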
The interesting part is that once these lines have been executed in the kernel, the fix keeps working even if you comment them out and re-execute the cell.
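That happens because the assignment monkey-patches an attribute on the ssl module, which is shared process-wide and stays imported for the lifetime of the kernel, so the change only goes away when the kernel restarts. A minimal sketch of that behaviour, including how to undo it without restarting:

```python
import ssl

# Keep a reference to the verified default so it can be restored later.
original = ssl._create_default_https_context

# Monkey-patch: every subsequent urlopen() in this process now builds
# an unverified context, no matter which cell triggered it.
ssl._create_default_https_context = ssl._create_unverified_context
assert ssl._create_default_https_context is ssl._create_unverified_context

# Restore certificate verification explicitly
# (restarting the kernel has the same effect).
ssl._create_default_https_context = original
```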
That would be all.
Cheers!