I'm trying to download every image from various websites, but I've run into a problem. When I use Selenium to open one of the sites, the browser is redirected to a page with the headline "Checking if the site connection is secure", and no matter how many times I reload, Selenium never gets past it. I'm a beginner and I don't know how to work around this, and because my English is poor I haven't been able to find a solution on Google. I also tried loading a Firefox profile, even though I'm not sure what it does (a sketch of that attempt is after my code below), but that didn't help either. Please help! My code:
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.firefox.service import Service

options = Options()
options.binary_location = r"C:\Program Files\Mozilla Firefox\firefox.exe"
options.add_argument("-headless")  # run Firefox without a window (replaces the deprecated options.headless)
driver = webdriver.Firefox(service=Service(r"D:\Program\geckodriver\geckodriver.exe"), options=options)
driver.implicitly_wait(10)
driver.get(url)  # url is the page I want to scrape images from
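For completeness, this is roughly what my Firefox-profile attempt looked like. It is only a sketch: the profile path is a placeholder (Firefox shows each profile's "Root Directory" under about:profiles), and it still did not get me past the check.

from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.firefox.service import Service

options = Options()
# Placeholder path: point this at your own Firefox profile directory
profile = webdriver.FirefoxProfile(r"C:\Users\me\AppData\Roaming\Mozilla\Firefox\Profiles\xxxxxxxx.default-release")
options.profile = profile
driver = webdriver.Firefox(service=Service(r"D:\Program\geckodriver\geckodriver.exe"), options=options)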
What works for me in Selenium 4.11 is:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument('--ignore-certificate-errors')
options.add_argument('--ignore-ssl-errors=yes')
chrome_browser = webdriver.Chrome(options=options)
chrome_browser.get('https://demo.seleniumeasy.com/basic-first-form-demo.html')
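Since the question uses Firefox, the closest equivalent I know of is the accept_insecure_certs capability; a minimal sketch (assuming a default local Firefox/geckodriver setup) would be:

from selenium import webdriver
from selenium.webdriver.firefox.options import Options

options = Options()
options.accept_insecure_certs = True  # Firefox counterpart of Chrome's certificate/SSL flags
driver = webdriver.Firefox(options=options)

Note that this only relaxes certificate checking; whether it gets you past the "Checking if the site connection is secure" page depends on the site.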