I am looking to implement method chaining of Selenium WebDriverWaits.
To start with, this block of code implementing a single WebDriverWait works perfectly:
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
options = webdriver.ChromeOptions()
options.add_argument("start-maximized")
options.add_argument('disable-infobars')
driver = webdriver.Chrome(chrome_options=options, executable_path=r'C:\Utility\BrowserDrivers\chromedriver.exe')
driver.get('https://www.facebook.com')
element = WebDriverWait(driver, 5).until(lambda x: x.find_element_by_xpath("//input[@id='email']"))
element.send_keys("method_chaining")
As per my current requirement, I have to chain two WebDriverWait instances: the idea is to feed the element returned by the first WebDriverWait as an input to the (chained) second WebDriverWait.
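For clarity, a rough (untested) sketch of the intent without Pipe would look something like this, where the element returned by the first wait becomes the search context for the second wait (the locators here are only illustrative):

table = WebDriverWait(driver, 15).until(
    lambda d: d.find_element_by_css_selector("table[role='presentation']"))
# the element returned by the first wait is reused inside the second wait
email = WebDriverWait(driver, 15).until(
    lambda d: table.find_element_by_xpath(".//input[@id='email']"))
email.send_keys("method_chaining")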
To achieve this, I followed the discussion Method chaining in python and tried to use Python's lambda functions for method chaining with Pipe - A Python library to use infix notation in Python.
Here is my code trial:
from pipe import *
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
options = webdriver.ChromeOptions()
options.add_argument("start-maximized")
options.add_argument('disable-infobars')
driver = webdriver.Chrome(chrome_options=options, executable_path=r'C:\WebDrivers\chromedriver.exe')
driver.get('https://www.facebook.com')
element = WebDriverWait(driver,15).until((lambda driver: driver.find_element_by_xpath("//input[@id='email']"))
| where(lambda driver: driver.find_element_by_css_selector("table[role='presentation']")))
element.send_keys("method_chaining")
I am seeing this error:
DevTools listening on ws://127.0.0.1:52456/devtools/browser/e09c1d5e-35e3-4c00-80eb-cb642fa273ad
Traceback (most recent call last):
File "C:\Users\Soma Bhattacharjee\Desktop\Debanjan\PyPrograms\python_pipe_example.py", line 24, in <module>
| where(lambda driver: driver.find_elements(By.CSS_SELECTOR,"table[role='presentation']")))
File "C:\Python\lib\site-packages\pipe.py", line 58, in __ror__
return self.function(other)
File "C:\Python\lib\site-packages\pipe.py", line 61, in <lambda>
return Pipe(lambda x: self.function(x, *args, **kwargs))
File "C:\Python\lib\site-packages\pipe.py", line 271, in where
return (x for x in iterable if (predicate(x)))
TypeError: 'function' object is not iterable
I have also followed a few related discussions, but I still have no clue what I am missing.
Can someone guide me where I am going wrong?
Edit:
I don't know if it meets your requirement, but we need to create a custom construct:
@Pipe
def where(i, p):
    # custom 'where' pipe that handles both lists and single WebElements
    if isinstance(i, list):
        return [x for x in i if p(x)]
    else:
        return i if p(i.find_element_by_xpath('//ancestor::*')) else None

element = WebDriverWait(driver, 5).until(lambda x: x.find_element_by_xpath("//input[@id='email']") \
    | where(lambda y: y.find_element_by_css_selector("table[role='presentation']")))
element.send_keys("method_chaining")
Old answer:
I'm not an expert, but I think you may have misunderstood how pipe works: by default it processes the previous value, for example:
original | filter = result
[a,b,c] | select(b) = b
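Concretely, pipe's built-in where expects an iterable on its left-hand side and a predicate; a minimal sketch:

from pipe import where

# where filters an iterable with a predicate; piping a bare function into it
# is exactly what raises "TypeError: 'function' object is not iterable"
evens = list([1, 2, 3, 4, 5] | where(lambda x: x % 2 == 0))
print(evens)  # [2, 4]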
What you want is maybe the and operator:
WebDriverWait(driver, 15).until(
    lambda driver: driver.find_element(By.CSS_SELECTOR, "table[role='presentation']")
    and driver.find_element(By.XPATH, "//input[@id='email']"))
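If you are on Selenium 4, expected_conditions also has combinators such as all_of that express the same idea; a sketch assuming Selenium 4, which I have not tested against your page:

from selenium.webdriver.support import expected_conditions as EC

# all_of resolves only once every wrapped condition has succeeded (Selenium 4+)
WebDriverWait(driver, 15).until(EC.all_of(
    EC.presence_of_element_located((By.CSS_SELECTOR, "table[role='presentation']")),
    EC.element_to_be_clickable((By.XPATH, "//input[@id='email']"))))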
Or Selenium's ActionChains, but it has no wait method, so we need to extend it:
from selenium.webdriver import ActionChains

class Actions(ActionChains):
    def wait(self, second, condition):
        # ActionChains keeps the driver as self._driver
        element = WebDriverWait(self._driver, second).until(condition)
        self.move_to_element(element)
        return self

Actions(driver) \
    .wait(15, EC.presence_of_element_located((By.CSS_SELECTOR, "table[role='presentation']"))) \
    .wait(15, EC.element_to_be_clickable((By.XPATH, "//input[@id='email']"))) \
    .click() \
    .send_keys('method_chaining') \
    .perform()