Multiple webscraping activities with the same driver fail after the first webscraping result

The issue comes from this topic.

With Selenium I have to scrape data from three different webpages to build three different dataframes to be shown.

The first dataframe works, but the webscraping of the second webpage fails to find the elements on the page and raises a timeout exception (the timeout is 10s).

File "/home/adminuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/", line 534, in _run_script
    exec(code, module.__dict__)
File "/mount/src/corners-betting/", line 102, in <module>
    team_corners = single_team(code, team)
File "/mount/src/corners-betting/", line 94, in single_team
    team_corners_table = pd.merge(corners_for(), corners_against(), left_index=True, right_index=True, suffixes=('', '_y'))
File "/mount/src/corners-betting/", line 68, in corners_for
    corners_for_team_table = WebDriverWait(driver, 10).until(EC.presence_of_element_located((By.ID, 'matchlogs_for')))
File "/home/adminuser/venv/lib/python3.9/site-packages/selenium/webdriver/support/", line 101, in until
    raise TimeoutException(message, screen, stacktrace)
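For context, the `TimeoutException` here just means the locator never returned an element within the 10-second window: `WebDriverWait.until` is essentially a polling loop that re-evaluates the condition until it is truthy or time runs out. A minimal pure-Python sketch of that loop (no browser required; the `condition` callable stands in for `EC.presence_of_element_located(...)` bound to a driver):

```python
import time


class TimeoutException(Exception):
    """Stands in for selenium.common.exceptions.TimeoutException in this sketch."""


def wait_until(condition, timeout=10, poll=0.5):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.

    This mirrors what WebDriverWait(driver, 10).until(...) does internally:
    the condition is re-evaluated every `poll` seconds, and a falsy result
    simply triggers another poll until the deadline passes.
    """
    end = time.monotonic() + timeout
    while True:
        value = condition()
        if value:
            return value
        if time.monotonic() > end:
            raise TimeoutException(f"condition still falsy after {timeout}s")
        time.sleep(poll)
```

So a timeout on the second page means the `matchlogs_for` element genuinely never became present in the DOM the driver was looking at, not that the wait mechanism itself broke.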


  1. the scraping code is a custom webscraping module built on top of Selenium that works in a local runtime
  2. the webpage to be scraped is “Atalanta Match Logs (Pass Types), Serie A |”, where “922493f3” is the code parameter and “Atalanta” the team parameter of the function corners_for()

I can’t understand why the driver loads the page but then can’t find the web elements. I also tried the following:

  1. scraping a totally different website, with the same result
  2. inverting the order of the dataframes (and so of the scraping processes)
  3. closing the driver and recreating a new driver variable before each driver call
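Attempt 3 above (a fresh driver per scrape) is the usual way to rule out stale state carried over from the first page. A hedged sketch of how that isolation might be structured; `make_driver`, `scrape_page`, and `extract` are hypothetical names standing in for however scraper.py constructs its driver and extracts the table, and the fake driver below only exists so the pattern can run without a browser:

```python
from contextlib import contextmanager


@contextmanager
def fresh_driver(make_driver):
    """Yield a brand-new driver and guarantee quit() runs, even on errors.

    Reusing one driver across several driver.get() calls is normally fine,
    but if leftover state from the first page is suspected, this gives each
    scrape its own short-lived driver and always cleans it up.
    """
    driver = make_driver()
    try:
        yield driver
    finally:
        driver.quit()


def scrape_page(make_driver, url, extract):
    """Scrape one page with its own isolated driver instance."""
    with fresh_driver(make_driver) as driver:
        driver.get(url)
        return extract(driver)
```

If the second page still times out even with this kind of isolation, the state being carried over is unlikely to be in the driver object itself.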
