
"No tables found" suddenly #55

Closed
Panaslonic opened this issue Jul 2, 2021 · 10 comments

Comments

@Panaslonic

Today, code like

a = si.get_quote_table('AAPL')

stopped working. While debugging, it seemed that it suddenly needed html5lib, so I installed it.
But when I then try to run this code, I get:

Traceback (most recent call last):

File "C:/Users/Shake/PycharmProjects/RoboAd/Collector.py", line 1811, in
a = si.get_quote_table ( 'AAPL' )
File "C:\Users\Shake.conda\envs\RobAd\lib\site-packages\yahoo_fin\stock_info.py", line 293, in get_quote_table
tables = pd.read_html(site)
File "C:\Users\Shake.conda\envs\RobAd\lib\site-packages\pandas\io\html.py", line 1100, in read_html
displayed_only=displayed_only,
File "C:\Users\Shake.conda\envs\RobAd\lib\site-packages\pandas\io\html.py", line 915, in _parse
raise retained
File "C:\Users\Shake.conda\envs\RobAd\lib\site-packages\pandas\io\html.py", line 895, in _parse
tables = p.parse_tables()
File "C:\Users\Shake.conda\envs\RobAd\lib\site-packages\pandas\io\html.py", line 213, in parse_tables
tables = self._parse_tables(self._build_doc(), self.match, self.attrs)
File "C:\Users\Shake.conda\envs\RobAd\lib\site-packages\pandas\io\html.py", line 545, in _parse_tables
raise ValueError("No tables found")
ValueError: No tables found

@Vorotori

Vorotori commented Jul 2, 2021

Same here. It seems to fail at this step:
tables = pd.read_html(site)

Not sure whether Yahoo changed the page layout or whether it was an update to one of the involved modules.

@hanseopark

Same issue; I get:
urllib.error.HTTPError: HTTP Error 404: Not Found

@atreadw1492
Owner

I'm looking into it. It seems to be a change on Yahoo Finance's side. In the meantime, you should be able to pull most of the data that si.get_quote_table provided by using si.get_quote_data instead.
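To illustrate the suggested workaround: si.get_quote_data returns a flat dict of Yahoo's quote JSON fields, which you can reshape into the label/value mapping that get_quote_table produced. This is only a sketch; the field names and sample numbers below are illustrative assumptions, and in real use the dict would come from si.get_quote_data('AAPL').

```python
# Sketch of the workaround: reshape the dict that si.get_quote_data
# returns into the {label: value} mapping get_quote_table produced.
# In real use: quote = si.get_quote_data("AAPL")
# The field names and numbers below are illustrative assumptions.
quote = {
    "regularMarketPrice": 139.96,
    "regularMarketPreviousClose": 137.27,
    "regularMarketVolume": 78_945_800,
}

table = {
    "Quote Price": quote["regularMarketPrice"],
    "Previous Close": quote["regularMarketPreviousClose"],
    "Volume": quote["regularMarketVolume"],
}
print(table["Quote Price"])  # → 139.96
```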

@Caesurus
Contributor

Caesurus commented Jul 2, 2021

Ran into the same problem. It looks like Yahoo is blocking requests based on the user agent.

You can change the code to:
tables = pd.read_html(requests.get(site, headers={'User-agent': 'Mozilla/5.0'}).text)
Just remember to include import requests at the top of the file.

I looked at the latest version in master, and it is not the same code that I have installed via pip; otherwise I would have created a pull request.

It would be great to be able to specify the user agent in the get_options_chain() call itself. If you'd like me to open a PR against the existing code, I'd be happy to when I find time.
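For readers who prefer not to add a requests dependency, the same header trick works with the standard library alone. This is a sketch, not yahoo_fin's actual code; fetch_html is a hypothetical helper name.

```python
from urllib.request import Request, urlopen

def fetch_html(url, user_agent="Mozilla/5.0"):
    """Fetch a page while presenting a browser-like User-agent header.

    Yahoo rejects requests that carry Python's default user agent,
    which is what makes pd.read_html(site) fail with "No tables found".
    """
    req = Request(url, headers={"User-agent": user_agent})
    with urlopen(req) as resp:  # network call
        return resp.read().decode("utf-8", errors="replace")

# The failing line then becomes: tables = pd.read_html(fetch_html(site))
```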

@atreadw1492
Owner

@Caesurus Thanks a lot! I merged the request. I'll push an update to the PyPI version in the next few days.

@Caesurus
Contributor

Caesurus commented Jul 2, 2021

Absolutely a pleasure. Thanks for maintaining the module; it's been great to use, so I appreciate it.

@Caesurus
Contributor

Caesurus commented Jul 2, 2021

I was just looking through the code, and you may need something similar in get_expiration_dates(), since that also appears to do a .get().

@ywluohao

ywluohao commented Jul 2, 2021

@Caesurus @atreadw1492 Thank you both. Could you please also update the function si.get_quote_table?

@atreadw1492
Owner

atreadw1492 commented Jul 3, 2021

@ywluohao Yes, I just pushed changes for the other affected functions (including si.get_quote_table) to the master branch. I will push the updated version to PyPI after doing additional testing.

@atreadw1492
Owner

atreadw1492 commented Jul 3, 2021

Updates have now been pushed to PyPI; the latest version is 0.8.9. Thanks to everyone for reporting and helping with the issue!
