Notes on future development w/ cfbd API #7
This may be a good API: https://api.collegefootballdata.com/api/docs/?url=/api-docs.json#/. One would have to request a key here, though: https://collegefootballdata.com/key.
I believe this API has a GitHub link: https://github.com/CFBD/cfb-api. They also have more scripts in their org: https://github.com/CFBD. Python specifics of this API are here: https://github.com/CFBD/cfbd-python. This may be a good starting point for coding up the v2 of this package.
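Before committing to the client library, a quick smoke test of the raw REST API is cheap. A minimal sketch, assuming the /rankings endpoint and Bearer-token auth described in the docs linked above (both worth verifying against the current API docs):

```python
import requests

# Assumption: the cfbd API authenticates via an "Authorization: Bearer <key>"
# header and exposes a /rankings endpoint, per the docs linked above.
API_KEY = "your-cfbd-key"  # requested at https://collegefootballdata.com/key

resp = requests.get(
    "https://api.collegefootballdata.com/rankings",
    params={"year": 2018, "week": 6, "seasonType": "regular"},
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
resp.raise_for_status()

# Assumption: the payload is a list of poll-week objects; inspect the real
# response before relying on these keys.
for poll_week in resp.json():
    print(poll_week["season"], "week", poll_week["week"])
```

For reference, the existing URL-generation code for ESPN's AP rankings pages, which a v2 would replace: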
```python
def apweeklyurlgenerator(date_list):
    """Generate a URL for a specific week of AP Rankings. Preseason = week 1.

    date_list = result of dateprocessing() in list format: (week, year)
    """
    week = str(date_list[0])
    year = str(date_list[1])
    staticespn = (
        r"http://www.espn.com/college-football/rankings/_/week/6/year/2018/seasontype/2"
    )
    currentespnap = r"http://www.espn.com/college-football/rankings"
    # defaultlink = currentespnap
    oldurl1 = r"http://www.espn.com/college-football/rankings/_/week/"
    # Should be the default URL:
    aponlylinkespn2 = r"http://www.espn.com/college-football/rankings/_/poll/1/week/"
    defaultlink = aponlylinkespn2
    # finallist = ["final", "f", "complete", "total", "last"]
    # currentlist = ["current", "present", "default"]
    # # Format the year correctly
    # year = str(year)
    # if len(year) != 4:
    #     if len(year) == 2 and (year[0] == "1" or year[0] == "0"):
    #         # Assume the entry was an abbreviation of a year. Add the 20__ before it.
    #         year = "20" + str(year)
    #
    # # Week formatting
    # # Preseason?
    # week = str(week)
    # if week.lower() in prelist:
    #     week = "1"
    # # If the week entered is higher than 16, assume the user wants final rankings.
    # try:
    #     if int(week) > 16:
    #         week = "final"
    # except ValueError:
    #     pass
    # Generate the URL
    # Is the week entered indicating the final week?
    if week.lower() == "final":  # in finallist:
        oldfinalurlexample = "http://www.espn.com/college-football/rankings/_/week/1/year/2017/seasontype/3"
        week1 = "1/year/"
        seasontype = "/seasontype/3"
        url = defaultlink + week1 + year + seasontype
    # Check for entries wanting the most up-to-date rankings
    elif week.lower() == "current":  # in currentlist:
        # Just use the default link
        url = defaultlink
    # # Commented out b/c we want the user to get the results they want and not be
    # # confused by getting the current week when they wanted another week.
    # # This will error out to let them know that.
    # elif week is None:
    #     # Just use the default link by passing
    #     pass
    else:
        url2 = r"/year/"
        url3 = r"/seasontype/2"
        url = defaultlink + str(week) + url2 + year + url3
    print("Week", week, ",", year, "season")
    return url
# Should be the default URL: r"http://www.espn.com/college-football/rankings/_/poll/1/"
```
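A quick usage sketch of the function above; the expected return values follow from the string concatenation in the code, not from testing against ESPN:

```python
# Week 6 of the 2018 season -> .../poll/1/week/6/year/2018/seasontype/2
print(apweeklyurlgenerator([6, 2018]))

# Final rankings for 2017 -> .../poll/1/week/1/year/2017/seasontype/3
print(apweeklyurlgenerator(["final", 2017]))
```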
Looks like you tagged me in here along with my gist (possibly accidentally!), but I dug around very quickly on the ESPN API endpoint you pulled out from there for rankings and found this: http://sports.core.api.espn.com/v2/sports/football/leagues/college-football/seasons/2023/types/2/weeks/1/rankings/1?lang=en&region=us. A few notes: you can change the season, week, and ranking number in that URL to pull other weeks and polls.
Hope this helps!
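A minimal sketch of pulling that endpoint with requests. The URL is the one quoted above; the field names read from the response are assumptions to verify against the live payload:

```python
import requests

# Rankings endpoint from the comment above: season 2023, regular season
# (types/2), week 1, poll id 1 (AP, matching /poll/1 in the old ESPN URLs).
URL = (
    "http://sports.core.api.espn.com/v2/sports/football/leagues/"
    "college-football/seasons/2023/types/2/weeks/1/rankings/1"
)

resp = requests.get(URL, params={"lang": "en", "region": "us"}, timeout=10)
resp.raise_for_status()
data = resp.json()

# Assumption: the payload carries a poll name and a "ranks" list whose
# entries hold a rank and a "$ref" link to the team; adjust these keys
# after inspecting the real response.
print(data.get("name"))
for entry in data.get("ranks", []):
    print(entry.get("current"), entry.get("team", {}).get("$ref"))
```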
@akeaswaran, thank you very much for the head start on the ESPN API! It's most helpful. I am slowly working on a new script using it in an ESPN feature branch that incorporates your notes.
The espn_cfb_api repo may have some good examples of how to efficiently use the ESPN API from Python and retrieve conference information; its main script, espn_cfb_api.py, shows how to access conference data (a guess at that approach is sketched below).
The big drawback is that it only covers the current year, so there is no direct application. It will need to serve as inspiration rather than a plug-and-play source.
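Along those lines, here is a guess at pulling conference groupings straight from ESPN's core API rather than from that repo's actual code. The groups endpoint and the response shape are assumptions, extrapolated from the seasons/{year}/types/{type} pattern of the rankings endpoint quoted earlier; verify both against a live response:

```python
import requests

# Assumption: conference groupings live under the same
# seasons/{year}/types/{type} pattern as the rankings endpoint above.
BASE = "http://sports.core.api.espn.com/v2/sports/football/leagues/college-football"

resp = requests.get(f"{BASE}/seasons/2023/types/2/groups", params={"limit": 50}, timeout=10)
resp.raise_for_status()

# Assumption: the listing returns paginated "$ref" links that must be
# dereferenced one by one to get each conference's id and name.
for item in resp.json().get("items", []):
    group = requests.get(item["$ref"], timeout=10).json()
    print(group.get("id"), group.get("name"))
```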
Explore using pydantic for API calls. It can help with request and response validation, data serialization, and making API calls more predictable and type-safe. When receiving data from a website via requests.get(), Pydantic is still a valuable tool: although .json() is convenient for parsing JSON, Pydantic adds data validation and type checking on top. Here's how to use Pydantic to read and validate the response from a website:
```python
from pydantic import BaseModel

class WebsiteResponse(BaseModel):
    status: int
    content: str
```

In this example, we're defining a WebsiteResponse model with two fields: status (for the HTTP status code) and content (for the response content).
```python
import requests
from your_module import WebsiteResponse  # Import your Pydantic model

response = requests.get('https://example.com/some_endpoint')
if response.status_code == 200:
    website_data = WebsiteResponse(status=response.status_code, content=response.text)
else:
    website_data = WebsiteResponse(status=response.status_code, content='')

# Now website_data is a validated Pydantic model
print(website_data.status)   # Access the status code
print(website_data.content)  # Access the content
```

Here, we're creating a website_data instance of the WebsiteResponse model and populating it with data from the response. If the response status code is not 200, we provide an empty string for the content field. You can handle error cases according to your needs.
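The bigger payoff comes from modeling the parsed JSON payload itself rather than the raw response. A short sketch, assuming Pydantic v2 (model_validate) and a hypothetical ranking-entry shape; the field names are illustrative, not any real API's schema:

```python
from pydantic import BaseModel, ValidationError

class RankingEntry(BaseModel):
    rank: int
    team: str

# Hypothetical payload; note the second rank arrives as a string.
raw = [{"rank": 1, "team": "Georgia"}, {"rank": "2", "team": "Michigan"}]

try:
    entries = [RankingEntry.model_validate(item) for item in raw]
    print(entries[1].rank + 1)  # "2" was coerced to the int 2
except ValidationError as exc:
    # Malformed payloads fail loudly here instead of deep in later code
    print(exc)
```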
Commit 6e09e0c represents a working version that replaces BeautifulSoup HTML parsing with sourcing from the ESPN API.
This is the only place I know where I can leave a note-to-self quickly.
From ChatGPT: notes on rewriting this package with the cfbd Python API.
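A first sketch of what that rewrite might look like with the cfbd-python client (my own guess, not the ChatGPT notes themselves). The class and method names below follow the swagger-generated layout of https://github.com/CFBD/cfbd-python as I understand it; treat them as assumptions to verify:

```python
import cfbd

# Assumption: the swagger-generated client configures auth like this,
# with the key from https://collegefootballdata.com/key.
configuration = cfbd.Configuration()
configuration.api_key["Authorization"] = "your-cfbd-key"
configuration.api_key_prefix["Authorization"] = "Bearer"

# Assumption: RankingsApi.get_rankings mirrors the REST /rankings endpoint.
api = cfbd.RankingsApi(cfbd.ApiClient(configuration))
rankings = api.get_rankings(year=2018, week=6, season_type="regular")

# Assumption: each poll-week object exposes .polls, and each poll exposes
# .poll (name) and .ranks; adjust to the generated models as needed.
for poll_week in rankings:
    for poll in poll_week.polls:
        if poll.poll == "AP Top 25":
            for entry in poll.ranks:
                print(entry.rank, entry.school)
```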