## Installation

In CMD, run these commands:

```
python -m venv gxcrawlervenv
gxcrawlervenv\Scripts\activate
pip install -r requirements.txt
```

The search results are stored in an SQLite database, so the storage structure must be created first by running:

```
python gxcrawler/database.py
```
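For reference, here is a rough sketch of what a script like `database.py` typically does when it prepares an SQLite store. The table name and columns below are assumptions for illustration only; the real schema lives in `gxcrawler/database.py`:

```python
import sqlite3

# Hypothetical schema: the actual table name and columns are defined
# in gxcrawler/database.py, not here.
def create_schema(path="database.db"):
    conn = sqlite3.connect(path)
    conn.execute(
        """
        CREATE TABLE IF NOT EXISTS results (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            captured_at TEXT NOT NULL,
            payload TEXT
        )
        """
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    create_schema()
```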

## Getting Started

To use environment variables on Windows, execute these commands in CMD:

```
set GX_USER=user
set GX_PASSWORD=password
set GX_URL=url
set GX_KBNAME=kbname
```

Alternatively, create a configuration file at `.\gxcrawler\.env` containing your credentials:

```
GX_USER=user
GX_PASSWORD=password
GX_URL=url
GX_KBNAME=kbname
```
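Either way, the crawler can pick these values up from the process environment at runtime. Below is a minimal sketch of how such settings are commonly read, using the `python-dotenv` package for the `.env` case (an assumption; it may or may not be in `requirements.txt`). The helper `load_settings` is illustrative and not part of gxcrawler:

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

# Hypothetical helper, not part of gxcrawler: shows how the four GX_*
# settings can be read whether they come from CMD `set` or from .env.
def load_settings():
    # load_dotenv does not override variables already set in the shell,
    # so values exported via `set GX_USER=...` take precedence.
    load_dotenv("gxcrawler/.env")
    return {
        "user": os.environ["GX_USER"],
        "password": os.environ["GX_PASSWORD"],
        "url": os.environ["GX_URL"],
        "kbname": os.environ["GX_KBNAME"],
    }
```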

Attention: be careful not to expose these credentials in code repositories such as GitLab or GitHub; add the `.env` file to `.gitignore`.
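For example, the `.gitignore` entry would look like:

```
# keep credentials out of version control
gxcrawler/.env
```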

With Python installed, run the following script to perform the capture. The results are written to `database.db`, the SQLite database file in the project directory (a virtual environment is recommended):

```python
import datetime

from process import Process

# Capture data one day at a time, from 1 November 2020 up to today.
process = Process()
date = datetime.datetime(2020, 11, 1, 0, 0, 0, 0)
while date <= datetime.datetime.today():
    process.capture_data(date, date)
    date += datetime.timedelta(days=1)
```
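Once the loop finishes, the captured rows can be inspected directly with the standard-library `sqlite3` module. The table name `results` below is a placeholder, since the real schema is defined in `gxcrawler/database.py`:

```python
import sqlite3

# Open the database produced by the run and peek at a few rows.
# "results" is a placeholder table name; check gxcrawler/database.py
# for the actual schema.
conn = sqlite3.connect("database.db")
for row in conn.execute("SELECT * FROM results LIMIT 5"):
    print(row)
conn.close()
```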