Bittensor Smart-Scrape

License: MIT

Introduction

Bittensor Smart-Scrape: Streamlining Twitter Data Analysis on Subnet 22

Welcome to Smart-Scrape, a cutting-edge tool hosted on the Bittensor network, designed for effective and simplified analysis of Twitter data. This tool is ideal for researchers, marketers, and data analysts who seek to extract insightful information from Twitter with ease.

Key Features

  • AI-Powered Analysis: Harnesses artificial intelligence to delve into Twitter data, providing deeper insights into user interactions.
  • Real-Time Data Access: Connects directly with Twitter's database for the latest information.
  • Sentiment Analysis: Determines the emotional tone of tweets, aiding in understanding public sentiment.
  • Metadata Analysis: Dives into tweet details like timestamps and retweet counts for a comprehensive view.
  • Time-Efficient: Minimizes manual data sorting, saving valuable research time.
  • User-Friendly Design: Suitable for both beginners and experts.

Advantages

  • Decentralized Platform: Ensures reliability through its placement on the Bittensor network.
  • Customizability: Tailors data analysis to meet specific user requirements.
  • Informed Decision-Making: Facilitates data-driven strategies.
  • Versatility: Applicable for diverse research fields, from market analysis to academic studies.

Installation

Requirements: Python 3.8 or higher
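
Optionally, confirm your interpreter version and isolate the dependencies in a virtual environment before installing. A minimal sketch using standard Python tooling (the environment name .venv is only an example):

    python --version          # should report 3.8 or higher
    python -m venv .venv      # create an isolated environment
    source .venv/bin/activate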

  1. Clone the repository:
    git clone https://github.com/surcyf123/smart-scrape.git
  2. Install the requirements:
    cd smart-scrape
    python -m pip install -r requirements.txt
    python -m pip install -e .

Preparing Your Environment

Before running a miner or validator, make sure your environment is configured as described below.

Environment Variables Configuration

For setting up the necessary environment variables for your miner or validator, please refer to the Environment Variables Guide.
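
As a rough illustration only (the exact variable names and values are defined in the Environment Variables Guide, not here), exporting credentials in your shell might look like:

    # Placeholder names for illustration -- use the variables listed in the Environment Variables Guide
    export OPENAI_API_KEY="<your OpenAI API key>"
    export WANDB_API_KEY="<your Weights and Biases API key>"

You can also put these lines in your shell profile so they persist across sessions.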

Running the Miner

python neurons/miners/miner.py \
    --netuid 22 \
    --subtensor.network finney \
    --wallet.name <your miner wallet> \
    --wallet.hotkey <your miner hotkey> \
    --axon.port 14000
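
To keep the miner running in the background, the same command can be launched under PM2 (a sketch; the process name is arbitrary, and PM2 must already be installed as described in the validator section below):

    pm2 start neurons/miners/miner.py --name smart_scrape_miner --interpreter python3 -- \
        --netuid 22 \
        --subtensor.network finney \
        --wallet.name <your miner wallet> \
        --wallet.hotkey <your miner hotkey> \
        --axon.port 14000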

Running the Validator API with Automatic Updates

These validators are designed to run and update themselves automatically. To run a validator, follow these steps:

  1. Install this repository by following the steps outlined in the Installation section.
  2. Install Weights and Biases and run wandb login within this repository. This initializes Weights and Biases, enabling you to view KPIs and metrics for your validator. (Strongly recommended, as the shared data helps improve the network.)
  3. Install PM2 and the jq package on your system. On Linux:
    sudo apt update && sudo apt install jq && sudo apt install npm && sudo npm install pm2 -g && pm2 update
    On macOS:
    brew update && brew install jq && brew install node && sudo npm install pm2 -g && pm2 update
  4. Run the run.sh script which will handle running your validator and pulling the latest updates as they are issued.
    pm2 start run.sh --name smart_scrape_validators_autoupdate -- --wallet.name <your-wallet-name> --wallet.hotkey <your-wallet-hot-key>

This will run two PM2 processes: one for the validator, called smart_scrape_validators_main_process by default (you can change this in run.sh), and one for the run.sh script itself (named smart_scrape_validators_autoupdate in step 4). The script checks for updates every 30 minutes; if an update is available, it pulls it, installs it, restarts smart_scrape_validators_main_process, and then restarts itself.
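
To confirm that both processes are running and to follow their output, the standard PM2 commands apply, for example:

    pm2 list                                          # shows both processes and their status
    pm2 logs smart_scrape_validators_main_process     # stream validator output
    pm2 logs smart_scrape_validators_autoupdate       # stream auto-update script output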

Detailed Setup Instructions

For step-by-step guidance on setting up and running a miner or validator, or on operating on the testnet or mainnet, refer to the following guides:


Real-time Monitoring with wandb Integration

The validator sends data to wandb, allowing real-time monitoring of key metrics such as:

  • Gating model loss
  • Hardware usage
  • Forward pass time
  • Block duration

Data is publicly available at this link. Note that data from anonymous users is deleted after 7 days.