If you have a lot of accounts and want this fully automated (reposting, running 24/7, scheduled tasks, and more) without a single input from you, get in touch with me below:
- Discord: grupii
- Telegram: https://t.me/grupiiiii
A Python-based tool for automating Twitter login and scraping group chat data using Playwright.
This project provides a set of utilities for Twitter automation, including:
- Automated login with support for various authentication scenarios
- Group chat data extraction
- User profile information collection
- MongoDB integration for data storage
- Headless Browser Automation: Uses Playwright for reliable browser automation
- Authentication Handling: Manages login, 2FA, and verification challenges
- API Response Capture: Intercepts network requests to extract group chat data (see the sketch after this list)
- Data Persistence: Stores authentication tokens, cookies, and scraped data in MongoDB
- Participant Tracking: Maps users to group chats with rich profile information
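To illustrate the response-capture approach, here is a minimal sketch using Playwright's `expect_response` helper. It assumes an already-authenticated browser session, and the `inbox_initial_state` URL fragment is an assumption about Twitter's DM endpoint rather than something taken from this repo:

```python
# Minimal sketch of capturing a DM inbox API response with Playwright.
# Assumes the browser session is already logged in; the URL fragment
# "inbox_initial_state" is an assumption, not taken from this repo.
import asyncio
from playwright.async_api import async_playwright

async def capture_inbox() -> dict:
    async with async_playwright() as p:
        browser = await p.chromium.launch(headless=False)
        page = await browser.new_page()

        # Opening the messages page triggers the inbox API call;
        # wait for that response and read its JSON body.
        async with page.expect_response(
            lambda r: "inbox_initial_state" in r.url
        ) as resp_info:
            await page.goto("https://twitter.com/messages")
        response = await resp_info.value
        data = await response.json()

        await browser.close()
        return data

if __name__ == "__main__":
    print(list(asyncio.run(capture_inbox()).keys()))
```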
Requirements:
- Python 3.7+
- MongoDB
- Playwright
- AsyncIO
- Clone the repository:

```bash
git clone https://github.com/yourusername/tw1.git
cd tw1
```

- Install dependencies:

```bash
pip install -r requirements.txt
playwright install chromium
```
- Make sure your `.env` file is in the repo and contains a `MONGO_URI` pointing to your MongoDB cluster (local or cloud):

```
MONGO_URI="mongodb://127.0.0.1:27017/"
```
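For reference, a minimal sketch of reading that variable and opening a connection; it assumes `python-dotenv` and `pymongo`, and the database name is made up here (the actual `db/connection.py` may differ):

```python
# Sketch: read MONGO_URI from .env and connect to MongoDB.
# Assumes python-dotenv and pymongo; the real db/connection.py may differ.
import os
from dotenv import load_dotenv
from pymongo import MongoClient

load_dotenv()  # reads .env from the current directory
client = MongoClient(os.environ["MONGO_URI"])
db = client["tw1"]  # database name is an assumption

# Quick connectivity check
client.admin.command("ping")
print("Connected to MongoDB")
```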
The login tool authenticates with Twitter and saves credentials for future use. (Run this first to save cookies & auth.)

```bash
python login.py -u your_username
```
Parameters:
- `-u, --username`: Twitter username
- `-p, --password`: Twitter password (optional, will prompt if not provided)
- `--proxy`: Proxy in format `protocol://user:pass@host:port`
- `--headless`: Run browser in headless mode (not recommended)
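A proxy string in that format can be turned into the settings dict that Playwright's `launch()` accepts. This is a rough sketch and may not match what `utils/proxyparser.py` actually does:

```python
# Sketch: convert "protocol://user:pass@host:port" into Playwright's
# proxy settings dict. The real utils/proxyparser.py may differ.
from urllib.parse import urlparse

def parse_proxy(proxy: str) -> dict:
    parsed = urlparse(proxy)
    settings = {"server": f"{parsed.scheme}://{parsed.hostname}:{parsed.port}"}
    if parsed.username:
        settings["username"] = parsed.username
        settings["password"] = parsed.password or ""
    return settings

# Usage: browser = await p.chromium.launch(proxy=parse_proxy("http://u:p@1.2.3.4:8080"))
print(parse_proxy("http://user:pass@127.0.0.1:8080"))
```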
Extracts group chat data and user information from Twitter's messaging endpoints. You'll need to run this multiple times, since the Twitter API only returns a portion of your group chats at a time. The results also include group chats you've been invited to but haven't accepted; the messaging tool won't send to these "untrusted" groups until they are "trusted".

```bash
python scraper.py -u your_username
```
Parameters:
- `-u, --username`: Twitter username to use for scraping
- `--headless`: Run browser in headless mode (not recommended)
Automatically sends messages to trusted Twitter group chats.
```bash
python messaging.py -u your_username -t templates.json
```

Either set up a templates file or update the list on line 13.
Parameters:
- `-u, --username`: Twitter username to use for sending messages
- `-g, --groups`: Specific group IDs to message (optional)
- `-t, --templates`: Path to JSON file with message templates
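For a sense of what sending a message looks like with Playwright, here is a hypothetical snippet; the conversation URL pattern and the `dmComposerTextInput` test id are assumptions about Twitter's markup, not code from this repo:

```python
# Hypothetical sketch of posting one message into a group conversation.
# The URL pattern and the "dmComposerTextInput" test id are assumptions
# and may not match Twitter's current markup.
async def send_to_group(page, conversation_id: str, message: str) -> None:
    await page.goto(f"https://twitter.com/messages/{conversation_id}")
    composer = page.locator('[data-testid="dmComposerTextInput"]')
    await composer.fill(message)
    await composer.press("Enter")  # Enter submits the DM composer
```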
Group chat data collected:
- Conversation ID
- Group name
- Creation time and creator
- Trust status
- Participant information
- Timestamps for data collection
User data collected:
- User ID and screen name
- Profile information (name, description, image URLs)
- Account statistics (followers, following, tweet count)
- Platform engagement data
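One plausible way to persist this in MongoDB is a single upsert per conversation, keyed on the conversation ID. The collection and field names below are assumptions, not the repo's actual schema:

```python
# Sketch: upsert one group chat document and attach participant profiles.
# Collection and field names are assumptions, not the repo's actual schema.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://127.0.0.1:27017/")
groups = client["tw1"]["group_chats"]

def save_group(conversation: dict, participants: list[dict]) -> None:
    groups.update_one(
        {"conversation_id": conversation["conversation_id"]},
        {
            "$set": {
                "name": conversation.get("name"),
                "trusted": conversation.get("trusted", False),
                "participants": participants,
                "scraped_at": datetime.now(timezone.utc),
            }
        },
        upsert=True,
    )
```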
```
tw1/
├── login.py            # Twitter authentication
├── scraper.py          # Group chat extraction
├── messaging.py        # Automated messaging
├── db/
│   └── connection.py   # MongoDB connection utilities
├── utils/
│   └── proxyparser.py  # Proxy configuration parser
└── templates/
    └── messages.json   # Sample message templates
```
- First-time login with credentials:

```bash
python login.py -u your_username
```

- Scrape group chats:

```bash
python scraper.py -u your_username
```

- Send messages to all trusted groups:

```bash
python messaging.py -u your_username
```
Create a JSON file with message templates:

```json
[
  "hit pinned please and add me to gif groups",
  "please don't skip, hit my pinned and recent please i check",
  "check out my latest content and hit pinned/recent"
]
```

Then use it with the messaging tool:

```bash
python messaging.py -u your_username -t your_templates.json
```
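One simple way such a file might be used is to pick a template at random for each group so repeated sends don't all use the same text; this is a sketch, not necessarily how messaging.py chooses:

```python
# Sketch: load the templates file and pick one entry at random.
# Illustrative only; messaging.py may choose templates differently.
import json
import random

def pick_message(path: str = "your_templates.json") -> str:
    with open(path, encoding="utf-8") as fh:
        templates = json.load(fh)
    return random.choice(templates)

print(pick_message())
```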
MIT License
This tool is for educational purposes only. Use responsibly and in accordance with Twitter's Terms of Service.