Twitter's API is annoying to work with and has lots of limitations. Luckily, their frontend (JavaScript) has its own API, which I reverse-engineered. No API rate limits. No tokens needed. No restrictions. Extremely fast.
You can use this library to get the text of any user's Tweets trivially.
Install with:

go get -u github.com/gay-wigger/twitter-scraper
Now all methods require authentication!
err := scraper.Login("username", "password")
Use your username to log in, not your email! If the account has email confirmation enabled, pass the email address as well:
err := scraper.Login("username", "password", "email")
If the account has two-factor authentication, pass the code as well:
err := scraper.Login("username", "password", "code")
The login status can be checked with:
scraper.IsLoggedIn()
Logout (clear session):
scraper.Logout()
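For reference, a minimal end-to-end sketch of the password flow, with placeholder credentials and deliberately simple error handling:

package main

import (
    "fmt"

    twitterscraper "github.com/gay-wigger/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    // Log in with username and password (add the email or 2FA code as extra arguments if needed).
    if err := scraper.Login("username", "password"); err != nil {
        panic(err)
    }
    fmt.Println("logged in:", scraper.IsLoggedIn())
    // Clear the session when done.
    scraper.Logout()
}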
If you want to save the session between restarts, you can save the cookies with scraper.GetCookies() and restore them with scraper.SetCookies().
For example, to save cookies:
cookies := scraper.GetCookies()
// serialize to JSON
js, _ := json.Marshal(cookies)
// save to file
f, _ := os.Create("cookies.json")
f.Write(js)
And to load cookies:
f, _ := os.Open("cookies.json")
// deserialize from JSON
var cookies []*http.Cookie
json.NewDecoder(f).Decode(&cookies)
// load cookies
scraper.SetCookies(cookies)
// check login status
scraper.IsLoggedIn()
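Putting those snippets together, a self-contained sketch of restoring a saved session and falling back to a fresh login (the cookies.json path and the helper names saveSession/loadSession are just illustrative):

package main

import (
    "encoding/json"
    "net/http"
    "os"

    twitterscraper "github.com/gay-wigger/twitter-scraper"
)

// saveSession serializes the scraper's cookies to a JSON file.
func saveSession(scraper *twitterscraper.Scraper, path string) error {
    js, err := json.Marshal(scraper.GetCookies())
    if err != nil {
        return err
    }
    return os.WriteFile(path, js, 0600)
}

// loadSession reads cookies back from the JSON file and restores them.
func loadSession(scraper *twitterscraper.Scraper, path string) error {
    data, err := os.ReadFile(path)
    if err != nil {
        return err
    }
    var cookies []*http.Cookie
    if err := json.Unmarshal(data, &cookies); err != nil {
        return err
    }
    scraper.SetCookies(cookies)
    return nil
}

func main() {
    scraper := twitterscraper.New()
    if err := loadSession(scraper, "cookies.json"); err != nil || !scraper.IsLoggedIn() {
        // No usable session on disk: log in and save a fresh one (placeholder credentials).
        if err := scraper.Login("username", "password"); err != nil {
            panic(err)
        }
        if err := saveSession(scraper, "cookies.json"); err != nil {
            panic(err)
        }
    }
}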
If you don't want to use your account, you can try to log in as a Twitter app:
err := scraper.LoginOpenAccount()
A full example that logs in as an app and fetches a user's tweets:

package main
import (
"context"
"fmt"
twitterscraper "github.com/gay-wigger/twitter-scraper"
)
func main() {
scraper := twitterscraper.New()
err := scraper.LoginOpenAccount()
if err != nil {
panic(err)
}
for tweet := range scraper.GetTweets(context.Background(), "Twitter", 50) {
if tweet.Error != nil {
panic(tweet.Error)
}
fmt.Println(tweet.Text)
}
}
It appears you can ask for up to 50 tweets.
Get a single tweet by ID:

package main
import (
"fmt"
twitterscraper "github.com/gay-wigger/twitter-scraper"
)
func main() {
scraper := twitterscraper.New()
err := scraper.Login(username, password)
if err != nil {
panic(err)
}
tweet, err := scraper.GetTweet("1328684389388185600")
if err != nil {
panic(err)
}
fmt.Println(tweet.Text)
}
Now the search only works for authenticated users!
Tweets containing "twitter", "scraper" and "data", filtering out retweets:
package main
import (
"context"
"fmt"
twitterscraper "github.com/gay-wigger/twitter-scraper"
)
func main() {
scraper := twitterscraper.New()
err := scraper.Login(username, password)
if err != nil {
panic(err)
}
for tweet := range scraper.SearchTweets(context.Background(),
"twitter scraper data -filter:retweets", 50) {
if tweet.Error != nil {
panic(tweet.Error)
}
fmt.Println(tweet.Text)
}
}
The search ends once 50 tweets have been found.
See Rules and filtering for building standard queries.
The search mode can be changed with:
scraper.SetSearchMode(twitterscraper.SearchLatest)
Options:
twitterscraper.SearchTop - default mode
twitterscraper.SearchLatest - live mode
twitterscraper.SearchPhotos - image mode
twitterscraper.SearchVideos - video mode
twitterscraper.SearchUsers - user mode
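For example, a sketch that switches to photo search before running a query (the credentials and the query string are placeholders):

package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/gay-wigger/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    // Authenticate first (see the login examples above).
    if err := scraper.Login("username", "password"); err != nil {
        panic(err)
    }
    // Switch to photo search: results will be tweets containing images.
    scraper.SetSearchMode(twitterscraper.SearchPhotos)
    for tweet := range scraper.SearchTweets(context.Background(), "golang", 20) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.Text)
    }
}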
Get a user profile:

package main
import (
"fmt"
twitterscraper "github.com/gay-wigger/twitter-scraper"
)
func main() {
scraper := twitterscraper.New()
scraper.LoginOpenAccount()
profile, err := scraper.GetProfile("Twitter")
if err != nil {
panic(err)
}
fmt.Printf("%+v\n", profile)
}
Search profiles by query:

package main
import (
"context"
"fmt"
twitterscraper "github.com/gay-wigger/twitter-scraper"
)
func main() {
scraper := twitterscraper.New().SetSearchMode(twitterscraper.SearchUsers)
err := scraper.Login(username, password)
if err != nil {
panic(err)
}
for profile := range scraper.SearchProfiles(context.Background(), "Twitter", 50) {
if profile.Error != nil {
panic(profile.Error)
}
fmt.Println(profile.Name)
}
}
Get current trends:

package main
import (
"fmt"
twitterscraper "github.com/gay-wigger/twitter-scraper"
)
func main() {
scraper := twitterscraper.New()
trends, err := scraper.GetTrends()
if err != nil {
panic(err)
}
fmt.Println(trends)
}
HTTP(S) and SOCKS5 proxies are supported. With an HTTP proxy:
err := scraper.SetProxy("http://localhost:3128")
if err != nil {
panic(err)
}
With a SOCKS5 proxy:

err := scraper.SetProxy("socks5://localhost:1080")
if err != nil {
panic(err)
}
Add a delay between API requests (in seconds):
scraper.WithDelay(5)
To load the timeline with tweet replies:

scraper.WithReplies(true)
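A sketch of how this combines with GetTweets, assuming the flag simply makes the returned timeline include reply tweets:

package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/gay-wigger/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    if err := scraper.LoginOpenAccount(); err != nil {
        panic(err)
    }
    // Include replies when loading the user's timeline (assumption: this affects GetTweets).
    scraper.WithReplies(true)
    for tweet := range scraper.GetTweets(context.Background(), "Twitter", 50) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.Text)
    }
}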