Online Twitter Basics

This notebook goes step-by-step through logging onto Twitter, live, retrieving tweets by keyword and analyzing them.

My thanks to Bill Howe of the University of Washington for his Coursera course Introduction to Data Analysis and Matthew A. Russell for his book Mining the Social Web for getting me started.

Note 1: I run the IPython notebook in pylab=inline mode which makes it look like Matlab; this may mean that you will have to explicitly import some modules that I take for granted. Alternatively, you could configure yourself for pylab, which I obviously think is better.

Note 2: Twitter very popular outside the United States. Many non-English languages have codesets that I don't handle properly leading to program failures. Keep calm and carry on: at worst, you'll have to restart the kernel.

Logon to Twitter

I have my Twitter credentials in a file twitter_credential.py ... you need to create your own as follows:

def twitter_credentials():
    """
    Generate your own Twitter credentials here:  https://apps.twitter.com/
    """
    api_key = " "
    api_secret = " "

    access_token_key = " "
    access_token_secret = " "

    return (api_key, api_secret, access_token_key, access_token_secret)

If you get a message similar to <twitter.api.Twitter object at 0x000000000E79AAC8>, you've succesfully signed on


In [1]:
import twitter_credentials
reload(twitter_credentials)
from twitter_credentials import twitter_credentials # my credentials
token, token_secret, consumer_key, consumer_secret = twitter_credentials()
#consumer_key, consumer_secret, token, token_secret = twitter_credentials()

import oauth2 as oauth
import urllib2 as urllib
import twitter

auth = twitter.oauth.OAuth(consumer_key, consumer_secret, token, token_secret)

twitter_api = twitter.Twitter(auth=auth)
print twitter_api


<twitter.api.Twitter object at 0x000000000E82CA90>

QT Console

I like to run the qtconsole whenever I am using a notebook because it allows me to experiment on a command-line interface to the notebook's session, with all its current variables, etc.


In [2]:
%qtconsole

twitter_functions.py

I have a file twitter_functions.py that contains a number of utility functions. Find it on my GitHub repo: grfiv/healthcare_twitter_analysis

You also need to have the AFINN-111.txt sentiment-word file installed.

A note on AFINN-111.txt: the file has a tab delimiter separating words and phrases and their score. You can add your own n-grams as long as you retain the formatting. I strip out extra white space in phrases so you don't have to be exact in that regard. The sentiment score is simply the sum of the scores of any n-grams found in the text after urls, hashtags and users have been removed.

Make a simple query

Modify the word or phrase in q = " ... ". Only one term.
Try changing num_tweet_request, too.


In [3]:
import sys

num_tweet_request = 500

q = "Lung Cancer"

try:
    treturn = twitter_api.search.tweets(q=q, count=num_tweet_request)
except:
    print sys.exc_info()
initial_results = treturn['statuses']

Filter by retweet_count

The idea is that the the best tweets are ones that have been retweeted. So we will keep only those tweets that have a retweet_count greater than some threshold.

favorite_count is also supposed to have some bearing on importance but it's usually zero in the ones I've looked at.


In [4]:
retweet_threshold = 25

results = [tweet 
           for tweet in initial_results 
           if tweet['retweet_count'] > retweet_threshold]

num_tweets = len(results)
print "%d tweets received"%len(initial_results)
print "%d tweets after retweet filtering"%len(results)


100 tweets received
8 tweets after retweet filtering

Display the raw json returned for one of the num_tweets results we asked for


In [ ]:
# I commented this out because the list is very long
# However, to do any serious Twitter munging, you need to know all about this thing

#print json.dumps(results[1], indent=4)

Parse the 'text' field, which is the body of the tweet, for each of the num_tweets responses returned

I use my utility function parse_tweet_text for the body of the tweet itself; it also returns the AFINN sentiment, for which you must have AFINN-111.txt installed. I parse the json directly for the other fields.

A response of [] means nothing was returned for a particular 'entity'.


In [6]:
from twitter_functions import parse_tweet_text 

senders = []
for i in range(num_tweets):
    senders.append(results[i]['user']['name'])
    
    tweet_text = results[i]['text']
    words, hashes, users, urls = parse_tweet_text(tweet_text)

    print "the actual text of results[%d]['text']\nretweet_count  = %d\nfavorite_count = %d\n...................\n%s"%(i,results[i]['retweet_count'],results[i]['favorite_count'],tweet_text)
    print "\nthe list of words %s"%[word.encode('utf-8') for word in words]
    print "hashtags referenced %s"%[hash_.encode('utf-8') for hash_ in hashes]
    print "twitter users mentioned %s"%[user.encode('utf-8') for user in users]
    print "URLs %s"%[url.encode('utf-8') for url in urls]
    print "............................"
    print "description: %s"%results[i]['user']['description']
    print "followers_count: %d"%results[i]['user']['followers_count']
    print "friends_count: %d"%results[i]['user']['friends_count']
    print "favourites_count: %d"%results[i]['user']['favourites_count']
    print "location: %s"%results[i]['user']['location']
    print "time_zone: %s"%results[i]['user']['time_zone']
    print "screen_name: %s"%results[i]['user']['screen_name']
    print "name: %s"%results[i]['user']['name']
    if results[i]['place'] is not None: print "place: %s"%results[i]['place']
    print "====================================================================================\n"


the actual text of results[0]['text']
retweet_count  = 359
favorite_count = 0
...................
RT @drwgmawr: [At the hospital, dying]
You've got lung cancer. Probably from smoking.
'Did I...look...cool...?'
Yes...you looked awesome.
*…

the list of words ['rt', '[at', 'the', 'hospital', 'dying]', "you've", 'got', 'lung', 'cancer', 'probably', 'from', 'smoking', "'did", "i...look...cool...?'", 'yes...you', 'looked', 'awesome', '*\xe2\x80\xa6']
hashtags referenced []
twitter users mentioned ['drwgmawr']
URLs []
............................
description: do breast because you to be streamlined?
followers_count: 49
friends_count: 188
favourites_count: 31278
location: the stank pit
time_zone: Arizona
screen_name: Blosnop
name: cool mom
====================================================================================

the actual text of results[1]['text']
retweet_count  = 143
favorite_count = 0
...................
RT @WATNCelebrity: RIP Grandma Betty

Instagram's favorite nan has passed away following her 8 month battle with lung cancer aged 80. http:…

the list of words ['rt', 'rip', 'grandma', 'betty', "instagram's", 'favorite', 'nan', 'has', 'passed', 'away', 'following', 'her', '8', 'month', 'battle', 'with', 'lung', 'cancer', 'aged', '80', 'http:\xe2\x80\xa6']
hashtags referenced []
twitter users mentioned ['watncelebrity']
URLs []
............................
description: Passion, pride, pursuit of excellence; only part of what makes a Trojan. Trojans never die. The eternal ability to Fight On.
followers_count: 789
friends_count: 969
favourites_count: 2431
location: 
time_zone: None
screen_name: USCgoldstandard
name: LordTrojan
====================================================================================

the actual text of results[2]['text']
retweet_count  = 359
favorite_count = 0
...................
RT @drwgmawr: [At the hospital, dying]
You've got lung cancer. Probably from smoking.
'Did I...look...cool...?'
Yes...you looked awesome.
*…

the list of words ['rt', '[at', 'the', 'hospital', 'dying]', "you've", 'got', 'lung', 'cancer', 'probably', 'from', 'smoking', "'did", "i...look...cool...?'", 'yes...you', 'looked', 'awesome', '*\xe2\x80\xa6']
hashtags referenced []
twitter users mentioned ['drwgmawr']
URLs []
............................
description: | Fayto | 23 | ♋ | INFP | Canada | Misanthropic Meganekko | NEET Hikikomori | Dorky Fangirl |
followers_count: 194
friends_count: 184
favourites_count: 11538
location: Saturn's rings
time_zone: Central Time (US & Canada)
screen_name: mistrene
name: egg apologist
====================================================================================

the actual text of results[3]['text']
retweet_count  = 359
favorite_count = 0
...................
RT @drwgmawr: [At the hospital, dying]
You've got lung cancer. Probably from smoking.
'Did I...look...cool...?'
Yes...you looked awesome.
*…

the list of words ['rt', '[at', 'the', 'hospital', 'dying]', "you've", 'got', 'lung', 'cancer', 'probably', 'from', 'smoking', "'did", "i...look...cool...?'", 'yes...you', 'looked', 'awesome', '*\xe2\x80\xa6']
hashtags referenced []
twitter users mentioned ['drwgmawr']
URLs []
............................
description: Strange lesbian gender blob. I own a cat in the interest of sounding like a real human being. I like Pokemon and Loud Gay Angst. They/them (prefer) and she/her
followers_count: 25
friends_count: 112
favourites_count: 192
location: 
time_zone: Atlantic Time (Canada)
screen_name: nightmareesh
name: Zoe☆
====================================================================================

the actual text of results[4]['text']
retweet_count  = 359
favorite_count = 0
...................
RT @drwgmawr: [At the hospital, dying]
You've got lung cancer. Probably from smoking.
'Did I...look...cool...?'
Yes...you looked awesome.
*…

the list of words ['rt', '[at', 'the', 'hospital', 'dying]', "you've", 'got', 'lung', 'cancer', 'probably', 'from', 'smoking', "'did", "i...look...cool...?'", 'yes...you', 'looked', 'awesome', '*\xe2\x80\xa6']
hashtags referenced []
twitter users mentioned ['drwgmawr']
URLs []
............................
description: Gone's the name and pixel art's the game. also computers. and games. and cooking - King of Fast - XNTJ - Washington Cutie's hiPS THO
followers_count: 910
friends_count: 781
favourites_count: 43936
location: Prospit and Lweasboon
time_zone: Lisbon
screen_name: GoneSiIver
name: Classy Mr. Papa John
====================================================================================

the actual text of results[5]['text']
retweet_count  = 359
favorite_count = 0
...................
RT @drwgmawr: [At the hospital, dying]
You've got lung cancer. Probably from smoking.
'Did I...look...cool...?'
Yes...you looked awesome.
*…

the list of words ['rt', '[at', 'the', 'hospital', 'dying]', "you've", 'got', 'lung', 'cancer', 'probably', 'from', 'smoking', "'did", "i...look...cool...?'", 'yes...you', 'looked', 'awesome', '*\xe2\x80\xa6']
hashtags referenced []
twitter users mentioned ['drwgmawr']
URLs []
............................
description: Pet my quiff and call me Fuyutsexy it's hot in here. Peppermint supplier and Vice commander of NERV. I bet you want my special K. I am daddy Fuyu to everyone.UK
followers_count: 311
friends_count: 240
favourites_count: 21912
location: Not contained by NERV
time_zone: London
screen_name: Kozo_Fuyutsexy
name: Actually Fuyutsuki
====================================================================================

the actual text of results[6]['text']
retweet_count  = 359
favorite_count = 0
...................
RT @drwgmawr: [At the hospital, dying]
You've got lung cancer. Probably from smoking.
'Did I...look...cool...?'
Yes...you looked awesome.
*…

the list of words ['rt', '[at', 'the', 'hospital', 'dying]', "you've", 'got', 'lung', 'cancer', 'probably', 'from', 'smoking', "'did", "i...look...cool...?'", 'yes...you', 'looked', 'awesome', '*\xe2\x80\xa6']
hashtags referenced []
twitter users mentioned ['drwgmawr']
URLs []
............................
description: follow for more. not sure what but there will be more of it
followers_count: 501
friends_count: 237
favourites_count: 23313
location: ⋆⋆⋆✩⋆⋆⋆
time_zone: Eastern Time (US & Canada)
screen_name: pupzhii
name: ⋆izzy⋆
====================================================================================

the actual text of results[7]['text']
retweet_count  = 359
favorite_count = 0
...................
RT @drwgmawr: [At the hospital, dying]
You've got lung cancer. Probably from smoking.
'Did I...look...cool...?'
Yes...you looked awesome.
*…

the list of words ['rt', '[at', 'the', 'hospital', 'dying]', "you've", 'got', 'lung', 'cancer', 'probably', 'from', 'smoking', "'did", "i...look...cool...?'", 'yes...you', 'looked', 'awesome', '*\xe2\x80\xa6']
hashtags referenced []
twitter users mentioned ['drwgmawr']
URLs []
............................
description: kanii / 15 / f (cis) / she/her / pan / albanian/kosovar || matching icons with @strideres || too gay to function w/ @NeighChuify || never f(ieri)orget
followers_count: 959
friends_count: 754
favourites_count: 9159
location: windsor, ontario, canada
time_zone: Eastern Time (US & Canada)
screen_name: kawoswag
name: kan(ii)ye west 
====================================================================================

twitter_functions.py:303: UnicodeWarning: Unicode equal comparison failed to convert both arguments to Unicode - interpreting them as being unequal
  if word in ['.',':','!',',',';',"-","-","?",'\xe2\x80\xa6',"!","|",'"','~','..','/','&','<','>']: continue

Let's do a quick lexical analysis of these tweets

Step 1: Collect all the tweets in one string

Note: the AFINN score will be for the entire string, which may or may not be helpful


In [9]:
tweets = ""
for i in range(num_tweets):
    tweets = tweets + " " + results[i]['text']

words, hashes, users, urls = parse_tweet_text(tweets)

Step 2: Remove the stop words

Note: English stop words won't help in Espanish


In [10]:
from nltk.corpus import stopwords

stop = stopwords.words('english')
stop.append("it's")
stop.append('w/')

filtered_word_list = [word for word in words if word not in stop]
filtered_word_set  = set(filtered_word_list)

print "%d  total words"%len(filtered_word_list)
print "%d unique words"%len(filtered_word_set)


130  total words
31 unique words

Step 3: Show the top 10 of each entity type

Note: 'rt' means retweet and it's very common, but you may not consider it a word


In [11]:
from collections import Counter
from prettytable import PrettyTable

for label, data in (('Word', filtered_word_list), 
                    ('Senders', senders), 
                    ('Users Mentioned', users),
                    ('Hashtag', hashes),
                    ('URL', urls)):
    pt = PrettyTable(field_names=[label, 'Count']) 
    c = Counter(data)
    [ pt.add_row(kv) for kv in c.most_common()[:10] ]
    pt.align[label], pt.align['Count'] = 'l', 'r' # Set column alignment
    print pt


+----------------------+-------+
| Word                 | Count |
+----------------------+-------+
| cancer               |     8 |
| lung                 |     8 |
| rt                   |     8 |
| yes...you            |     7 |
| awesome              |     7 |
| hospital             |     7 |
| probably             |     7 |
| i...look...cool...?' |     7 |
| got                  |     7 |
| you've               |     7 |
+----------------------+-------+
+----------------------+-------+
| Senders              | Count |
+----------------------+-------+
| egg apologist        |     1 |
| kan(ii)ye west       |     1 |
| Zoe☆                 |     1 |
| Actually Fuyutsuki   |     1 |
| LordTrojan           |     1 |
| ⋆izzy⋆               |     1 |
| cool mom             |     1 |
| Classy Mr. Papa John |     1 |
+----------------------+-------+
+-----------------+-------+
| Users Mentioned | Count |
+-----------------+-------+
| drwgmawr        |     7 |
| watncelebrity   |     1 |
+-----------------+-------+
+---------+-------+
| Hashtag | Count |
+---------+-------+
+---------+-------+
+-----+-------+
| URL | Count |
+-----+-------+
+-----+-------+

Step 4: Lexical Diversity


In [12]:
from twitter_functions import lexical_diversity, average_words

print "Lexical Diversity"
print "words  %0.2f"%lexical_diversity(set(words), words) # words not filtered_word_list
print "hashes %0.2f"%lexical_diversity(set(hashes), hashes)
print "users  %0.2f"%lexical_diversity(set(users), users)
print "urls   %0.2f"%lexical_diversity(set(urls), urls)

print "\nAverage per tweet"
print "words           %0.2f"%average_words(words, num_tweets)
print "filtered words  %0.2f"%average_words(filtered_word_list, num_tweets)
print "hashes          %0.2f"%average_words(hashes, num_tweets)
print "users           %0.2f"%average_words(users, num_tweets)
print "urls            %0.2f"%average_words(urls, num_tweets)


Lexical Diversity
words  0.24
hashes 0.00
users  0.25
urls   0.00

Average per tweet
words           18.38
filtered words  16.25
hashes          0.00
users           1.00
urls            0.00

Step 5: Can't finish without a word cloud, right?

Note 1: foreign-language codes often cause this to fail.

Note 2: the lexical density determines the look of the result: a few words in a mass of many infrequent words makes for a more attractive picture. Playing with the numerical parameter to num_tags also has an effect.


In [13]:
from pytagcloud import create_tag_image, make_tags
import IPython.display

# 'rt' is very common and useless
filter_ex_rt = list(filtered_word_list)
while 'rt' in filter_ex_rt: filter_ex_rt.remove('rt')

for item in [filter_ex_rt]:
    c = Counter(item)

num_tags = min(150, len(c.most_common())-1)
tags     = make_tags(c.most_common()[:num_tags], maxsize=120)

create_tag_image(tags, 'wordcloud.png', size=(900,900), fontname='Lobster' )

IPython.display.display(IPython.display.Image(filename='wordcloud.png'))


Look at the json for the project

Read in one of the json files and create a list of all the tweets


In [15]:
import json
tweet_file = open("../files/bigtweet_file002.json", "r")
tweet_list = [json.loads(str(line)) for line in tweet_file]

len(tweet_list)


Out[15]:
1244

Filter: unique tweets, retweeted more than retweet_threshold times


In [16]:
text_set          = set()
popular_tweets    = []
retweet_threshold = 199

for tweet in tweet_list:
    if tweet['retweet_count'] > retweet_threshold and tweet['text'] not in text_set:
        text_set.add(tweet['text'])
        popular_tweets.append(tweet)
        
len(popular_tweets)


Out[16]:
0

In [17]:
for tweet in popular_tweets:
    print "retweets: %d; user name: %s; screen name: %s\ntext: %s\n"%\
    (tweet['retweet_count'],tweet['user']['name'],tweet['user']['screen_name'],tweet['text'])

Let's look up WorldAIDSDayUS


In [18]:
import sys
try:
    users = twitter_api.users.lookup(screen_name='WorldAIDSDayUS')
except:
    emsg = sys.exc_info()
    print emsg
user = users[0]

In [119]:
print user['name']
print user['description']
print user["location"]
print user["followers_count"]
print user["friends_count"]
print user["statuses_count"]


World AIDS Day
Be a supporter: Follow us & RT any tweets that you connect with. You can also talk to us; we'd love that too. #worldaidsdayus
Detroit, MI
342
314
421

Let's find all the followers of World AIDS Day, screen name WorldAIDSDayUS


In [19]:
import sys
import json
try:
    followers = twitter_api.followers.ids(screen_name='WorldAIDSDayUS')
except:
    emsg = sys.exc_info()[1]
    print emsg.e
    print emsg.response_data
print json.dumps(followers,indent=4)


{
    "next_cursor_str": "0", 
    "previous_cursor": 0, 
    "ids": [
        195389374, 
        2675359034, 
        764200507, 
        82935993, 
        128667281, 
        2283671491, 
        2647943364, 
        1541453431, 
        105530298, 
        1376777845, 
        1193386063, 
        2577655255, 
        101872680, 
        808810952, 
        373209275, 
        20767418, 
        546427350, 
        1376745775, 
        611960552, 
        2469609312, 
        2407694624, 
        2472881323, 
        185576252, 
        1465216634, 
        605535912, 
        2285714510, 
        509263399, 
        2463488694, 
        2278096644, 
        2421174531, 
        2293387526, 
        2456825107, 
        229553670, 
        2450012678, 
        197543826, 
        226662318, 
        44004125, 
        17272735, 
        2360215847, 
        1534563864, 
        321466849, 
        199358449, 
        52559650, 
        2442629124, 
        35099519, 
        88258912, 
        91298698, 
        21405294, 
        2353286928, 
        351045168, 
        16443199, 
        27330602, 
        626333479, 
        176418075, 
        23273418, 
        2313689048, 
        296945357, 
        574738274, 
        298150971, 
        518744362, 
        136337303, 
        240814238, 
        31185679, 
        587821634, 
        867925855, 
        589656990, 
        28697382, 
        29282292, 
        459647073, 
        48619626, 
        33855028, 
        119028262, 
        473363659, 
        1493052828, 
        378595093, 
        486760024, 
        484399952, 
        165061127, 
        614394728, 
        16351985, 
        329159493, 
        588035925, 
        16733220, 
        734295223, 
        2331191636, 
        37517729, 
        543100042, 
        709410859, 
        173618370, 
        72850869, 
        359700577, 
        219844156, 
        445948611, 
        1070098525, 
        39804332, 
        778361120, 
        336395471, 
        394168530, 
        71670623, 
        517074349, 
        1862614951, 
        197620898, 
        212675564, 
        61380690, 
        2414851572, 
        2382843866, 
        139293290, 
        2367546340, 
        1286286968, 
        569535555, 
        2358644534, 
        2352918288, 
        28184011, 
        166684510, 
        2307892154, 
        2289217188, 
        2288700246, 
        28874028, 
        1525342994, 
        2270301583, 
        2241275954, 
        930243738, 
        42659389, 
        505254241, 
        2221219284, 
        2178136674, 
        2211750831, 
        2232324578, 
        20125535, 
        1617908406, 
        2220246115, 
        126729291, 
        21788991, 
        1961252232, 
        2226965178, 
        28427014, 
        2159289018, 
        1624891369, 
        424939667, 
        47998954, 
        1711482636, 
        495283643, 
        159113107, 
        34280715, 
        2159367186, 
        307103173, 
        1859560908, 
        46041178, 
        598083936, 
        223475297, 
        142843687, 
        20357537, 
        1355842375, 
        1594391604, 
        2208999444, 
        517693044, 
        382266933, 
        89404689, 
        1606449200, 
        88827567, 
        2176283142, 
        359189186, 
        556027829, 
        2156296571, 
        185382579, 
        1752007092, 
        1367365358, 
        1588670383, 
        1004677340, 
        549943155, 
        1359925783, 
        63746842, 
        363235361, 
        160590087, 
        145454774, 
        400204929, 
        61176813, 
        305904609, 
        905003126, 
        525884721, 
        38273599, 
        1383540984, 
        55150252, 
        2191644156, 
        2198463300, 
        169732925, 
        2194570832, 
        2194476224, 
        48882533, 
        19629994, 
        29519415, 
        113715664, 
        338499873, 
        1923674618, 
        2176466798, 
        70697260, 
        1644572677, 
        22276194, 
        21227559, 
        589247365, 
        133039609, 
        15368404, 
        1967942803, 
        489851075, 
        1971503828, 
        1599056852, 
        564000560, 
        19608297, 
        817750260, 
        1962927193, 
        1396954464, 
        1949127900, 
        57285530, 
        1621800300, 
        61610186, 
        913932240, 
        74354125, 
        1499681311, 
        238803918, 
        331846094, 
        176950313, 
        1365485911, 
        1312178624, 
        1330058310, 
        20373507, 
        1230727519, 
        568524382, 
        846695917, 
        301455101, 
        557483911, 
        1115232775, 
        347428794, 
        19770949, 
        624168118, 
        14420911, 
        862153513, 
        57843462, 
        56610105, 
        483701437, 
        577609337, 
        615957642, 
        279986432, 
        595485640, 
        421496539, 
        217521406, 
        519090412, 
        25138024, 
        495492153, 
        421643092, 
        116771362, 
        432642030, 
        414269741, 
        22563878, 
        70043815, 
        218927637, 
        52464795, 
        392794288, 
        346787124, 
        20039495, 
        196378622, 
        22951778, 
        270532560, 
        85104567, 
        426345047, 
        252352684, 
        110259546, 
        297774715, 
        273292902, 
        349178709, 
        80597452, 
        64527922, 
        33089870, 
        48423946, 
        17467699, 
        44988271, 
        28459177, 
        243891740, 
        14304099, 
        346333184, 
        240775015, 
        386780790, 
        24695651, 
        42334975, 
        328296035, 
        117791218, 
        95745642, 
        204939159, 
        67975865, 
        113130893, 
        213328736, 
        253102236, 
        249164327, 
        166566952, 
        259301920, 
        25032325, 
        120175490, 
        191118596, 
        98254428, 
        40282600, 
        24835575, 
        101452460, 
        38121471, 
        222673639, 
        284628793, 
        26008576, 
        213459090, 
        300828991, 
        76717455, 
        17929843, 
        57933964, 
        26436145, 
        33947545, 
        214691411, 
        405613790, 
        64503201, 
        241696179, 
        118885197, 
        85483134, 
        85996512, 
        165281068, 
        73555221, 
        18020516, 
        195391489, 
        327797896, 
        251927528, 
        378914069, 
        26395279, 
        38347589, 
        86680058, 
        39823988, 
        399006956, 
        161851857, 
        35233725, 
        164144287, 
        129718305, 
        27336408, 
        30083429, 
        390754702, 
        245857720, 
        194215491, 
        380499123, 
        14634117, 
        32774989, 
        17659312, 
        20005711
    ], 
    "next_cursor": 0, 
    "previous_cursor_str": "0"
}

In [20]:
followers['ids']


Out[20]:
[195389374,
 2675359034L,
 764200507,
 82935993,
 128667281,
 2283671491L,
 2647943364L,
 1541453431,
 105530298,
 1376777845,
 1193386063,
 2577655255L,
 101872680,
 808810952,
 373209275,
 20767418,
 546427350,
 1376745775,
 611960552,
 2469609312L,
 2407694624L,
 2472881323L,
 185576252,
 1465216634,
 605535912,
 2285714510L,
 509263399,
 2463488694L,
 2278096644L,
 2421174531L,
 2293387526L,
 2456825107L,
 229553670,
 2450012678L,
 197543826,
 226662318,
 44004125,
 17272735,
 2360215847L,
 1534563864,
 321466849,
 199358449,
 52559650,
 2442629124L,
 35099519,
 88258912,
 91298698,
 21405294,
 2353286928L,
 351045168,
 16443199,
 27330602,
 626333479,
 176418075,
 23273418,
 2313689048L,
 296945357,
 574738274,
 298150971,
 518744362,
 136337303,
 240814238,
 31185679,
 587821634,
 867925855,
 589656990,
 28697382,
 29282292,
 459647073,
 48619626,
 33855028,
 119028262,
 473363659,
 1493052828,
 378595093,
 486760024,
 484399952,
 165061127,
 614394728,
 16351985,
 329159493,
 588035925,
 16733220,
 734295223,
 2331191636L,
 37517729,
 543100042,
 709410859,
 173618370,
 72850869,
 359700577,
 219844156,
 445948611,
 1070098525,
 39804332,
 778361120,
 336395471,
 394168530,
 71670623,
 517074349,
 1862614951,
 197620898,
 212675564,
 61380690,
 2414851572L,
 2382843866L,
 139293290,
 2367546340L,
 1286286968,
 569535555,
 2358644534L,
 2352918288L,
 28184011,
 166684510,
 2307892154L,
 2289217188L,
 2288700246L,
 28874028,
 1525342994,
 2270301583L,
 2241275954L,
 930243738,
 42659389,
 505254241,
 2221219284L,
 2178136674L,
 2211750831L,
 2232324578L,
 20125535,
 1617908406,
 2220246115L,
 126729291,
 21788991,
 1961252232,
 2226965178L,
 28427014,
 2159289018L,
 1624891369,
 424939667,
 47998954,
 1711482636,
 495283643,
 159113107,
 34280715,
 2159367186L,
 307103173,
 1859560908,
 46041178,
 598083936,
 223475297,
 142843687,
 20357537,
 1355842375,
 1594391604,
 2208999444L,
 517693044,
 382266933,
 89404689,
 1606449200,
 88827567,
 2176283142L,
 359189186,
 556027829,
 2156296571L,
 185382579,
 1752007092,
 1367365358,
 1588670383,
 1004677340,
 549943155,
 1359925783,
 63746842,
 363235361,
 160590087,
 145454774,
 400204929,
 61176813,
 305904609,
 905003126,
 525884721,
 38273599,
 1383540984,
 55150252,
 2191644156L,
 2198463300L,
 169732925,
 2194570832L,
 2194476224L,
 48882533,
 19629994,
 29519415,
 113715664,
 338499873,
 1923674618,
 2176466798L,
 70697260,
 1644572677,
 22276194,
 21227559,
 589247365,
 133039609,
 15368404,
 1967942803,
 489851075,
 1971503828,
 1599056852,
 564000560,
 19608297,
 817750260,
 1962927193,
 1396954464,
 1949127900,
 57285530,
 1621800300,
 61610186,
 913932240,
 74354125,
 1499681311,
 238803918,
 331846094,
 176950313,
 1365485911,
 1312178624,
 1330058310,
 20373507,
 1230727519,
 568524382,
 846695917,
 301455101,
 557483911,
 1115232775,
 347428794,
 19770949,
 624168118,
 14420911,
 862153513,
 57843462,
 56610105,
 483701437,
 577609337,
 615957642,
 279986432,
 595485640,
 421496539,
 217521406,
 519090412,
 25138024,
 495492153,
 421643092,
 116771362,
 432642030,
 414269741,
 22563878,
 70043815,
 218927637,
 52464795,
 392794288,
 346787124,
 20039495,
 196378622,
 22951778,
 270532560,
 85104567,
 426345047,
 252352684,
 110259546,
 297774715,
 273292902,
 349178709,
 80597452,
 64527922,
 33089870,
 48423946,
 17467699,
 44988271,
 28459177,
 243891740,
 14304099,
 346333184,
 240775015,
 386780790,
 24695651,
 42334975,
 328296035,
 117791218,
 95745642,
 204939159,
 67975865,
 113130893,
 213328736,
 253102236,
 249164327,
 166566952,
 259301920,
 25032325,
 120175490,
 191118596,
 98254428,
 40282600,
 24835575,
 101452460,
 38121471,
 222673639,
 284628793,
 26008576,
 213459090,
 300828991,
 76717455,
 17929843,
 57933964,
 26436145,
 33947545,
 214691411,
 405613790,
 64503201,
 241696179,
 118885197,
 85483134,
 85996512,
 165281068,
 73555221,
 18020516,
 195391489,
 327797896,
 251927528,
 378914069,
 26395279,
 38347589,
 86680058,
 39823988,
 399006956,
 161851857,
 35233725,
 164144287,
 129718305,
 27336408,
 30083429,
 390754702,
 245857720,
 194215491,
 380499123,
 14634117,
 32774989,
 17659312,
 20005711]

In [ ]: