Text manipulation

Text manipulation is quite simplified in Python thanks to a wide variety of packages. It is generally advisable not to reinvent the wheels, so only perform quick and dirty text parsing when it is really necessary.

File IO, streaming, serialization

Here is a basic raw text file opening template in Python.


In [ ]:
# f =  open('path/to/file.txt','r')
# #f.readlines()
# for l in f:
#     #do stuff
#     print l
# f.close()

What if we want to read text from the standard input? (Useful for pipelining)


In [1]:
#use ike this: cat file.txt | python script.py
import sys
for line in sys.stdin:
    # do suff
    print line

# there is a dedicated module to text IO
#import io
#t = io.TextIO()


  File "<ipython-input-1-dd343f7516e3>", line 5
    print line
             ^
SyntaxError: Missing parentheses in call to 'print'

Pickling

This in Python jargon means object serialization, a very important feature allowing you to save on disk the contents of a Python datastructure directly, in a specially compressed or sometimes binary format.


In [2]:
d = {'first': [1,"two"], 'second': set([3, 4, 'five'])}
import pickle
with open('dumpfile.pkl','wb') as fout:
    pickle.dump(d, fout)
with open('dumpfile.pkl','rb') as fin:
    d2 = pickle.load(fin)
print(d2)


{'first': [1, 'two'], 'second': {'five', 3, 4}}

JSON

A short word for JavaScript Object Notation, .json became ubiquitous as a simple data interchange format mainly in remote Web API calls and microtransactions. Json is easily loaded into native Python datastructures. An example:


In [3]:
import json
#json_string = json.dumps([1, 2, 3, "a", "b", "c"])
d = {'first': [1,"two"], 'second': [3, 4, 'five']}
json_string = json.dumps(d)
print(json_string)


{"first": [1, "two"], "second": [3, 4, "five"]}

Parsing and regular expressions

Used for any raw text format in biology, such as (FASTA, FASTAQ, PDB, VCF, GFF, SAM).

Example: FASTA parsing

Open the file containing all peptide sequences in the human body. How many unknown peptides does it contain? How many unique genes and transcripts are in there for the unknown peptides? Output a tab separated file containing the gene id and transcript id for each unknown peptide.

Observation: Usage of Biopython and pandas modules.

Task: Order the chromosomes by the number of unknown peptides versus the total number of peptides they translate.

ENSP00000388523 pep:known chromosome:GRCh38:7:142300924:142301432:1 gene:ENSG00000226660 transcript:ENST00000455382 gene_biotype:TR_V_gene transcript_biotype:TR_V_gene MDTWLVCWAIFSLLKAGLTEPEVTQTPSHQVTQMGQEVILRCVPISNHLYFYWYRQILGQ KVEFLVSFYNNEISEKSEIFDDQFSVERPDGSNFTLKIRSTKLEDSAMYFCASSE


In [19]:
import sys
f = open('data/Homo_sapiens.GRCh38.pep.all.fa','r')
peptides = {}
for l in f:
    if l[0]=='>':
        #print l.strip().split()
        record = {}
        r = l.strip('\n').split()
        pepid = r[0][1:]
        record['pep'] = 1 if r[1].split(':')[1]=='known' else 0
        record['gene'] = r[3].split(':')[1]
        record['transcript'] = r[4].split(':')[1]
        peptides[pepid] = record
f.close()

##using regular expressions to match all known peptides
nupep2 = 0
import re
#pattern = re.compile('^>.*(known).*')
pattern = re.compile('^>((?!known).)*$')
with open('data/Homo_sapiens.GRCh38.pep.all.fa','r') as f:
    for l in f:
        if pattern.search(l) is not None: nupep2 += 1 

npep = len(peptides)
upep = set([pepid for pepid in peptides if peptides[pepid]['pep']==0]) #unknown peptides
nunknown = len(upep)
genes = set([peptides[pepid]['gene'] for pepid in upep])
trans = set([peptides[pepid]['transcript'] for pepid in upep])
print npep, nupep2, nunknown, len(genes), len(ntrans)


with open('unknown_peptides.txt','w') as f:
    for pepid in upep:
        f.write('\t'.join([pepid, peptides[pepid]['gene'], peptides[pepid]['transcript']])+'\n')


99436 28828 28828 11116 70608

In [3]:
f = open('data/Homo_sapiens.GRCh38.pep.all.fa','r')
from Bio import SeqIO
fasta = SeqIO.parse(f,'fasta')

i = 0
name, sequence = fasta.id, fasta.seq.tostring()
if len(sequence)<100 and len(sequence)>20:
    i += 1
    print i
    print "Name",name
    print "Sequence",sequence
    if i > 5: break
f.close()


---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-3-045956fa1344> in <module>()
      4 
      5 i = 0
----> 6 name, sequence = fasta.id, fasta.seq.tostring()
      7 if len(sequence)<100 and len(sequence)>20:
      8     i += 1

AttributeError: 'generator' object has no attribute 'id'

XML parsing

XML is a general file format used for data interchange, especially among different applications. One of the most popular use in Biology is the SBML format, that aims to store a biological model specification, no matter how specific that model may be.

Task:

Download a curated SBML file from the BioModels database: http://www.ebi.ac.uk/biomodels-main/

Find out how many reactions the file contains.

Extra task:

Make a simplified XML file of the reactants and their k-values for each reaction.


In [13]:
import sys

import xml.etree.ElementTree as ET
tree = ET.ElementTree(file='data/curated_sbml.xml')
#tree = ET.parse(open('data/curated_sbml.xml'))
root = tree.getroot()
print root.tag, root.attrib
for child in root:
    print child.tag, child.attrib
    for child2 in child:
        print child2.tag, child2.attrib

#print tree.write(sys.stdout)
for elem in root.iter('reaction'):
    print elem.tag, elem.attrib

for elem in root.iter('species'):
    print elem.tag, elem.attrib
    print elem.get('id')

print tree.findall('.//reaction')


sbml {'version': '4', 'metaid': '_000000', 'level': '2'}
model {'id': 'BIOMD0000000001', 'metaid': '_000001', 'name': 'Edelstein1996 - EPSP ACh event'}
notes {}
annotation {}
listOfCompartments {}
listOfSpecies {}
listOfParameters {}
listOfReactions {}
listOfEvents {}
reaction {'name': 'React0', 'id': 'React0', 'metaid': '_000016', 'sboTerm': 'SBO:0000177'}
reaction {'name': 'React1', 'id': 'React1', 'metaid': '_000017', 'sboTerm': 'SBO:0000177'}
reaction {'name': 'React2', 'id': 'React2', 'metaid': '_000018', 'sboTerm': 'SBO:0000181'}
reaction {'name': 'React3', 'id': 'React3', 'metaid': '_000019', 'sboTerm': 'SBO:0000177'}
reaction {'name': 'React4', 'id': 'React4', 'metaid': '_000020', 'sboTerm': 'SBO:0000177'}
reaction {'name': 'React5', 'id': 'React5', 'metaid': '_000021', 'sboTerm': 'SBO:0000181'}
reaction {'name': 'React6', 'id': 'React6', 'metaid': '_000022', 'sboTerm': 'SBO:0000181'}
reaction {'name': 'React7', 'id': 'React7', 'metaid': '_000023', 'sboTerm': 'SBO:0000177'}
reaction {'name': 'React8', 'id': 'React8', 'metaid': '_000024', 'sboTerm': 'SBO:0000177'}
reaction {'name': 'React9', 'id': 'React9', 'metaid': '_000025', 'sboTerm': 'SBO:0000181'}
reaction {'name': 'React10', 'id': 'React10', 'metaid': '_000026', 'sboTerm': 'SBO:0000181'}
reaction {'name': 'React11', 'id': 'React11', 'metaid': '_000027', 'sboTerm': 'SBO:0000181'}
reaction {'name': 'React12', 'id': 'React12', 'metaid': '_000028', 'sboTerm': 'SBO:0000177'}
reaction {'name': 'React13', 'id': 'React13', 'metaid': '_000029', 'sboTerm': 'SBO:0000177'}
reaction {'name': 'React14', 'id': 'React14', 'metaid': '_000030', 'sboTerm': 'SBO:0000181'}
reaction {'name': 'React15', 'id': 'React15', 'metaid': '_000031', 'sboTerm': 'SBO:0000181'}
reaction {'name': 'React16', 'id': 'React16', 'metaid': '_000032', 'sboTerm': 'SBO:0000181'}
species {'name': 'BasalACh2', 'metaid': '_000003', 'sboTerm': 'SBO:0000297', 'compartment': 'comp1', 'id': 'BLL', 'initialAmount': '0'}
BLL
species {'name': 'IntermediateACh', 'metaid': '_000004', 'sboTerm': 'SBO:0000297', 'compartment': 'comp1', 'id': 'IL', 'initialAmount': '0'}
IL
species {'name': 'ActiveACh', 'metaid': '_000005', 'sboTerm': 'SBO:0000297', 'compartment': 'comp1', 'id': 'AL', 'initialAmount': '0'}
AL
species {'name': 'Active', 'metaid': '_000006', 'sboTerm': 'SBO:0000420', 'compartment': 'comp1', 'id': 'A', 'initialAmount': '0'}
A
species {'name': 'BasalACh', 'metaid': '_000007', 'sboTerm': 'SBO:0000297', 'compartment': 'comp1', 'id': 'BL', 'initialAmount': '0'}
BL
species {'name': 'Basal', 'metaid': '_000008', 'sboTerm': 'SBO:0000420', 'compartment': 'comp1', 'id': 'B', 'initialAmount': '1.66057788110262E-21'}
B
species {'name': 'DesensitisedACh2', 'metaid': '_000009', 'sboTerm': 'SBO:0000297', 'compartment': 'comp1', 'id': 'DLL', 'initialAmount': '0'}
DLL
species {'name': 'Desensitised', 'metaid': '_000010', 'sboTerm': 'SBO:0000420', 'compartment': 'comp1', 'id': 'D', 'initialAmount': '0'}
D
species {'name': 'IntermediateACh2', 'metaid': '_000011', 'sboTerm': 'SBO:0000297', 'compartment': 'comp1', 'id': 'ILL', 'initialAmount': '0'}
ILL
species {'name': 'DesensitisedACh', 'metaid': '_000012', 'sboTerm': 'SBO:0000297', 'compartment': 'comp1', 'id': 'DL', 'initialAmount': '0'}
DL
species {'name': 'Intermediate', 'metaid': '_000013', 'sboTerm': 'SBO:0000420', 'compartment': 'comp1', 'id': 'I', 'initialAmount': '0'}
I
species {'name': 'ActiveACh2', 'metaid': '_000014', 'sboTerm': 'SBO:0000297', 'compartment': 'comp1', 'id': 'ALL', 'initialAmount': '0'}
ALL
[<Element 'reaction' at 0x7f30d7906490>, <Element 'reaction' at 0x7f30d7906f10>, <Element 'reaction' at 0x7f30d78f3950>, <Element 'reaction' at 0x7f30d7838390>, <Element 'reaction' at 0x7f30d7838d90>, <Element 'reaction' at 0x7f30d78447d0>, <Element 'reaction' at 0x7f30d78511d0>, <Element 'reaction' at 0x7f30d7851bd0>, <Element 'reaction' at 0x7f30d78625d0>, <Element 'reaction' at 0x7f30d7862f90>, <Element 'reaction' at 0x7f30d78a7950>, <Element 'reaction' at 0x7f30d7873350>, <Element 'reaction' at 0x7f30d7873d10>, <Element 'reaction' at 0x7f30d787d710>, <Element 'reaction' at 0x7f30d788f110>, <Element 'reaction' at 0x7f30d788f950>, <Element 'reaction' at 0x7f30d40e81d0>]

xmltodict


In [ ]:
import xmltodict
with open('data/curated_sbml.xml','r') as fd:
    doc = xmltodict.parse(fd.read())

Web scraping

This is concerned with automatic information processing from teh Internet.

Task:

  • Create your own web crawlers, to mine the relevant articles from your favorite journals.

BeautifulSoup is loved by hackers. Aside from html it can also parse xml.

Here is a small script that will list all web anchors from Reddit main page (an anchod is a html tag normally used to provide hyperlinks and reference points inside a web page).


In [4]:
from bs4 import BeautifulSoup
import urllib2

redditFile = urllib2.urlopen("http://www.reddit.com")
redditHtml = redditFile.read()
redditFile.close()

soup = BeautifulSoup(redditHtml)
redditAll = soup.find_all("a")
for links in soup.find_all('a'):
    print (links.get('href'))


#content
http://www.reddit.com/r/Allsvenskan/
http://www.reddit.com/r/Art/
http://www.reddit.com/r/AskReddit/
http://www.reddit.com/r/askscience/
http://www.reddit.com/r/aww/
http://www.reddit.com/r/books/
http://www.reddit.com/r/creepy/
http://www.reddit.com/r/dataisbeautiful/
http://www.reddit.com/r/DIY/
http://www.reddit.com/r/Documentaries/
http://www.reddit.com/r/EarthPorn/
http://www.reddit.com/r/europe/
http://www.reddit.com/r/explainlikeimfive/
http://www.reddit.com/r/Fitness/
http://www.reddit.com/r/food/
http://www.reddit.com/r/funny/
http://www.reddit.com/r/Futurology/
http://www.reddit.com/r/gadgets/
http://www.reddit.com/r/gaming/
http://www.reddit.com/r/GetMotivated/
http://www.reddit.com/r/gifs/
http://www.reddit.com/r/history/
http://www.reddit.com/r/IAmA/
http://www.reddit.com/r/InternetIsBeautiful/
http://www.reddit.com/r/intresseklubben/
http://www.reddit.com/r/Jokes/
http://www.reddit.com/r/LifeProTips/
http://www.reddit.com/r/listentothis/
http://www.reddit.com/r/mildlyinteresting/
http://www.reddit.com/r/movies/
http://www.reddit.com/r/Music/
http://www.reddit.com/r/news/
http://www.reddit.com/r/nosleep/
http://www.reddit.com/r/nottheonion/
http://www.reddit.com/r/OldSchoolCool/
http://www.reddit.com/r/personalfinance/
http://www.reddit.com/r/philosophy/
http://www.reddit.com/r/photoshopbattles/
http://www.reddit.com/r/pics/
http://www.reddit.com/r/science/
http://www.reddit.com/r/Showerthoughts/
http://www.reddit.com/r/space/
http://www.reddit.com/r/spop/
http://www.reddit.com/r/sports/
http://www.reddit.com/r/svenskpolitik/
http://www.reddit.com/r/SWARJE/
http://www.reddit.com/r/sweden/
http://www.reddit.com/r/swedishproblems/
http://www.reddit.com/r/television/
http://www.reddit.com/r/tifu/
http://www.reddit.com/r/todayilearned/
http://www.reddit.com/r/TwoXChromosomes/
http://www.reddit.com/r/UpliftingNews/
http://www.reddit.com/r/videos/
http://www.reddit.com/r/worldnews/
http://www.reddit.com/r/WritingPrompts/
http://www.reddit.com/subreddits/
http://www.reddit.com/
http://www.reddit.com/r/all
http://www.reddit.com/r/random/
http://www.reddit.com/r/gadgets/
http://www.reddit.com/r/sports/
http://www.reddit.com/r/gaming/
http://www.reddit.com/r/pics/
http://www.reddit.com/r/worldnews/
http://www.reddit.com/r/videos/
http://www.reddit.com/r/AskReddit/
http://www.reddit.com/r/aww/
http://www.reddit.com/r/Music/
http://www.reddit.com/r/funny/
http://www.reddit.com/r/news/
http://www.reddit.com/r/movies/
http://www.reddit.com/r/books/
http://www.reddit.com/r/europe/
http://www.reddit.com/r/history/
http://www.reddit.com/r/food/
http://www.reddit.com/r/philosophy/
http://www.reddit.com/r/television/
http://www.reddit.com/r/Jokes/
http://www.reddit.com/r/Art/
http://www.reddit.com/r/DIY/
http://www.reddit.com/r/space/
http://www.reddit.com/r/Documentaries/
http://www.reddit.com/r/Fitness/
http://www.reddit.com/r/askscience/
http://www.reddit.com/r/nottheonion/
http://www.reddit.com/r/sweden/
http://www.reddit.com/r/todayilearned/
http://www.reddit.com/r/personalfinance/
http://www.reddit.com/r/gifs/
http://www.reddit.com/r/listentothis/
http://www.reddit.com/r/IAmA/
http://www.reddit.com/r/TwoXChromosomes/
http://www.reddit.com/r/creepy/
http://www.reddit.com/r/nosleep/
http://www.reddit.com/r/GetMotivated/
http://www.reddit.com/r/WritingPrompts/
http://www.reddit.com/r/LifeProTips/
http://www.reddit.com/r/EarthPorn/
http://www.reddit.com/r/explainlikeimfive/
http://www.reddit.com/r/Showerthoughts/
http://www.reddit.com/r/Futurology/
http://www.reddit.com/r/photoshopbattles/
http://www.reddit.com/r/mildlyinteresting/
http://www.reddit.com/r/dataisbeautiful/
http://www.reddit.com/r/Allsvenskan/
http://www.reddit.com/r/tifu/
http://www.reddit.com/r/svenskpolitik/
http://www.reddit.com/r/OldSchoolCool/
http://www.reddit.com/r/UpliftingNews/
http://www.reddit.com/r/spop/
http://www.reddit.com/r/InternetIsBeautiful/
http://www.reddit.com/r/swedishproblems/
http://www.reddit.com/r/SWARJE/
http://www.reddit.com/r/intresseklubben/
http://www.reddit.com/r/science/
http://www.reddit.com/subreddits/
/
http://www.reddit.com/
http://www.reddit.com/new/
http://www.reddit.com/rising/
http://www.reddit.com/controversial/
http://www.reddit.com/top/
http://www.reddit.com/gilded/
http://www.reddit.com/wiki/
http://www.reddit.com/ads/
https://www.reddit.com/login
javascript:void(0)
http://www.reddit.com/wiki/search
http://www.reddit.com/wiki/search
/password
http://www.reddit.com/submit
http://www.reddit.com/submit?selftext=true
/gold?goldtype=code&source=progressbar
/gold/about
/r/goldbenefits
http://en.wikipedia.org/wiki/Pacific_Time_Zone
None
/newsletter
#
javascript:void(0)
javascript:void(0)
https://m.youtube.com/watch?v=O_4OfD-wmGs
/domain/m.youtube.com/
http://www.reddit.com/user/Mr_Self__Destruct
http://www.reddit.com/r/Music/
http://www.reddit.com/r/Music/comments/33yp3x/metallica_sad_but_true_heavy_metal/
#
#
/r/Fitness/comments/33yoo0/lifting_and_ape_index/
/r/Fitness/
http://www.reddit.com/user/WeeLittleShenanigans
http://www.reddit.com/r/Fitness/
http://www.reddit.com/r/Fitness/comments/33yoo0/lifting_and_ape_index/
#
#
/r/AskReddit/comments/33ypco/whats_a_story_your_grandparents_told_you_that/
/r/AskReddit/
http://www.reddit.com/user/Vilokthoria
http://www.reddit.com/r/AskReddit/
http://www.reddit.com/r/AskReddit/comments/33ypco/whats_a_story_your_grandparents_told_you_that/
#
#
/r/Music/comments/33yp43/bored_dj_at_a_strip_club/
/r/Music/
http://www.reddit.com/user/xcancerificx
http://www.reddit.com/r/Music/
http://www.reddit.com/r/Music/comments/33yp43/bored_dj_at_a_strip_club/
#
#
http://imgur.com/8PE2dQ1&jfOv5Mt#1
http://imgur.com/8PE2dQ1&jfOv5Mt#1
/domain/imgur.com/
http://www.reddit.com/user/Insertfuckgivenhere
http://www.reddit.com/r/gaming/
http://www.reddit.com/r/gaming/comments/33yp58/paca_plus_aka_my_new_favorite_game_xpost_from/
#
#
http://i.imgur.com/vjNm4Ai.png?1
http://i.imgur.com/vjNm4Ai.png?1
/domain/i.imgur.com/
http://www.reddit.com/user/KlassyBoy
http://www.reddit.com/r/sweden/
http://www.reddit.com/r/sweden/comments/33ymj4/en_guide_till_tf2_ft_uisterpuck_ps_fler_av_er/
#
#
http://gfycat.com/SlushyShoddyCassowary
http://gfycat.com/SlushyShoddyCassowary
/domain/gfycat.com/
http://www.reddit.com/user/Moodh
http://www.reddit.com/r/Allsvenskan/
http://www.reddit.com/r/Allsvenskan/comments/33yivo/martin_ericsson_rullar_in_segermålet_för_häcken/
#
#
/r/Jokes/comments/33yoo8/life_is_like_a_box_of_chocolates/
/r/Jokes/
http://www.reddit.com/user/LeTruth
http://www.reddit.com/r/Jokes/
http://www.reddit.com/r/Jokes/comments/33yoo8/life_is_like_a_box_of_chocolates/
#
#
http://i.imgur.com/TNQnQRL.gif
http://i.imgur.com/TNQnQRL.gif
/domain/i.imgur.com/
http://www.reddit.com/user/amlb146
http://www.reddit.com/r/funny/
http://www.reddit.com/r/funny/comments/33ypat/me_while_flirting/
#
#
/r/AskReddit/comments/33ypch/if_every_time_you_looked_at_someone_they_would/
/r/AskReddit/
http://www.reddit.com/user/Allegretta
http://www.reddit.com/r/AskReddit/
http://www.reddit.com/r/AskReddit/comments/33ypch/if_every_time_you_looked_at_someone_they_would/
#
#
/r/random
http://www.reddit.com/wiki/selfserve
http://www.reddit.com/advertising
http://www.reddit.com/subreddits/
/r/FluidMechanics
/r/ImaginaryWesteros
/r/perfect_response
/r/turtlesonalligators
/r/AppleWatch
/r/trendingsubreddits/comments/33wi5r/trending_subreddits_for_20150426_rfluidmechanics/
https://www.youtube.com/watch?v=_JC_wIWUC2U
https://www.youtube.com/watch?v=_JC_wIWUC2U
/domain/youtube.com/
http://www.reddit.com/user/AChimimunima
http://www.reddit.com/r/videos/
http://www.reddit.com/r/videos/comments/33xtdn/hit_by_avalanche_in_everest_basecamp_25042015/
#
#
http://i.imgur.com/ZEroinM.jpg
http://i.imgur.com/ZEroinM.jpg
/domain/i.imgur.com/
http://www.reddit.com/user/Valens
http://www.reddit.com/r/funny/
http://www.reddit.com/r/funny/comments/33xoys/russian_closet/
#
#
http://i.imgur.com/yDlFd2q.gif
http://i.imgur.com/yDlFd2q.gif
/domain/i.imgur.com/
http://www.reddit.com/user/MrVanillacoke
http://www.reddit.com/r/gifs/
http://www.reddit.com/r/gifs/comments/33xpma/the_alternate_lord_of_the_rings_plot/
#
#
http://static.boredpanda.com/blog/wp-content/uploads/2015/04/night-sky-stars-milky-way-photography-36__880.jpg
http://static.boredpanda.com/blog/wp-content/uploads/2015/04/night-sky-stars-milky-way-photography-36__880.jpg
/domain/static.boredpanda.com/
http://www.reddit.com/user/BlackShadowRose
http://www.reddit.com/r/pics/
http://www.reddit.com/r/pics/comments/33xkij/starry_night_sky_over_clam_waters/
#
#
https://i.imgur.com/8eE1wok.gifv
https://i.imgur.com/8eE1wok.gifv
/domain/i.imgur.com/
http://www.reddit.com/user/Isai76
http://www.reddit.com/r/gaming/
http://www.reddit.com/r/gaming/comments/33xcy7/close_call/
#
#
http://www.sentinelsource.com/features/technology/japan-plans-to-land-rover-on-moon-in/article_aec6919e-ef32-5b61-8a55-ca7ccacefafd.html
/domain/sentinelsource.com/
http://www.reddit.com/user/Stewpid
http://www.reddit.com/r/worldnews/
http://www.reddit.com/r/worldnews/comments/33x7iz/japan_plans_to_land_rover_on_moon_in_2018/
#
#
http://stuffdutchpeoplelike.com/2011/07/26/dutch-swears-with-diseases/
http://stuffdutchpeoplelike.com/2011/07/26/dutch-swears-with-diseases/
/domain/stuffdutchpeoplelike.com/
http://www.reddit.com/user/sacara
http://www.reddit.com/r/todayilearned/
http://www.reddit.com/r/todayilearned/comments/33x68w/til_in_the_netherlands_people_swear_with_diseases/
#
#
http://i.imgur.com/tMfLvmF.jpg
http://i.imgur.com/tMfLvmF.jpg
/domain/i.imgur.com/
http://www.reddit.com/user/Valens
http://www.reddit.com/r/aww/
http://www.reddit.com/r/aww/comments/33x53z/the_iron_throne/
#
#
http://i.imgur.com/TGrAQNl.jpg
http://i.imgur.com/TGrAQNl.jpg
/domain/i.imgur.com/
http://www.reddit.com/user/Austere1
http://www.reddit.com/r/mildlyinteresting/
http://www.reddit.com/r/mildlyinteresting/comments/33xadn/this_breakfast_menu_uses_the_archer_font/
#
#
http://www.seattletimes.com/seattle-news/anonymous-donor-pays-off-landslide-victims-360k-mortgage/
http://www.seattletimes.com/seattle-news/anonymous-donor-pays-off-landslide-victims-360k-mortgage/
/domain/seattletimes.com/
http://www.reddit.com/user/NO__CAPE
http://www.reddit.com/r/UpliftingNews/
http://www.reddit.com/r/UpliftingNews/comments/33xsn2/anonymous_donor_pays_off_360000_mortgage_for_man/
#
#
http://imgur.com/a/KBFD2
/domain/imgur.com/
http://www.reddit.com/user/JacksonCash
http://www.reddit.com/r/DIY/
http://www.reddit.com/r/DIY/comments/33x371/made_my_hallway_floor_look_less_awful/
#
#
/r/AskReddit/comments/33wv9k/who_are_some_onehit_wonders_in_fields_other_than/
/r/AskReddit/
http://www.reddit.com/user/POCKALEELEE
http://www.reddit.com/r/AskReddit/
http://www.reddit.com/r/AskReddit/comments/33wv9k/who_are_some_onehit_wonders_in_fields_other_than/
#
#
http://www.rollingstone.com/movies/news/super-troopers-2-officially-a-go-after-crowdfunding-4-4-million-20150425#ixzz3YM1O5kDW
http://www.rollingstone.com/movies/news/super-troopers-2-officially-a-go-after-crowdfunding-4-4-million-20150425#ixzz3YM1O5kDW
/domain/rollingstone.com/
http://www.reddit.com/user/annekar
http://www.reddit.com/r/movies/
http://www.reddit.com/r/movies/comments/33wuxz/super_troopers_2_officially_a_go_after/
#
#
http://www.post-gazette.com/news/state/2015/04/26/Study-Majority-of-Americans-prefer-gun-rights-over-expanded-gun-control/stories/201504260131
/domain/post-gazette.com/
http://www.reddit.com/user/ercax
http://www.reddit.com/r/news/
http://www.reddit.com/r/news/comments/33wy9p/study_majority_of_americans_prefer_gun_rights/
#
#
http://jakarta.coconuts.co/2015/04/17/there-are-vampire-squirrels-giant-fluffy-tails-stalking-jungles-borneo
http://jakarta.coconuts.co/2015/04/17/there-are-vampire-squirrels-giant-fluffy-tails-stalking-jungles-borneo
/domain/jakarta.coconuts.co/
http://www.reddit.com/user/BirdPersonJr
http://www.reddit.com/r/nottheonion/
http://www.reddit.com/r/nottheonion/comments/33xifo/scientists_prove_vampire_squirrels_of_borneo_have/
#
#
http://i.imgur.com/MH0BkpR.jpg
http://i.imgur.com/MH0BkpR.jpg
/domain/i.imgur.com/
http://www.reddit.com/user/fourlanterns
http://www.reddit.com/r/photoshopbattles/
http://www.reddit.com/r/photoshopbattles/comments/33xwsa/psbattle_this_baby_being_held_by_a_horseheaded_man/
#
#
/r/Showerthoughts/comments/33wx1b/when_i_was_a_kid_people_got_really_pissed_if_you/
/r/Showerthoughts/
http://www.reddit.com/user/fatalis_vox
http://www.reddit.com/r/Showerthoughts/
http://www.reddit.com/r/Showerthoughts/comments/33wx1b/when_i_was_a_kid_people_got_really_pissed_if_you/
#
#
/r/IAmA/comments/33wtmk/iama_92_year_old_woman_from_stuttgart_germany_and/
/r/IAmA/comments/33wtmk/iama_92_year_old_woman_from_stuttgart_germany_and/
/r/IAmA/
http://www.reddit.com/user/Jasoni92
http://www.reddit.com/r/IAmA/
http://www.reddit.com/r/IAmA/comments/33wtmk/iama_92_year_old_woman_from_stuttgart_germany_and/
#
#
http://i.imgur.com/dldG7pQ.gifv
http://i.imgur.com/dldG7pQ.gifv
/domain/i.imgur.com/
http://www.reddit.com/user/Isai76
http://www.reddit.com/r/sports/
http://www.reddit.com/r/sports/comments/33ws83/high_school_baseball_player_catches_a_curve_ball/
#
#
http://i.imgur.com/ZO9M6Dr.jpg
http://i.imgur.com/ZO9M6Dr.jpg
/domain/i.imgur.com/
http://www.reddit.com/user/Prooffread3r
http://www.reddit.com/r/OldSchoolCool/
http://www.reddit.com/r/OldSchoolCool/comments/33x2zw/mata_hari_1910s/
#
#
http://www.futurism.co/wp-content/uploads/2015/04/Science_Apr-26th_2015.jpg
http://www.futurism.co/wp-content/uploads/2015/04/Science_Apr-26th_2015.jpg
/domain/futurism.co/
http://www.reddit.com/user/Portis403
http://www.reddit.com/r/Futurology/
http://www.reddit.com/r/Futurology/comments/33xd0d/this_week_in_science_genetically_modifying_human/
#
#
/r/LifeProTips/comments/33x2oi/lpt_when_using_google_or_apple_maps_tapping_the/
/r/LifeProTips/comments/33x2oi/lpt_when_using_google_or_apple_maps_tapping_the/
/r/LifeProTips/
http://www.reddit.com/user/victorykings
http://www.reddit.com/r/LifeProTips/
http://www.reddit.com/r/LifeProTips/comments/33x2oi/lpt_when_using_google_or_apple_maps_tapping_the/
#
#
/r/askscience/comments/33xuxu/if_sound_could_travel_through_space_how_loud/
/r/askscience/
http://www.reddit.com/user/ImTheConan
http://www.reddit.com/r/askscience/
http://www.reddit.com/r/askscience/comments/33xuxu/if_sound_could_travel_through_space_how_loud/
#
#
http://i.imgur.com/Z7HpG0q.jpg
http://i.imgur.com/Z7HpG0q.jpg
/domain/i.imgur.com/
http://www.reddit.com/user/DarthButane
http://www.reddit.com/r/EarthPorn/
http://www.reddit.com/r/EarthPorn/comments/33wmj9/a_different_kind_of_beauty_twilight_on_the_tundra/
#
#
http://i.imgur.com/OtIMjqf.png
http://i.imgur.com/OtIMjqf.png
/domain/i.imgur.com/
http://www.reddit.com/user/termderd
http://www.reddit.com/r/space/
http://www.reddit.com/r/space/comments/33wgqf/i_took_my_nephews_to_kennedy_space_center_in/
#
#
http://www.reddit.com/?count=25&after=t3_33wgqf
http://www.reddit.com/r/random
http://www.reddit.com/blog/
http://www.reddit.com/about/
http://www.reddit.com/about/team/
http://www.reddit.com/code/
http://www.reddit.com/advertising/
http://www.reddit.com/jobs/
http://www.reddit.com/rules/
http://www.reddit.com/wiki/faq/
http://www.reddit.com/wiki/
http://www.reddit.com/wiki/reddiquette/
http://www.reddit.com/wiki/transparency/
http://www.reddit.com/contact/
http://alienblue.org
http://redditama.reddit.com/
http://i.reddit.com
http://www.reddit.com/buttons/
http://www.reddit.com/gold/about/
http://redditmarket.com
http://redditgifts.com
http://reddit.tv
http://radioreddit.com
http://www.reddit.com/help/useragreement
http://www.reddit.com/help/privacypolicy

In [ ]: