Spark Context

Let's start by creating a SparkContext, the entry point to a Spark application. The parameter 'local[*]' means that we run a local Spark cluster using all of the machine's cores. Next we check that everything is working fine.


In [4]:
import pyspark
sc = pyspark.SparkContext('local[*]')

# do something to prove it works
rdd = sc.parallelize(range(1000))
rdd.takeSample(False, 5)


Out[4]:
[937, 982, 727, 184, 404]

RDD

An RDD is just a collection (like a list), but distributed. You can create one using the sc.parallelize method, which consumes a normal Python collection.


In [5]:
rdd = sc.parallelize(range(100000))

In [6]:
rdd


Out[6]:
PythonRDD[4] at RDD at PythonRDD.scala:43

You can convert it back to a Python list using the collect method:


In [7]:
rdd.collect()


Out[7]:
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, ..., 996, 997, 998, 999, ...]

RDD operations

The main operations on an RDD are transformations:

  • map - applies a function to every element of the collection;
  • filter - keeps only the elements that satisfy a predicate;
  • flatMap - applies a function that turns each element into a collection and flattens the results;

... and actions (a.k.a. aggregations):

  • collect - converts the RDD to a list;
  • count - counts the number of elements in the RDD;
  • take(n) - takes the first n elements of the RDD and returns them as a list;
  • takeSample(withReplacement, n) - takes a sample of n elements from the RDD;
  • reduce(function) - reduces the collection using function;
  • aggregate - aggregates the elements of an RDD.

See http://spark.apache.org/docs/1.6.0/api/python/pyspark.html#pyspark.RDD for more information and other useful functions.

Remember: Transformations are lazy and actions are eager.
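
A map on its own returns immediately without touching the data; only an action forces Spark to run the computation. A minimal illustration using the calls above:


In [ ]:
# building the transformation is instant - nothing is computed yet
doubled = rdd.map(lambda x: x * 2)

# the action triggers the actual computation over all partitions
doubled.count()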

Task 1 Fill in the ... below to avoid the AssertionError:


In [13]:
assert rdd.map(lambda x: x * 13 % 33).take(34)[-1] == ...


---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-13-031785e25a44> in <module>()
----> 1 assert rdd.map(lambda x: x * 13 % 33).take(34)[-1] == ...

AssertionError: 
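
If you want to check your reasoning: take(34) returns the first 34 elements of rdd, i.e. 0 through 33, so the last one is 33, and 33 * 13 % 33 == 0. One possible way to fill it in:


In [ ]:
# the 34th element of rdd is 33, and (33 * 13) % 33 == 0
assert rdd.map(lambda x: x * 13 % 33).take(34)[-1] == 0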

Task 2 Write count using reduce:


In [17]:
def fun(x, y):
    return "something"

assert rdd.count() == rdd.reduce(fun)


---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-17-377cffa32447> in <module>()
      2     return "something"
      3 
----> 4 assert rdd.count() == rdd.reduce(fun)

AssertionError: 

Task 3 Write sum using aggregate (fill in the ...):


In [ ]:
assert rdd.sum() == rdd.aggregate(0, lambda x, y: ..., lambda x, y: ...)
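
One possible way to fill it in (a sketch: aggregate takes a zero value, a function that folds an element into a partition's accumulator, and a function that merges two accumulators):


In [ ]:
# start from 0, add each element within a partition, then add up the partial sums
assert rdd.sum() == rdd.aggregate(0, lambda acc, x: acc + x, lambda a, b: a + b)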

Now let's create rdd2, an RDD of remainders of squares and cubes modulo 8609:


In [20]:
rdd2 = rdd.flatMap(lambda x: [x**2 % 8609,  x**3 % 8609])

In [23]:
rdd2.take(10)


Out[23]:
[0, 0, 1, 1, 4, 8, 9, 27, 16, 64]

Task 4 Get the biggest value from rdd2 (use reduce):


In [ ]:
assert rdd2.max() == rdd2.reduce(lambda x, y: ...)
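
One possible way to fill in the reduce: keep the larger of the two arguments at every step.


In [ ]:
# reduce keeps folding pairs of values, so always returning the larger one yields the maximum
assert rdd2.max() == rdd2.reduce(lambda x, y: x if x > y else y)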

Task 5 Get the second biggest value from rdd2 (use reduce or aggregate).


In [ ]:
rdd2.aggregate(..., ..., ...)
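
One possible sketch: carry a (largest, second largest) pair through aggregate. Note that this counts repeated values separately; call distinct() on rdd2 first if you want the second largest distinct value.


In [ ]:
# the accumulator is a (largest, second largest) pair; (-1, -1) works as a zero value
# because all remainders in rdd2 are non-negative
def top2_seq(acc, x):
    a, b = acc
    if x > a:
        return (x, a)
    elif x > b:
        return (a, x)
    return acc

def top2_comb(acc1, acc2):
    # merge two (largest, second largest) pairs by keeping the top two of the four values
    merged = sorted(acc1 + acc2, reverse=True)
    return (merged[0], merged[1])

rdd2.aggregate((-1, -1), top2_seq, top2_comb)[1]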

Pair RDD

In many cases it is convenient to work with an RDD that consists of key-value pairs. We can get one by grouping the different remainders in rdd2.


In [34]:
remainders = rdd2.groupBy(lambda x: x)
remainders.take(10)


Out[34]:
[(0, <pyspark.resultiterable.ResultIterable at 0x7f48f3df2940>),
 (8192, <pyspark.resultiterable.ResultIterable at 0x7f48f3df2d30>),
 (2, <pyspark.resultiterable.ResultIterable at 0x7f48f3df26a0>),
 (4, <pyspark.resultiterable.ResultIterable at 0x7f48f3df2860>),
 (6, <pyspark.resultiterable.ResultIterable at 0x7f48f3df2320>),
 (8, <pyspark.resultiterable.ResultIterable at 0x7f48f3df2748>),
 (10, <pyspark.resultiterable.ResultIterable at 0x7f48f3df2668>),
 (12, <pyspark.resultiterable.ResultIterable at 0x7f48f3df26d8>),
 (8194, <pyspark.resultiterable.ResultIterable at 0x7f48f3df2400>),
 (14, <pyspark.resultiterable.ResultIterable at 0x7f48f3df2710>)]

We only want the length of each ResultIterable, so we map over the values of the key-value pairs:


In [35]:
remainders_counts = remainders.mapValues(lambda x: len(x))
remainders_counts.take(10)


Out[35]:
[(0, 24),
 (8192, 35),
 (2, 34),
 (4, 34),
 (6, 11),
 (8, 35),
 (10, 36),
 (12, 12),
 (8194, 12),
 (14, 12)]

Let's sort it by count:


In [37]:
remainders_counts.sortBy(lambda x: x[1], ascending=False).take(15)


Out[37]:
[(10, 36),
 (32, 36),
 (80, 36),
 (90, 36),
 (98, 36),
 (104, 36),
 (114, 36),
 (186, 36),
 (188, 36),
 (200, 36),
 (264, 36),
 (266, 36),
 (8240, 36),
 (358, 36),
 (394, 36)]

Task 6 Compute the counts of repetitions of remainders: how many remainders occurred 36 times, how many occurred 35 times, and so on...


In [ ]:

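One possible sketch: key each (remainder, count) pair by its count and add up how many remainders share it.


In [ ]:
# map every (remainder, count) pair to (count, 1) and sum per count
count_of_counts = remainders_counts.map(lambda kv: (kv[1], 1)).reduceByKey(lambda a, b: a + b)
count_of_counts.sortByKey(ascending=False).take(10)
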
Now you can move on to a real dataset!