Web Scraping using PySpark (with 3 nodes) and BeautifulSoup

PYTHON CODE:

from time import time
from bs4 import BeautifulSoup
from urllib.request import urlopen
from pyspark import SparkContext

sc = SparkContext()

start_time = time()

# HDFS path of the CSV file holding the URLs to scrape, one URL per line.
url_list_path = '/my_hdfs/links.csv'

urls_lines = sc.textFile(url_list_path)

def processRecord(url):
    # Runs on the executors: fetch the page and return [url, prettified HTML].
    if len(url) > 0:
        print(url)  # Appears in the executor logs, not on the driver console.
        page = urlopen(url)
        soup = BeautifulSoup(page, features="lxml")  # lxml must be installed on every worker.
        rtnVal = soup.prettify()
    else:
        url = "NA"
        rtnVal = "NA"
    return [url, rtnVal]

temp = urls_lines.map(processRecord)

# collect() pulls every [url, html] pair back to the driver.
temp_rdd = temp.collect()

for elem in temp_rdd:
    print(elem)

print("Time taken: " + str(time() - start_time))

--- --- --- --- ---
		
Checking the connectivity of machines on the cluster:

(base) administrator@master:~/Desktop$ ping slave1
  PING slave1 (192.168.1.3) 56(84) bytes of data.
  64 bytes from slave1 (192.168.1.3): icmp_seq=1 ttl=64 time=0.307 ms
  64 bytes from slave1 (192.168.1.3): icmp_seq=2 ttl=64 time=0.208 ms
  64 bytes from slave1 (192.168.1.3): icmp_seq=3 ttl=64 time=0.181 ms
  ^C
  --- slave1 ping statistics ---
  8 packets transmitted, 8 received, 0% packet loss, time 7140ms
  rtt min/avg/max/mdev = 0.181/0.216/0.307/0.035 ms, pipe 4 

(base) administrator@master:~/Desktop$ ping slave2
  PING slave2 (192.168.1.4) 56(84) bytes of data.
  From master (192.168.1.12) icmp_seq=1 Destination Host Unreachable
  From master (192.168.1.12) icmp_seq=2 Destination Host Unreachable
  From master (192.168.1.12) icmp_seq=3 Destination Host Unreachable
  --- slave2 ping statistics ---
  498 packets transmitted, 106 received, +381 errors, 78.7149% packet loss, time 508825ms
  rtt min/avg/max/mdev = 0.126/22.646/1701.778/176.475 ms, pipe 4 
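
The hostnames master, slave1 and slave2 resolve because each machine carries matching name-to-IP entries; a minimal sketch of such an /etc/hosts file, assuming the addresses shown in the ping output above:

  192.168.1.12  master
  192.168.1.3   slave1
  192.168.1.4   slave2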

# # #

STARTING AND STOPPING THE HADOOP CLUSTER

(base) administrator@master:~/Desktop$ cd $HADOOP_HOME
(base) administrator@master:/usr/local/hadoop$ ls
bin  data  etc  include  lib  libexec  LICENSE.txt  logs  NOTICE.txt  README.txt  sbin  share  tmp

(base) administrator@master:/usr/local/hadoop$ cd sbin

(base) administrator@master:/usr/local/hadoop/sbin$ ls
distribute-exclude.sh  httpfs.sh                start-all.cmd      start-dfs.sh         stop-all.cmd      stop-dfs.sh         workers.sh
FederationStateStore   kms.sh                   start-all.sh       start-secure-dns.sh  stop-all.sh       stop-secure-dns.sh  yarn-daemon.sh
hadoop-daemon.sh       mr-jobhistory-daemon.sh  start-balancer.sh  start-yarn.cmd       stop-balancer.sh  stop-yarn.cmd       yarn-daemons.sh
hadoop-daemons.sh      refresh-namenodes.sh     start-dfs.cmd      start-yarn.sh        stop-dfs.cmd      stop-yarn.sh 
(base) administrator@master:/usr/local/hadoop/sbin$ stop-all.sh
WARNING: Stopping all Apache Hadoop daemons as administrator in 10 seconds.
WARNING: Use CTRL-C to abort.
Stopping namenodes on [master]
Stopping datanodes
Stopping secondary namenodes [master]
Stopping nodemanagers
Stopping resourcemanager
(base) administrator@master:/usr/local/hadoop/sbin$ start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as administrator in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [master]
Starting datanodes
Starting secondary namenodes [master]
Starting resourcemanager
Starting nodemanagers
(base) administrator@master:/usr/local/hadoop/sbin$ jps
1521 SecondaryNameNode
1284 NameNode
2232 Jps
1913 ResourceManager
(base) administrator@master:/usr/local/hadoop/sbin$ 

# # #

OPEN A NEW TERMINAL FOR SLAVE1

(base) administrator@master:/usr/local/hadoop/sbin$ ssh slave1
Welcome to Ubuntu 19.10 (GNU/Linux 5.3.0-40-generic x86_64)

Last login: Tue Oct 15 18:12:50 2019 from 192.168.1.12
(base) administrator@slave1:~$ jps
2752 Jps
2561 NodeManager
2395 DataNode

# # #

SLAVE2

(base) administrator@master:/usr/local/hadoop/sbin$ ssh slave2
Welcome to Ubuntu 19.10 (GNU/Linux 5.3.0-40-generic x86_64)

Last login: Tue Oct 15 18:13:19 2019 from 192.168.1.12
(base) administrator@slave2:~$ jps
24675 DataNode
24810 NodeManager
25038 Jps
(base) administrator@slave2:~$ 
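
With the daemons up on all three machines, one quick aggregate check (a standard HDFS command, not part of the original session) confirms from the master that both DataNodes have registered:

(base) administrator@master:~$ hdfs dfsadmin -report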

# # #

STARTING SPARK

(Note: the start-all.sh invoked below resolves to Hadoop's script on the PATH, which is why the output talks about Apache Hadoop daemons and reports that they are already running. Since the job is submitted with --master yarn, Spark's standalone master and workers are not needed.)

(base) administrator@master:/usr/local/hadoop/sbin$ cd /usr/local/spark/sbin
(base) administrator@master:/usr/local/spark/sbin$ start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as administrator in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [master]
master: namenode is running as process 1284.  Stop it first.
Starting datanodes
slave2: datanode is running as process 24675.  Stop it first.
slave1: datanode is running as process 2395.  Stop it first.
Starting secondary namenodes [master]
master: secondarynamenode is running as process 1521.  Stop it first.
Starting resourcemanager
resourcemanager is running as process 1913.  Stop it first.
Starting nodemanagers
slave2: nodemanager is running as process 24810.  Stop it first.
slave1: nodemanager is running as process 2561.  Stop it first.

# # #

RUNNING THE PI CALCULATION PROGRAM

(base) administrator@master:/usr/local/spark/sbin$ ../bin/spark-submit --master yarn ../examples/src/main/python/pi.py 100

LOGS FOR ISSUE WHEN PYTHON IS NOT FOUND:

2020-03-18 13:05:12,467 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, slave2, executor 2): java.io.IOException: Cannot run program "python": error=2, No such file or directory
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
	...
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: error=2, No such file or directory
	at java.lang.UNIXProcess.forkAndExec(Native Method)
	... 19 more

FIX:
PYSPARK_PYTHON=/home/administrator/anaconda3/bin/python ../bin/spark-submit --master yarn ../examples/src/main/python/pi.py 100
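
Rather than prefixing every submission, the interpreter can also be fixed once per node. A minimal sketch, assuming Anaconda sits at the same path on the master and both slaves (add to $SPARK_HOME/conf/spark-env.sh on each machine):

# conf/spark-env.sh (sketch): point PySpark at the Anaconda interpreter on every node.
export PYSPARK_PYTHON=/home/administrator/anaconda3/bin/python
export PYSPARK_DRIVER_PYTHON=/home/administrator/anaconda3/bin/python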

...

LOGS:
(base) administrator@master:/usr/local/spark/sbin$ PYSPARK_PYTHON=/home/administrator/anaconda3/bin/python ../bin/spark-submit --master yarn ../examples/src/main/python/pi.py 100
2020-03-18 13:11:01,025 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-03-18 13:11:01,590 INFO spark.SparkContext: Running Spark version 2.4.4
2020-03-18 13:11:01,610 INFO spark.SparkContext: Submitted application: PythonPi
2020-03-18 13:11:01,654 INFO spark.SecurityManager: Changing view acls to: administrator
2020-03-18 13:11:01,655 INFO spark.SecurityManager: Changing modify acls to: administrator
2020-03-18 13:11:01,655 INFO spark.SecurityManager: Changing view acls groups to: 
2020-03-18 13:11:01,655 INFO spark.SecurityManager: Changing modify acls groups to: 
2020-03-18 13:11:01,655 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(administrator); groups with view permissions: Set(); users  with modify permissions: Set(administrator); groups with modify permissions: Set()
2020-03-18 13:11:01,886 INFO util.Utils: Successfully started service 'sparkDriver' on port 41485.
2020-03-18 13:11:01,905 INFO spark.SparkEnv: Registering MapOutputTracker
2020-03-18 13:11:01,921 INFO spark.SparkEnv: Registering BlockManagerMaster
2020-03-18 13:11:01,924 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2020-03-18 13:11:01,924 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2020-03-18 13:11:01,931 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-ede610d7-77c2-4c7f-8f33-86dbfd7aae35
2020-03-18 13:11:01,945 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
2020-03-18 13:11:02,010 INFO spark.SparkEnv: Registering OutputCommitCoordinator
2020-03-18 13:11:02,079 INFO util.log: Logging initialized @2123ms
2020-03-18 13:11:02,146 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2020-03-18 13:11:02,171 INFO server.Server: Started @2216ms
2020-03-18 13:11:02,197 INFO server.AbstractConnector: Started ServerConnector@742f7a8a{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-03-18 13:11:02,197 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
2020-03-18 13:11:02,231 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79d2f0e3{/jobs,null,AVAILABLE,@Spark}
...
2020-03-18 13:11:02,274 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@60aa8307{/stages/stage/kill,null,AVAILABLE,@Spark}
2020-03-18 13:11:02,277 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
2020-03-18 13:11:03,139 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.1.12:8032
2020-03-18 13:11:03,308 INFO yarn.Client: Requesting a new application from cluster with 2 NodeManagers
2020-03-18 13:11:03,349 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
2020-03-18 13:11:03,350 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
...
2020-03-18 13:11:30,243 INFO impl.YarnClientImpl: Submitted application application_1584516372808_0002
2020-03-18 13:11:30,245 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1584516372808_0002 and attemptId None
2020-03-18 13:11:31,251 INFO yarn.Client: Application report for application_1584516372808_0002 (state: ACCEPTED)
2020-03-18 13:11:31,254 INFO yarn.Client: 
	 client token: N/A
	 diagnostics: AM container is launched, waiting for AM container to Register with RM
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1584517290226
	 final status: UNDEFINED
	 tracking URL: http://master:8088/proxy/application_1584516372808_0002/
	 user: administrator
2020-03-18 13:11:32,261 INFO yarn.Client: Application report for application_1584516372808_0002 (state: ACCEPTED)
...
2020-03-18 13:11:53,345 INFO yarn.Client: Application report for application_1584516372808_0002 (state: ACCEPTED)
2020-03-18 13:11:54,112 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> master, PROXY_URI_BASES -> http://master:8088/proxy/application_1584516372808_0002), /proxy/application_1584516372808_0002
2020-03-18 13:11:54,114 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /jobs, /jobs/json, /jobs/job, /jobs/job/json, /stages, /stages/json, /stages/stage, /stages/stage/json, /stages/pool, /stages/pool/json, /storage, /storage/json, /storage/rdd, /storage/rdd/json, /environment, /environment/json, /executors, /executors/json, /executors/threadDump, /executors/threadDump/json, /static, /, /api, /jobs/job/kill, /stages/stage/kill.
2020-03-18 13:11:54,219 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
2020-03-18 13:11:54,348 INFO yarn.Client: Application report for application_1584516372808_0002 (state: RUNNING)
2020-03-18 13:11:54,348 INFO yarn.Client: 
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: 192.168.1.3
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1584517290226
	 final status: UNDEFINED
	 tracking URL: http://master:8088/proxy/application_1584516372808_0002/
	 user: administrator
2020-03-18 13:11:54,350 INFO cluster.YarnClientSchedulerBackend: Application application_1584516372808_0002 has started running.
2020-03-18 13:11:54,357 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33747.
2020-03-18 13:11:54,358 INFO netty.NettyBlockTransferService: Server created on master:33747
2020-03-18 13:11:54,359 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2020-03-18 13:11:54,381 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, master, 33747, None)
2020-03-18 13:11:54,385 INFO storage.BlockManagerMasterEndpoint: Registering block manager master:33747 with 366.3 MB RAM, BlockManagerId(driver, master, 33747, None)
2020-03-18 13:11:54,421 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, master, 33747, None)
2020-03-18 13:11:54,422 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, master, 33747, None)
2020-03-18 13:11:54,535 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /metrics/json.
2020-03-18 13:11:54,536 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@593eb20{/metrics/json,null,AVAILABLE,@Spark}
2020-03-18 13:11:54,561 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
2020-03-18 13:11:54,894 INFO internal.SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/usr/local/spark/sbin/spark-warehouse').
2020-03-18 13:11:54,894 INFO internal.SharedState: Warehouse path is 'file:/usr/local/spark/sbin/spark-warehouse'.
2020-03-18 13:11:54,901 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /SQL.
2020-03-18 13:11:54,902 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62ba741b{/SQL,null,AVAILABLE,@Spark}
2020-03-18 13:11:54,902 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /SQL/json.
2020-03-18 13:11:54,902 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ca66444{/SQL/json,null,AVAILABLE,@Spark}
2020-03-18 13:11:54,903 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /SQL/execution.
2020-03-18 13:11:54,903 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c462761{/SQL/execution,null,AVAILABLE,@Spark}
2020-03-18 13:11:54,903 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /SQL/execution/json.
2020-03-18 13:11:54,904 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@505e4e78{/SQL/execution/json,null,AVAILABLE,@Spark}
2020-03-18 13:11:54,904 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /static/sql.
2020-03-18 13:11:54,905 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c100bc2{/static/sql,null,AVAILABLE,@Spark}
2020-03-18 13:11:55,487 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
2020-03-18 13:11:55,745 INFO spark.SparkContext: Starting job: reduce at /usr/local/spark/sbin/../examples/src/main/python/pi.py:44
2020-03-18 13:11:55,761 INFO scheduler.DAGScheduler: Got job 0 (reduce at /usr/local/spark/sbin/../examples/src/main/python/pi.py:44) with 100 output partitions
2020-03-18 13:11:55,762 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (reduce at /usr/local/spark/sbin/../examples/src/main/python/pi.py:44)
2020-03-18 13:11:55,763 INFO scheduler.DAGScheduler: Parents of final stage: List()
2020-03-18 13:11:55,764 INFO scheduler.DAGScheduler: Missing parents: List()
2020-03-18 13:11:55,776 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (PythonRDD[1] at reduce at /usr/local/spark/sbin/../examples/src/main/python/pi.py:44), which has no missing parents
2020-03-18 13:11:55,910 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 6.2 KB, free 366.3 MB)
2020-03-18 13:11:55,939 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.2 KB, free 366.3 MB)
2020-03-18 13:11:55,944 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on master:33747 (size: 4.2 KB, free: 366.3 MB)
2020-03-18 13:11:55,948 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
2020-03-18 13:11:55,976 INFO scheduler.DAGScheduler: Submitting 100 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /usr/local/spark/sbin/../examples/src/main/python/pi.py:44) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14))
2020-03-18 13:11:55,977 INFO cluster.YarnScheduler: Adding task set 0.0 with 100 tasks
2020-03-18 13:11:57,222 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.1.3:55594) with ID 1
2020-03-18 13:11:57,246 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, slave1, executor 1, partition 0, PROCESS_LOCAL, 7863 bytes)
2020-03-18 13:11:57,344 INFO storage.BlockManagerMasterEndpoint: Registering block manager slave1:45231 with 366.3 MB RAM, BlockManagerId(1, slave1, 45231, None)
2020-03-18 13:11:57,560 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on slave1:45231 (size: 4.2 KB, free: 366.3 MB)
2020-03-18 13:11:59,464 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.1.4:34664) with ID 2
2020-03-18 13:11:59,466 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, slave2, executor 2, partition 1, PROCESS_LOCAL, 7863 bytes)
2020-03-18 13:11:59,590 INFO storage.BlockManagerMasterEndpoint: Registering block manager slave2:37809 with 366.3 MB RAM, BlockManagerId(2, slave2, 37809, None)
2020-03-18 13:11:59,813 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on slave2:37809 (size: 4.2 KB, free: 366.3 MB)
2020-03-18 13:12:02,080 INFO scheduler.TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, slave2, executor 2, partition 2, PROCESS_LOCAL, 7863 bytes)
2020-03-18 13:12:02,088 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 2622 ms on slave2 (executor 2) (1/100)
...
2020-03-18 13:12:07,817 INFO scheduler.TaskSetManager: Finished task 99.0 in stage 0.0 (TID 99) in 107 ms on slave1 (executor 1) (100/100)
2020-03-18 13:12:07,818 INFO cluster.YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool 
2020-03-18 13:12:07,819 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at /usr/local/spark/sbin/../examples/src/main/python/pi.py:44) finished in 12.004 s
2020-03-18 13:12:07,828 INFO scheduler.DAGScheduler: Job 0 finished: reduce at /usr/local/spark/sbin/../examples/src/main/python/pi.py:44, took 12.082149 s
Pi is roughly 3.141856
2020-03-18 13:12:07,845 INFO server.AbstractConnector: Stopped Spark@742f7a8a{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-03-18 13:12:07,847 INFO ui.SparkUI: Stopped Spark web UI at http://master:4040
2020-03-18 13:12:07,851 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
2020-03-18 13:12:07,874 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
2020-03-18 13:12:07,875 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
2020-03-18 13:12:07,881 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
2020-03-18 13:12:07,882 INFO cluster.YarnClientSchedulerBackend: Stopped
2020-03-18 13:12:07,892 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2020-03-18 13:12:07,911 INFO memory.MemoryStore: MemoryStore cleared
2020-03-18 13:12:07,912 INFO storage.BlockManager: BlockManager stopped
2020-03-18 13:12:07,924 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
2020-03-18 13:12:07,926 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2020-03-18 13:12:07,933 INFO spark.SparkContext: Successfully stopped SparkContext
2020-03-18 13:12:08,883 INFO util.ShutdownHookManager: Shutdown hook called
2020-03-18 13:12:08,884 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b20bb994-38bd-48e1-becb-25d4de1796df
2020-03-18 13:12:08,885 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-911ce211-570f-444e-9034-52163db026c1
2020-03-18 13:12:08,891 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-911ce211-570f-444e-9034-52163db026c1/pyspark-ffb02600-4911-433a-9ac5-92b318f833ac
(base) administrator@master:/usr/local/spark/sbin$ 
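
For reference, the job above is Spark's bundled Monte Carlo Pi example (examples/src/main/python/pi.py), submitted with the argument 100, which becomes the number of partitions (hence the "100 output partitions" in the log). A condensed sketch of that example follows; it is paraphrased and may differ in small details from the exact file shipped with Spark 2.4.4.

import sys
from random import random
from operator import add
from pyspark.sql import SparkSession

if __name__ == "__main__":
    spark = SparkSession.builder.appName("PythonPi").getOrCreate()
    # Number of partitions comes from the command line; 100 was passed above.
    partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
    n = 100000 * partitions

    def inside(_):
        # Throw a random dart at the 2x2 square centered on the origin and
        # report whether it landed inside the unit circle.
        x = random() * 2 - 1
        y = random() * 2 - 1
        return 1 if x ** 2 + y ** 2 <= 1 else 0

    # This reduce corresponds to the "reduce at pi.py:44" stage seen in the log.
    count = spark.sparkContext.parallelize(range(1, n + 1), partitions) \
                              .map(inside) \
                              .reduce(add)
    print("Pi is roughly %f" % (4.0 * count / n))
    spark.stop()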

# # #

CHECKING THE CLUSTER REPORT IN THE BROWSER
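
The YARN ResourceManager web UI runs on the same host and port as the tracking URLs in the logs above, so the cluster report can be opened at http://master:8088/cluster, and a specific run can be opened via its tracking URL, e.g. http://master:8088/proxy/application_1584516372808_0002/ (which proxies to that application's Spark UI). As a cross-check from the shell, the same report can be pulled with the YARN CLI, for example:

yarn application -list -appStates ALL
yarn application -status application_1584516372808_0002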

# # #

COMMAND 1 FOR STANDALONE MODE:
PYSPARK_PYTHON=/home/administrator/anaconda3/bin/python ./bin/spark-submit --master local /home/administrator/Desktop/spark_script_2.py 100

COMMAND 2 FOR YARN CLUSTER MODE:
PYSPARK_PYTHON=/home/administrator/anaconda3/bin/python ./bin/spark-submit --master yarn /home/administrator/Desktop/spark_script_2.py 100

ISSUE IN RUNNING THESE COMMANDS WHEN THE FILE IS IN THE UBUNTU FILE SYSTEM:

2020-03-18 13:33:27,982 INFO spark.SparkContext: Created broadcast 0 from textFile at NativeMethodAccessorImpl.java:0
Traceback (most recent call last):
  File "/home/administrator/Desktop/spark_script_2.py", line 29, in <module>
    temp_rdd = temp.collect()
  File "/usr/local/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 816, in collect
  File "/usr/local/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/usr/local/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://master:9000/home/administrator/Desktop/links.csv

FIX: PUT THE FILE IN HDFS

(base) administrator@master:~/Desktop$ hdfs dfs -copyFromLocal /home/administrator/Desktop/links.csv /my_hdfs
2020-03-18 13:37:35,721 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false

(base) administrator@master:~/Desktop$ hdfs dfs -ls /my_hdfs
Found 3 items
-rw-r--r--   1 administrator supergroup     159358 2019-10-21 15:06 /my_hdfs/Dummy_1000.csv
-rw-r--r--   1 administrator supergroup      48097 2019-10-17 16:26 /my_hdfs/Feed.csv
-rw-r--r--   1 administrator supergroup       7307 2020-03-18 13:37

(base) administrator@master:~/Desktop$ hdfs dfs -rm /my_hdfs/links.csv
Deleted /my_hdfs/links.csv

(base) administrator@master:~/Desktop$ hdfs dfs -copyFromLocal /home/administrator/Desktop/links.csv /my_hdfs
2020-03-18 13:37:35,721 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
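
The contents of spark_script_2.py are not reproduced in this post, so below is a minimal sketch that is consistent with the logs that follow: it reads links.csv from HDFS, maps each URL through a BeautifulSoup-based function on the executors, collects the results (the collect() call the traceback above points at), prints rows of the form ['<url>', 'rtnVal'], and prints "Time taken: ...". The names scrape, links and temp are assumptions, the scraping body is a placeholder that just returns the literal string 'rtnVal' seen in the output, and the trailing 100 argument on the command line appears unused (the logs show 1 and 2 output partitions, i.e. textFile's defaults). requests and bs4 must be installed in the Anaconda environment on every node, since the mapped function runs on the executors. The error above happened because a path without a scheme is resolved against fs.defaultFS (hdfs://master:9000), so Spark looked for /home/administrator/Desktop/links.csv inside HDFS; copying the file into HDFS is the simplest fix (an explicit file:// URI also works, but only if the file exists at that path on every node that runs a task).

# spark_script_2.py -- minimal sketch for illustration, not the author's
# original file; names (scrape, links, temp) are assumptions.
import time

import requests                      # assumed HTTP client on the executors
from bs4 import BeautifulSoup        # BeautifulSoup, as per the post's title
from pyspark import SparkContext


def scrape(url):
    # Runs on the executors. Prints the URL (in local mode this shows up in
    # the driver console, matching the bare URLs in the standalone log) and
    # returns a placeholder string matching the 'rtnVal' values in the output.
    print(url)
    try:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        _ = soup.title.string if soup.title else ""
        return "rtnVal"
    except Exception as e:
        return "error: " + str(e)


if __name__ == "__main__":
    start = time.time()
    sc = SparkContext(appName="spark_script_2")

    # A path without a scheme resolves against fs.defaultFS (hdfs://master:9000),
    # hence the copyFromLocal step above.
    links = sc.textFile("hdfs://master:9000/my_hdfs/links.csv")

    temp = links.map(lambda url: [url, scrape(url)])
    temp_rdd = temp.collect()        # the collect() the traceback points at

    for row in temp_rdd:
        print(row)

    print("Time taken:", time.time() - start)
    sc.stop()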
LOGS FOR STANDALONE MODE:

(base) administrator@master:/usr/local/spark$ PYSPARK_PYTHON=/home/administrator/anaconda3/bin/python ./bin/spark-submit --master local /home/administrator/Desktop/spark_script_2.py 100
2020-03-18 13:44:53,661 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-03-18 13:44:54,318 INFO spark.SparkContext: Running Spark version 2.4.4
2020-03-18 13:44:54,343 INFO spark.SparkContext: Submitted application: spark_script_2.py
...
2020-03-18 13:44:54,642 INFO util.Utils: Successfully started service 'sparkDriver' on port 43343.
...
2020-03-18 13:44:54,884 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
...
2020-03-18 13:44:54,931 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
...
2020-03-18 13:44:55,078 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 36171.
2020-03-18 13:44:55,079 INFO netty.NettyBlockTransferService: Server created on master:36171
...
2020-03-18 13:44:55,968 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 238.2 KB, free 366.1 MB)
2020-03-18 13:44:56,023 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 23.3 KB, free 366.0 MB)
2020-03-18 13:44:56,026 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on master:36171 (size: 23.3 KB, free: 366.3 MB)
...
2020-03-18 13:44:56,546 INFO mapred.FileInputFormat: Total input paths to process : 1
2020-03-18 13:44:56,632 INFO spark.SparkContext: Starting job: collect at /home/administrator/Desktop/spark_script_2.py:30
2020-03-18 13:44:56,653 INFO scheduler.DAGScheduler: Got job 0 (collect at /home/administrator/Desktop/spark_script_2.py:30) with 1 output partitions
2020-03-18 13:44:56,653 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (collect at /home/administrator/Desktop/spark_script_2.py:30)
...
2020-03-18 13:44:56,861 INFO rdd.HadoopRDD: Input split: hdfs://master:9000/my_hdfs/links.csv:0+7353
https://survival8.blogspot.com/p/imas-101-xml-and-java-based-framework.html
https://survival8.blogspot.com/p/blog-page_87.html
https://survival8.blogspot.com/p/hello-world-application-using-spring.html
https://survival8.blogspot.com/p/debugging-spring-mvc-application-in.html
...
https://survival8.blogspot.com/p/windows-cmd-tip-findstr-grep-for-windows.html
https://survival8.blogspot.com/p/windows-cmd-tips-jan-2020.html
2020-03-18 13:47:38,288 INFO python.PythonRunner: Times: total = 161397, boot = 328, init = 161, finish = 160908
2020-03-18 13:47:38,316 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 10352 bytes result sent to driver
2020-03-18 13:47:38,327 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 161562 ms on localhost (executor driver) (1/1)
2020-03-18 13:47:38,330 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
2020-03-18 13:47:38,333 INFO python.PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 41711
2020-03-18 13:47:38,341 INFO scheduler.DAGScheduler: ResultStage 0 (collect at /home/administrator/Desktop/spark_script_2.py:30) finished in 161.631 s
2020-03-18 13:47:38,350 INFO scheduler.DAGScheduler: Job 0 finished: collect at /home/administrator/Desktop/spark_script_2.py:30, took 161.717769 s
['https://survival8.blogspot.com/p/imas-101-xml-and-java-based-framework.html', 'rtnVal']
['https://survival8.blogspot.com/p/blog-page_87.html', 'rtnVal']
['https://survival8.blogspot.com/p/hello-world-application-using-spring.html', 'rtnVal']
...
['https://survival8.blogspot.com/p/windows-cmd-tip-handling-files-with.html', 'rtnVal']
['https://survival8.blogspot.com/p/windows-cmd-tip-findstr-grep-for-windows.html', 'rtnVal']
['https://survival8.blogspot.com/p/windows-cmd-tips-jan-2020.html', 'rtnVal']
Time taken: 162.96060037612915
2020-03-18 13:47:38,417 INFO spark.SparkContext: Invoking stop() from shutdown hook
2020-03-18 13:47:38,466 INFO server.AbstractConnector: Stopped Spark@18d0477{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-03-18 13:47:38,470 INFO spark.ContextCleaner: Cleaned accumulator 14
2020-03-18 13:47:38,471 INFO ui.SparkUI: Stopped Spark web UI at http://master:4040
2020-03-18 13:47:38,479 INFO spark.ContextCleaner: Cleaned accumulator 19
2020-03-18 13:47:38,509 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2020-03-18 13:47:38,520 INFO memory.MemoryStore: MemoryStore cleared
2020-03-18 13:47:38,521 INFO storage.BlockManager: BlockManager stopped
2020-03-18 13:47:38,523 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
2020-03-18 13:47:38,525 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2020-03-18 13:47:38,533 INFO spark.SparkContext: Successfully stopped SparkContext
2020-03-18 13:47:38,533 INFO util.ShutdownHookManager: Shutdown hook called
2020-03-18 13:47:38,534 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-5c98e403-d484-44e1-927d-c78f3dda2e58/pyspark-7ab4a0d0-6e22-4ba8-9f4f-bce9d7e43d68
2020-03-18 13:47:38,540 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-3df1288a-12f7-408f-a2b7-30530ebbbe4b
2020-03-18 13:47:38,549 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-5c98e403-d484-44e1-927d-c78f3dda2e58

LOGS FOR YARN CLUSTER MODE:

(base) administrator@master:/usr/local/spark$ PYSPARK_PYTHON=/home/administrator/anaconda3/bin/python ./bin/spark-submit --master yarn /home/administrator/Desktop/spark_script_2.py 100
2020-03-18 13:48:37,919 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-03-18 13:48:38,627 INFO spark.SparkContext: Running Spark version 2.4.4
2020-03-18 13:48:38,650 INFO spark.SparkContext: Submitted application: spark_script_2.py
...
2020-03-18 13:48:38,949 INFO util.Utils: Successfully started service 'sparkDriver' on port 39709.
...
2020-03-18 13:48:39,256 INFO server.AbstractConnector: Started ServerConnector@3da82867{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-03-18 13:48:39,257 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
...
2020-03-18 13:48:39,329 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
2020-03-18 13:48:40,037 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.1.12:8032
2020-03-18 13:48:40,219 INFO yarn.Client: Requesting a new application from cluster with 2 NodeManagers
...
2020-03-18 13:48:44,651 INFO yarn.Client: Uploading resource file:/tmp/spark-8dd4f42f-7776-4fe9-8da5-58dcffd0df83/__spark_libs__7387525767004520734.zip -> hdfs://master:9000/user/administrator/.sparkStaging/application_1584516372808_0003/__spark_libs__7387525767004520734.zip
2020-03-18 13:49:05,556 INFO yarn.Client: Uploading resource file:/usr/local/spark/python/lib/pyspark.zip -> hdfs://master:9000/user/administrator/.sparkStaging/application_1584516372808_0003/pyspark.zip
2020-03-18 13:49:05,672 INFO yarn.Client: Uploading resource file:/usr/local/spark/python/lib/py4j-0.10.7-src.zip -> hdfs://master:9000/user/administrator/.sparkStaging/application_1584516372808_0003/py4j-0.10.7-src.zip
2020-03-18 13:49:05,839 INFO yarn.Client: Uploading resource file:/tmp/spark-8dd4f42f-7776-4fe9-8da5-58dcffd0df83/__spark_conf__4358898555039276238.zip -> hdfs://master:9000/user/administrator/.sparkStaging/application_1584516372808_0003/__spark_conf__.zip
2020-03-18 13:49:05,952 INFO spark.SecurityManager: Changing view acls to: administrator
2020-03-18 13:49:05,953 INFO spark.SecurityManager: Changing modify acls to: administrator
2020-03-18 13:49:05,953 INFO spark.SecurityManager: Changing view acls groups to:
2020-03-18 13:49:05,953 INFO spark.SecurityManager: Changing modify acls groups to:
2020-03-18 13:49:05,953 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(administrator); groups with view permissions: Set(); users with modify permissions: Set(administrator); groups with modify permissions: Set()
2020-03-18 13:49:06,923 INFO yarn.Client: Submitting application application_1584516372808_0003 to ResourceManager
2020-03-18 13:49:06,961 INFO impl.YarnClientImpl: Submitted application application_1584516372808_0003
2020-03-18 13:49:06,963 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1584516372808_0003 and attemptId None
2020-03-18 13:49:07,974 INFO yarn.Client: Application report for application_1584516372808_0003 (state: ACCEPTED)
2020-03-18 13:49:07,979 INFO yarn.Client: 
	 client token: N/A
	 diagnostics: AM container is launched, waiting for AM container to Register with RM
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1584519546940
	 final status: UNDEFINED
	 tracking URL: http://master:8088/proxy/application_1584516372808_0003/
	 user: administrator
2020-03-18 13:49:08,983 INFO yarn.Client: Application report for application_1584516372808_0003 (state: ACCEPTED)
...
2020-03-18 13:49:21,028 INFO yarn.Client: Application report for application_1584516372808_0003 (state: ACCEPTED)
2020-03-18 13:49:21,813 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> master, PROXY_URI_BASES -> http://master:8088/proxy/application_1584516372808_0003), /proxy/application_1584516372808_0003
2020-03-18 13:49:21,815 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /jobs, /jobs/json, /jobs/job, /jobs/job/json, /stages, /stages/json, /stages/stage, /stages/stage/json, /stages/pool, /stages/pool/json, /storage, /storage/json, /storage/rdd, /storage/rdd/json, /environment, /environment/json, /executors, /executors/json, /executors/threadDump, /executors/threadDump/json, /static, /, /api, /jobs/job/kill, /stages/stage/kill.
2020-03-18 13:49:21,952 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
2020-03-18 13:49:22,032 INFO yarn.Client: Application report for application_1584516372808_0003 (state: RUNNING)
2020-03-18 13:49:22,033 INFO yarn.Client: 
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: 192.168.1.4
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1584519546940
	 final status: UNDEFINED
	 tracking URL: http://master:8088/proxy/application_1584516372808_0003/
	 user: administrator
2020-03-18 13:49:22,036 INFO cluster.YarnClientSchedulerBackend: Application application_1584516372808_0003 has started running.
2020-03-18 13:49:22,044 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44709.
...
2020-03-18 13:49:22,856 INFO mapred.FileInputFormat: Total input paths to process : 1
2020-03-18 13:49:22,917 INFO spark.SparkContext: Starting job: collect at /home/administrator/Desktop/spark_script_2.py:30
2020-03-18 13:49:22,945 INFO scheduler.DAGScheduler: Got job 0 (collect at /home/administrator/Desktop/spark_script_2.py:30) with 2 output partitions
2020-03-18 13:49:22,946 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (collect at /home/administrator/Desktop/spark_script_2.py:30)
2020-03-18 13:49:22,946 INFO scheduler.DAGScheduler: Parents of final stage: List()
2020-03-18 13:49:22,948 INFO scheduler.DAGScheduler: Missing parents: List()
2020-03-18 13:49:22,954 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (PythonRDD[2] at collect at /home/administrator/Desktop/spark_script_2.py:30), which has no missing parents
2020-03-18 13:49:22,993 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 6.4 KB, free 366.0 MB)
2020-03-18 13:49:22,996 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 4.1 KB, free 366.0 MB)
2020-03-18 13:49:22,998 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on master:44709 (size: 4.1 KB, free: 366.3 MB)
2020-03-18 13:49:22,998 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
2020-03-18 13:49:23,027 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (PythonRDD[2] at collect at /home/administrator/Desktop/spark_script_2.py:30) (first 15 tasks are for partitions Vector(0, 1))
2020-03-18 13:49:23,028 INFO cluster.YarnScheduler: Adding task set 0.0 with 2 tasks
2020-03-18 13:49:24,805 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.1.4:50874) with ID 1
2020-03-18 13:49:24,832 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, slave2, executor 1, partition 0, RACK_LOCAL, 7907 bytes)
2020-03-18 13:49:24,992 INFO storage.BlockManagerMasterEndpoint: Registering block manager slave2:37211 with 366.3 MB RAM, BlockManagerId(1, slave2, 37211, None)
2020-03-18 13:49:25,296 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on slave2:37211 (size: 4.1 KB, free: 366.3 MB)
2020-03-18 13:49:25,481 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on slave2:37211 (size: 23.3 KB, free: 366.3 MB)
2020-03-18 13:49:35,521 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.1.3:54206) with ID 2
2020-03-18 13:49:35,530 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, slave1, executor 2, partition 1, NODE_LOCAL, 7907 bytes)
2020-03-18 13:49:35,657 INFO storage.BlockManagerMasterEndpoint: Registering block manager slave1:45921 with 366.3 MB RAM, BlockManagerId(2, slave1, 45921, None)
2020-03-18 13:49:35,912 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on slave1:45921 (size: 4.1 KB, free: 366.3 MB)
2020-03-18 13:49:36,058 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on slave1:45921 (size: 23.3 KB, free: 366.3 MB)
2020-03-18 13:50:14,953 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 50137 ms on slave2 (executor 1) (1/2)
2020-03-18 13:50:14,959 INFO python.PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 39749
2020-03-18 13:50:19,138 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 43614 ms on slave1 (executor 2) (2/2)
2020-03-18 13:50:19,144 INFO scheduler.DAGScheduler: ResultStage 0 (collect at /home/administrator/Desktop/spark_script_2.py:30) finished in 56.151 s
2020-03-18 13:50:19,148 INFO cluster.YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool
2020-03-18 13:50:19,155 INFO scheduler.DAGScheduler: Job 0 finished: collect at /home/administrator/Desktop/spark_script_2.py:30, took 56.237616 s
['https://survival8.blogspot.com/p/imas-101-xml-and-java-based-framework.html', 'rtnVal']
['https://survival8.blogspot.com/p/blog-page_87.html', 'rtnVal']
['https://survival8.blogspot.com/p/hello-world-application-using-spring.html', 'rtnVal']
...
['https://survival8.blogspot.com/p/windows-cmd-tip-handling-files-with.html', 'rtnVal']
['https://survival8.blogspot.com/p/windows-cmd-tip-findstr-grep-for-windows.html', 'rtnVal']
['https://survival8.blogspot.com/p/windows-cmd-tips-jan-2020.html', 'rtnVal']
Time taken: 56.83036684989929
2020-03-18 13:50:19,225 INFO spark.SparkContext: Invoking stop() from shutdown hook
2020-03-18 13:50:19,236 INFO server.AbstractConnector: Stopped Spark@3da82867{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-03-18 13:50:19,239 INFO ui.SparkUI: Stopped Spark web UI at http://master:4040
2020-03-18 13:50:19,243 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
2020-03-18 13:50:19,274 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
2020-03-18 13:50:19,275 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
2020-03-18 13:50:19,282 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
2020-03-18 13:50:19,283 INFO cluster.YarnClientSchedulerBackend: Stopped
2020-03-18 13:50:19,289 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2020-03-18 13:50:19,301 INFO memory.MemoryStore: MemoryStore cleared
2020-03-18 13:50:19,301 INFO storage.BlockManager: BlockManager stopped
2020-03-18 13:50:19,303 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
2020-03-18 13:50:19,306 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2020-03-18 13:50:19,311 INFO spark.SparkContext: Successfully stopped SparkContext
2020-03-18 13:50:19,311 INFO util.ShutdownHookManager: Shutdown hook called
2020-03-18 13:50:19,312 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-8dd4f42f-7776-4fe9-8da5-58dcffd0df83/pyspark-94e28c99-29d8-4860-8a99-009b7308e307
2020-03-18 13:50:19,314 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-8dd4f42f-7776-4fe9-8da5-58dcffd0df83
2020-03-18 13:50:19,316 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-39870413-d0d5-471e-a4f5-31c4f0ce5b2b
(base) administrator@master:/usr/local/spark$ 

Comparing the two runs above: the same collect over links.csv took 162.96 seconds in local mode (a single driver-side executor) versus 56.83 seconds on YARN, where the work was split across two executors on slave1 and slave2.

--- --- --- --- ---
