
Installing Apache Spark and Scala (Mac)


In this Spark Scala tutorial, you will learn:

  • How to install Apache Spark on macOS.

  • By the end of this tutorial, you will be able to run Apache Spark with Scala on your Mac machine.

  • You will also download the Scala IDE for Eclipse.


To install Apache Spark on a Windows machine, visit this page.


 

Installing Homebrew

  1. You will install Apache Spark using Homebrew.

  2. If you don't have Homebrew, visit https://brew.sh/, copy the install command shown there, paste it into your terminal, and run it.
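For reference, at the time of writing the command shown on https://brew.sh/ is a single line along these lines (prefer whatever the site currently shows, since it can change):

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"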


 

Installing Apache Spark

  1. Open Terminal, type brew install apache-spark, and hit Enter.

  2. Create a log4j.properties file. Type cd /usr/local/Cellar/apache-spark/2.3.1/libexec/conf and hit Enter. Change the version in the path to match your installed version; Spark 2.3.1 is the version installed for me.

  3. Type cp log4j.properties.template log4j.properties and hit Enter.

  4. Edit the log4j.properties file and change the log level on the log4j.rootCategory line from INFO to ERROR. That is the only change needed; it keeps the Spark shell from flooding your terminal with INFO messages. See the example line below.
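For reference, assuming the stock log4j.properties.template that ships with Spark 2.x, the edited line typically looks like this (only the level changes from INFO to ERROR):

log4j.rootCategory=ERROR, console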




Download Scala IDE

  • Install the Scala IDE from here.


  • Open the IDE once, just to check that it runs fine. You should see the usual Eclipse-style workbench panels open without errors.


 

Test it out!

  1. Open Terminal and go to the directory where apache-spark was installed (for example, cd /usr/local/Cellar/apache-spark/2.3.1/libexec/), then type ls to get a directory listing.

  2. Look for a text file, like README.md or CHANGES.txt.

  3. Type spark-shell and hit Enter.

  4. At this point you should see a scala> prompt. If not, double-check the steps above.

  5. Type val rdd = sc.textFile("README.md") (or whichever text file you found) and hit Enter. You have just created an RDD from the README text file. Now type rdd.count() and hit Enter to count the number of lines in the file.

  6. You should get a count of the number of lines in that file. Congratulations, you just ran your first Spark program! Don't worry about the commands; I will explain them later. A sample session is sketched under Sample Execution below.




Sample Execution
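
As a minimal sketch, assuming spark-shell was started from the libexec directory so that README.md is in the working directory, these are the lines you would paste at the scala> prompt:

// sc is the SparkContext that spark-shell creates for you automatically
val rdd = sc.textFile("README.md")           // build an RDD with one element per line of the file
val lineCount = rdd.count()                  // count() triggers the job and returns the number of lines
println(s"README.md has $lineCount lines")

The exact count you see will depend on the file and on the Spark version you installed.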



You've got everything set up! If you have any questions, please don't forget to mention them in the comments section below.







