Is It Mandatory to Start Hadoop to Run Spark Application | Hadoop Interview Questions and Answers
acadgild.com/big-data/big-dat...
Welcome back to Apache Spark interview questions and answers, powered by Acadgild. In this video, Mr. Sudhanshu, a Data Scientist, explains Hadoop interview questions and answers specifically on Apache Spark. If you have missed the master video of interview questions and answers, kindly click the following link.
Top 20 Apache Spark Interview Questions - • Top 20 Apache Spark In...
In this video, the mentor explains how Apache Spark stores data. So, here is the interview question:
Is It Mandatory to Start Hadoop to Run Spark Application?
No! Spark supports a local mode as well as a standalone cluster mode. So we can configure and execute our Spark program on a Windows machine, on an individual Linux or Mac machine, or on top of a cluster where we use a bunch of machines. In terms of storage, however, Spark has no storage layer of its own.
We can load the data from the local file system and process it; Hadoop or HDFS is not mandatory to run a Spark application.
Thank you for watching the video. Please like, comment, and subscribe to the channel for more videos.
For more updates on courses and tips follow us on:
Facebook: / acadgild
Twitter: / acadgild
LinkedIn: / acadgild