Hadoop Apache Pig Execution Types

Execution Types

Pig has two execution types, or run modes: local mode and Hadoop mode (currently called MapReduce mode).

  1. Local mode
  2. MapReduce mode

1) Local mode

  •  In local mode, Pig runs in a single JVM and accesses the local file system.
  •  This mode is suitable only for small datasets and for trying out Pig.
  •  To run in local mode, set the -x option to local, i.e.

% pig -x local
grunt>


2) MapReduce Mode

  • In MapReduce mode, Pig translates queries into MapReduce jobs and runs them on a Hadoop cluster; the cluster may be a pseudo-distributed or a fully distributed cluster.
  • To use MapReduce mode, check that the version of Pig you are running is compatible with the version of Hadoop on the cluster.
  • Pig honors the HADOOP_HOME environment variable for finding which Hadoop client to run; if it is not set, Pig uses a bundled copy of the Hadoop libraries.
  • You also have to set two properties in the pig.properties file, which is in Pig's conf directory.

For example, for a pseudo-distributed setup:

fs.default.name=hdfs://localhost/
mapred.job.tracker=localhost:8021

Once you have configured Pig to connect to a Hadoop cluster, you can launch Pig, setting the -x option to mapreduce or omitting it, since MapReduce mode is the default:

% pig
grunt>


Running Pig Programs

This section shows you how to run Pig programs using the Grunt shell, a Pig script, and an embedded program.

There are three ways of executing Pig programs, all of which work in both local and MapReduce mode.

1. Script – Pig can run a script file that contains Pig commands.

2. Grunt – an interactive shell for running Pig commands.

3. Embedded – you can run Pig programs from Java using the PigServer class, much like you can use JDBC to run SQL programs from Java (see the sketch below).
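
As a rough sketch of the embedded approach, the code below runs a couple of Pig Latin statements from Java through the PigServer class. The input path, aliases, and field schema are made-up assumptions for illustration only.

import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class EmbeddedPigExample {
    public static void main(String[] args) throws Exception {
        // Run Pig in local mode; ExecType.MAPREDUCE would submit jobs to a Hadoop cluster instead.
        PigServer pigServer = new PigServer(ExecType.LOCAL);

        // Register Pig Latin statements one at a time (the path, aliases, and schema
        // here are hypothetical).
        pigServer.registerQuery("records = LOAD 'input/sample.txt' AS (year:chararray, temperature:int);");
        pigServer.registerQuery("filtered = FILTER records BY temperature IS NOT NULL;");

        // Storing an alias triggers execution of the accumulated plan.
        pigServer.store("filtered", "output/filtered");
    }
}

Conceptually, this mirrors the JDBC analogy above: statements are submitted as strings, and the PigServer object manages their execution.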

Modes of Execution in Pig

You can execute Pig Latin statements:

  1. In the Grunt shell or from the command line
  2. In local mode or MapReduce mode
  3. Either interactively or in batch

Local Mode and MapReduce (HDFS) Mode

In local mode, Pig reads all input from the local file system and writes its output to the local file system as well, not to HDFS.

Syntax: pig -x local

 

In MapReduce mode, Pig reads all input files from HDFS paths and writes its output to HDFS as well.

Syntax: pig (or pig -x mapreduce)

Grunt shell:

Grunt mode can also be called interactive mode. Grunt is Pig's interactive shell, and it is started when no script file is specified for Pig to run.

The Grunt shell is interactive: you see the output of each statement then and there, as soon as you enter it.

Script mode Execution:

In script mode, Pig runs the commands specified in a script file.

Here, we describe all the transformations in a single file with the .pig extension; the commands in the .pig file are executed one after another.

Local:        pig -x local abc.pig
MapReduce:    pig abc.pig

Embedded mode:

In this mode, we write custom code for Pig when some analysis functionality cannot be achieved through the standard transformations.

In that case, we can write Java code to achieve it, package that Java code as a .jar file, and reference the jar in our Pig script with:

REGISTER xyz.jar;

Note: The REGISTER statement should be the first statement in the Pig script, before any statement that uses the code in the jar.
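
For context, here is a minimal sketch of the kind of Java UDF that might live inside xyz.jar; the class name UpperCase and its behavior are assumptions made for illustration, not something taken from the example above.

import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// A hypothetical UDF that could be compiled, packaged into xyz.jar, and registered in a Pig script.
public class UpperCase extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        // Return null for empty or null input, as Pig UDFs conventionally do.
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;
        }
        return input.get(0).toString().toUpperCase();
    }
}

After REGISTER xyz.jar; in the script, such a function could then be invoked (assuming the class above) in a statement like FOREACH data GENERATE UpperCase(name);, using the fully qualified class name if the class is declared in a package.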
