Last updated: 14/04/2016

Big Data and Hadoop Development Training

No upcoming event dates found.

 

Agenda:

Phase 1: Hadoop 2.0 Fundamentals (12 Hours)

Big Data

  • What is Big Data
  • Dimensions of Big Data
  • Big Data in Advertising
  • Big Data in Banking
  • Big Data in Telecom
  • Big Data in eCommerce
  • Big Data in Healthcare
  • Big Data in Defense
  • Processing options of Big Data
  • Hadoop as an option

Hadoop

  • What is Hadoop
  • How Hadoop 1.0 Works
  • How Hadoop 2.0 Works
  • HDFS
  • MapReduce
  • What is YARN
  • How YARN Works
  • Advantages of YARN
  • How Hadoop has an edge
  • Hadoop Ecosystem: Sqoop, Oozie, Pig, Hive, Flume

Hadoop Hands On

  • Running HDFS commands
  • Running your MapReduce program on Hadoop 1.0
  • Running your MapReduce program on Hadoop 2.0
  • Running Sqoop Import and Sqoop Export
  • Creating Hive tables directly from Sqoop
  • Creating Hive tables
  • Querying Hive tables

Evaluation Test

Bonus:

  • Setting up Hadoop 1.0 on a single-node cluster (manual)
  • Setting up Hadoop 2.0 on a single-node setup (manual)
  • Multi-node setup walkthrough (manual)

Phase 2: Hadoop Development (8 hours)

Advanced MapReduce

  • MapReduce Code Walkthrough (a minimal word-count sketch follows this list)
  • ToolRunner
  • MRUnit
  • Distributed Cache
  • Combiner
  • Partitioner
  • Setup and Cleanup methods
  • Using the Java API to access HDFS
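
For reference, here is a minimal word-count sketch in the org.apache.hadoop.mapreduce Java API, touching several items above (ToolRunner, a Combiner, and the Mapper/Reducer structure covered in the code walkthrough). It is an illustrative example under standard Hadoop 2.x assumptions, not the course's own exercise code; the class name and argument layout are hypothetical.

    import java.io.IOException;
    import java.util.StringTokenizer;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class WordCount extends Configured implements Tool {

        // Mapper: emit (word, 1) for every token in the input line
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reducer: sum the partial counts for each word; also reused as the Combiner
        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        @Override
        public int run(String[] args) throws Exception {
            // args[0] = input path, args[1] = output path (illustrative layout)
            Job job = Job.getInstance(getConf(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            return job.waitForCompletion(true) ? 0 : 1;
        }

        public static void main(String[] args) throws Exception {
            // ToolRunner parses generic Hadoop options (-D, -files, ...) before calling run()
            System.exit(ToolRunner.run(new WordCount(), args));
        }
    }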

Joins Using MapReduce

  • Map-side joins
  • Reduce-side joins

Custom Types

  • Input Types in MapReduce
  • Output Types in MapReduce
  • Custom Input Data types
  • Custom Output Data types (see the Writable sketch after this list)
  • Multiple Reducer MR program
  • Zero Reducer Mapper Program
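
To make the custom-types topics above concrete, here is a small, hypothetical Writable value type; implementing write() and readFields() in matching order is what lets a custom type travel between the Mapper and the Reducer. The class and field names are illustrative only, not part of the course material.

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    // Hypothetical custom value type carrying two fields per record
    public class ClickStats implements Writable {
        private long timestamp;
        private int clicks;

        public ClickStats() { }                     // Hadoop requires a no-arg constructor

        public ClickStats(long timestamp, int clicks) {
            this.timestamp = timestamp;
            this.clicks = clicks;
        }

        @Override
        public void write(DataOutput out) throws IOException {
            out.writeLong(timestamp);               // serialize fields in a fixed order
            out.writeInt(clicks);
        }

        @Override
        public void readFields(DataInput in) throws IOException {
            timestamp = in.readLong();              // deserialize in exactly the same order
            clicks = in.readInt();
        }
    }

A custom key type would additionally implement WritableComparable so Hadoop can sort it during the shuffle.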

Advanced MapReduce Hands On

  • MRUnit hands-on
  • Distributed Cache hands-on
  • Partitioner hands-on
  • Combiner hands-on
  • Accessing files using the HDFS API hands-on (see the sketch after this list)
  • Map-side joins hands-on
  • Reduce-side joins hands-on
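
As an illustration of the HDFS API exercise, the following sketch reads a file from HDFS through the Java FileSystem API. It assumes a reachable cluster whose configuration is on the classpath; the input path is hypothetical.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsCat {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();           // picks up core-site.xml / hdfs-site.xml
            FileSystem fs = FileSystem.get(conf);                // default file system, i.e. HDFS on a cluster
            Path path = new Path("/user/training/sample.txt");  // hypothetical HDFS path
            try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(fs.open(path)))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);                   // print each line of the file
                }
            }
        }
    }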

MapReduce Design Patterns:

  • Searching
  • Sorting
  • Filtering
  • Inverted Index
  • TF-IDF
  • Word Co-occurrence

MapReduce Design Patterns Hands On:

  • Distributed Grep (see the mapper sketch after this list)
  • Bloom Filters
  • Average Calculation
  • Standard Deviation
  • Map-side joins
  • Reduce-side joins
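
As a sketch of the Distributed Grep pattern listed above (illustrative, not course code): a map-only job whose mapper forwards only the lines matching a regular expression taken from the job configuration. The configuration key "grep.pattern" is an assumed name for this sketch.

    import java.io.IOException;
    import java.util.regex.Pattern;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class GrepMapper extends Mapper<LongWritable, Text, Text, NullWritable> {
        private Pattern pattern;

        @Override
        protected void setup(Context context) {
            // Read the regex from the job configuration (key name assumed for this sketch)
            pattern = Pattern.compile(context.getConfiguration().get("grep.pattern", ".*"));
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            if (pattern.matcher(value.toString()).find()) {
                context.write(value, NullWritable.get());   // emit matching lines unchanged
            }
        }
    }

In the driver, job.setNumReduceTasks(0) would make this a zero-reducer program, writing the mapper output directly to HDFS.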

Evaluation Test (30 marks)

Phase 3: Other Hadoop Development Aspects - Pig, Hive, Oozie and Impala (8 hours)

Pig

  • What is Pig
  • How Pig Works
  • Simple processing using Pig
  • Advanced processing using Pig
  • Pig hands-on

Hive

  • What is Hive
  • How Hive Works
  • Simple processing using Hive
  • Advanced processing using Hive
  • Hive hands-on

Oozie

  • What is Oozie
  • How Oozie Works
  • Oozie hands-on

Impala

  • What is Impala
  • How Impala Works
  • Where Impala is better than Hive
  • Impala’s shortcomings
  • Impala hands-on

Evaluation Test

 

 

Benefits:

 

From the course:  
  • Understand Big Data and the various types of data stored in Hadoop
  • Understand the fundamentals of MapReduce, Hadoop Distributed File System (HDFS), YARN, and how to write MapReduce code
  • Learn best practices and considerations for Hadoop development, debugging techniques and implementation of workflows and common algorithms
  • Learn how to leverage Hadoop frameworks like Apache Pig™, Apache Hive™, Sqoop, Flume, Oozie and other projects from the Apache Hadoop Ecosystem
  • Understand optimal hardware configurations and network considerations for building out, maintaining and monitoring your Hadoop cluster
  • Learn advanced Hadoop API topics required for real-world data analysis
  • Understand the path to ROI with Hadoop

From the workshop:
  • High quality training from an industry expert
  • 24 hours of comprehensive training 
  • Earn 24 PDUs
  • Course Completion Certificates
  • 50% interactive and hands-on training exercises using HDFS, Pig, Hive, HBase, key MapReduce components and features, and more

Who can attend: 

  • Architects and developers who design, develop and maintain Hadoop-based solutions
  • Data Analysts, BI Analysts, BI Developers, SAS Developers and related profiles who analyze Big Data in a Hadoop environment
  • Consultants who are actively involved in a Hadoop Project
  • Experienced Java software engineers who need to understand and develop Java MapReduce applications for Hadoop 2.0.

 


FAQs:

How will a course on this technology benefit me?

Hadoop is now viewed as a game changer in the field of analytics and is expected to support thousands of jobs in the future.

  • AT&T Interactive, Sears, PayPal, AOL, Deloitte, IBM and other big players are looking to hire Hadoop engineers.
  • Randstad states that annual pay hikes for analytics professionals in India are on average 50% higher than for other IT professionals.
  • According to Robert Half Technology ("Big Data, Big Pay"), average salaries can reach up to $154,250.

These statistics are promising, and KnowledgeHut’s course on Big Data and Hadoop 2.0 will help you build the skills to perform data analysis to the standard your employers expect.

How do I know if I am eligible for this course?

Architects and developers who design, develop and maintain Hadoop-based solutions, Data Analysts, BI Analysts, BI Developers, SAS Developers, and Consultants involved in Hadoop-based projects will greatly benefit from this course.

Is there any provision for group discounts for these training programs?

Yes, we do offer group packages for the training programs. Mail us at contact@unicomlearning.com to know more about group concessions.

What is Big Data and Hadoop?

Hadoop is considered the most effective data platform for companies working with Big Data, and it is integral to storing, handling and retrieving enormous amounts of data in a variety of applications. Hadoop enables you to run deep analytics that cannot be handled effectively by a conventional database engine. Big enterprises around the world have found Hadoop to be a game changer in their Big Data management, and as more companies embrace this powerful technology, the demand for Hadoop developers keeps rising. By learning how to harness the power of Hadoop 2.0 to manipulate, analyse and perform computations on Big Data, you will be paving the way for an enriching and financially rewarding career as an expert Hadoop developer.

Are there any prerequisites for attending the training?

Big Data and Hadoop training does not require any special qualification or experience.

What certifications will I get after course completion?

We offer PDU certificates and course completion certificates to candidates after they successfully complete the course. You can earn up to 24 PDUs for attending the training.

How can I make payment?

Payment can be made via Cheque / DD / Online Funds transfer / Cash Payment.

Cheques should be drawn in favour of "Unicom Training and Seminars Pvt Ltd", payable at Bangalore.

NEFT Payment:

Account Name: UNICOM Training & Seminars Pvt Ltd
Bank Name: State Bank of India
Bank Address: Ground Floor, K V Plaza, Green Glen Layout, Outer Ring Road, Bangalore
A/c Number: 31729010535
IFSC: SBIN0012706
A/c Type: Current

Whom should I contact for more information?

Write to Mr. Nitin at nitin@unicomseminars.org


Contact Us (India)

Shanmugha Arcade,

3rd Floor, 39,

NGEF Lane,

Indira Nagar 1st Stage,

Bengaluru - 560038,

Karnataka, India.

Telephone: +91-9538878795, +91-9538878799, +91-8025257962

E: contact@unicomlearning.com

Contact Us (UK)

OptiRisk R&D House

One Oxford Road

Uxbridge

Middlesex

UB9 4DA

UNITED KINGDOM

E: contact@unicom.co.uk

© 2018 All Rights Reserved