Writing a great Hadoop Developer resume is an important step in your job search journey. When writing your resume, be sure to reference the job description and highlight any skills, awards, and certifications that match the requirements. You may also want to include a headline or summary statement that clearly communicates your goals and qualifications. Picture this for a moment: everyone out there is writing their resume around the tools and technologies they use. But the Director of Data Engineering at your dream company knows that tools and tech are beside the point, and recruiters are usually the first ones to tick those boxes on your resume.

Headline: Junior Hadoop Developer with 4+ years of experience in project development, implementation, deployment, and maintenance using Java/J2EE and Big Data technologies; experienced in designing and implementing complete end-to-end Hadoop-based data analytics solutions using HDFS, MapReduce, Spark, YARN, …

WORK EXPERIENCE

Lead Teradata DBA (Domain: Securities)
Charles Schwab & Co, June 2013 to October 2014
Project: Teradata Administration for Integrated Data Warehouse
Database: Teradata 13.10
Operating System: UNIX
Teradata Tools and Utilities: SQL Assistant, BTEQ, FastLoad, FastExport, MultiLoad, TPT, PMON, Teradata Manager, Teradata Administrator, Viewpoint, TSET, TASM
BAR: NetBackup
Work Profile: …
Responsibilities:
• Migrated various application databases from Oracle to Teradata.
• Worked with Teradata and Oracle databases, with UNIX as the backend.
• Accountable for DBA activities.
• Additional Trainings: Received training in SQL-H of Big Data Hadoop and Aster.

Ahold – Delhaize USA – Quincy, MA – July 2011 to Present
Responsibilities:
• Worked on development of Big Data POC projects using Hadoop, HDFS, MapReduce, and Hive.
• Work on a Hadoop cluster with a current size of 56 nodes and 896 terabytes of capacity.
• Wrote MapReduce jobs, HiveQL, Pig, and Spark applications, using various kinds of transformations to implement simple and complex business logic (a PySpark sketch of this type of job follows this entry).
• Understood the structure of the data, built the data architecture, implemented the data model in Vertica, and carried out data mapping from the legacy Oracle system to Vertica.
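The bullets above mention joining, filtering, and analyzing data with Spark and Hive and implementing business logic through transformations. As a minimal, illustrative sketch of that kind of job, assuming a Hive-enabled Spark cluster, the snippet below joins two Hive tables, filters them, and writes an aggregate back to Hive. The table and column names (staging.sales, staging.customers, region, and so on) are placeholders, not details taken from the resume.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive-enabled Spark session; assumes the cluster's Hive metastore is configured.
spark = (
    SparkSession.builder
    .appName("sales-by-region")
    .enableHiveSupport()
    .getOrCreate()
)

# Placeholder table names -- substitute the real Hive tables.
sales = spark.table("staging.sales")
customers = spark.table("staging.customers")

# Join, filter, and aggregate: the kind of simple-to-complex transformation
# described in the work experience above.
result = (
    sales.join(customers, on="customer_id", how="inner")
         .filter(F.col("order_date") >= "2014-01-01")
         .groupBy("region")
         .agg(F.sum("amount").alias("total_amount"),
              F.countDistinct("customer_id").alias("unique_customers"))
)

# Persist the aggregate back to Hive for downstream analysis.
result.write.mode("overwrite").saveAsTable("analytics.sales_by_region")

spark.stop()
```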
Hadoop Lead / Sr. Developer (Current)
Confidential
Responsibilities:
• Set up the AWS cloud environment manually.
• Installed and set up a Kubernetes cluster on AWS manually from scratch.
• Deployed various components of the k8s cluster on AWS using Ubuntu 18.04 Linux images.
• Deployed and configured the DNS (Domain Name Server) manifest using CoreDNS.
• Configured CloudWatch logs and alarms (a boto3 sketch of a CloudWatch alarm follows this entry).
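The entry above mentions configuring CloudWatch alarms. As a minimal sketch of one way to do that with boto3, the snippet below creates a CPU-utilization alarm for an EC2 instance; the region, instance ID, SNS topic, and thresholds are illustrative placeholders rather than values from the resume.

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Placeholder identifiers -- replace with the real instance and SNS topic.
INSTANCE_ID = "i-0123456789abcdef0"
SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:ops-alerts"

# Alarm when average CPU stays above 80% for two consecutive 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="k8s-node-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[SNS_TOPIC_ARN],
    TreatMissingData="missing",
)
```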
Typical Skills for a Big Data Engineer Resume
• Big data development experience on the Hadoop platform, including Hive, Impala, Sqoop, Flume, Spark, and related tools, to build analytical applications.
• 3+ years of experience developing with Java and/or Hadoop technologies.
• Experience developing with a modern JDK (v1.8+).
• Hadoop: experience with storing, joining, filtering, and analyzing data using Spark, Hive, and MapReduce (a minimal Hadoop Streaming example appears after this list).
• Sqoop: experience importing data from Teradata into HDFS (a Sqoop import sketch appears after this list).
• Teradata, Base SAS; Waterfall and Agile methodologies.
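The Sqoop skill above refers to pulling Teradata tables into HDFS. Sqoop is driven from the command line, so the sketch below simply wraps a typical sqoop import invocation in a Python script so it can be scheduled; the JDBC URL, credentials, table, and target directory are assumed placeholders, and the exact options depend on the Teradata connector installed on the cluster.

```python
import subprocess

# Placeholder connection details -- not taken from the resume.
JDBC_URL = "jdbc:teradata://td-host.example.com/DATABASE=edw"
TABLE = "daily_transactions"
TARGET_DIR = "/data/raw/teradata/daily_transactions"

# Typical Sqoop import: pull one Teradata table into HDFS with 4 parallel mappers.
cmd = [
    "sqoop", "import",
    "--connect", JDBC_URL,
    "--driver", "com.teradata.jdbc.TeraDriver",
    "--username", "etl_user",
    "--password-file", "/user/etl_user/.td_password",  # avoid plain-text passwords
    "--table", TABLE,
    "--target-dir", TARGET_DIR,
    "--num-mappers", "4",
    "--fields-terminated-by", "\t",
]

subprocess.run(cmd, check=True)
```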
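The experience and skills above also mention writing MapReduce jobs. One common way to prototype a MapReduce job in Python is Hadoop Streaming, where the mapper and reducer read stdin and write stdout. The sketch below counts records per key; the input format (tab-separated, key in the first field), the jar path, and the HDFS paths in the comment are assumptions, not details from the resume.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming job: count records per key.

Example invocation (jar location and HDFS paths are placeholders):
  hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming.jar \
      -input /data/raw/events -output /data/out/event_counts \
      -mapper "python3 keycount.py map" -reducer "python3 keycount.py reduce" \
      -file keycount.py
"""
import sys


def mapper():
    # Emit "key\t1" for the first tab-separated field of every input line.
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            print(f"{fields[0]}\t1")


def reducer():
    # Streaming sorts mapper output by key, so equal keys arrive consecutively.
    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t", 1)
        if key == current_key:
            count += int(value)
        else:
            if current_key is not None:
                print(f"{current_key}\t{count}")
            current_key, count = key, int(value)
    if current_key is not None:
        print(f"{current_key}\t{count}")


if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    reducer() if mode == "reduce" else mapper()
```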