"If you consider yourself a Big Data expert and have been looking for the perfect position to showcase your expertise – this may be the job for you. We are currently seeking Engineers with Hadoop experience to join the “Seal Team of Big Data”!
This is an excellent 6- to 12-month assignment that will offer you experience and exposure to some of the largest organizations in the industry! It’s also an opportunity to increase your professional credentials as we invest in you to obtain a Hadoop Developer certification! Don’t let this opportunity pass you by…
What you will be doing:
-Provide design and development expertise for large-scale, clustered data processing systems
-Resolve technical issues in the environment
-Assist with the preparation of technical deliverables; review and demonstrate the system and applications
-Hortonworks Data Platform (HDP) Cluster Installation & Configuration
-Data Ingestion, Management, & Replication
-Plan and execute the initial data extraction from Teradata into HDP
-Configure HDP capabilities to handle regular archival of data from Teradata to HDP
-Implement data archival scripts and resources
-Implement data retention resources
-Informatica to HDP connectivity
-Data Modeling & SQL Data Access
-Translate the Teradata Data Model
-Travel to client sites Mon - Thu (full-time pay when not engaged on assignment)
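The Teradata-to-HDP extraction and archival work described above is commonly done with a tool such as Apache Sqoop. A minimal sketch of one such import follows; the host, database, table, column, and directory names are placeholders, not details from this posting, and the Teradata JDBC driver is assumed to be on Sqoop's classpath:

```shell
# Hypothetical Sqoop import from Teradata into HDFS.
# All identifiers (host, database, table, paths) are placeholders.
sqoop import \
  --connect jdbc:teradata://teradata-host/DATABASE=sales_db \
  --username etl_user -P \
  --table ORDERS \
  --target-dir /data/archive/orders \
  --num-mappers 4 \
  --split-by order_id
```

For the regular archival described above, imports like this are typically scheduled (for example via an Oozie coordinator) with a `--where` clause limiting rows to the archival window.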
-Must have experience with data source identification, mapping, modeling, ingestion, and analysis
-Experience providing services as a consultant and executing rapid POCs
-Proven capability of impactful deliverable development
-5-6+ years of deep, hands-on technical experience with core Java
-3+ years of scripting experience (Linux shell, Python, Perl, etc.)
-2+ years of ETL experience
-1-2 years of hands-on, real-project Hadoop experience (not experimental) with Pig, Hive, MapReduce, Oozie, and Sqoop, including installing and configuring clusters
-Some experience with Informatica ETL, Teradata DDL, and the Teradata Connector is a plus"