Job Id: 20201004021
Company: DXC Technology
Job Role: Hadoop and Big Data Developer
Experience: 5-7 Years
Qualification: Bachelor’s degree in computer science, engineering or related field preferred
Job Location: Hyderabad
Salary: Best in Industry
Vacancies: Not Mentioned
Job Description: DXC Technology careers, job vacancy for Hadoop and Big Data Developer, October 2020:
5 to 7 years of experience with Hadoop and Big Data technologies such as MapReduce, HDFS, S3, Sqoop, Hive, HBase, Impala, Oozie, Spark Core, etc.
4+ years of experience in working on Data Integration projects on Hadoop.
Experience with AWS, with the ability to advise on the capabilities and limitations of a proposed architecture.
Ability to architect streamlined data pipelines and deliver solutions leveraging various AWS platform services.
Proficiency in security-implementation best practices for Network Security Groups, Amazon EC2 Security Groups, AWS Key Management Service, etc.
Experience working in a Scrum environment.
Business requirement analysis: work with the local business unit (BU) to gather and analyze requirements, and help shape data pipelines on the Big Data platform.
4+ years of experience administering UNIX systems, with proficiency in shell scripting.
2+ years of strong technical experience in architecting, developing, and deploying AWS-based solutions.
Excellent knowledge of Hadoop configuration files (core-site.xml, hdfs-site.xml, yarn-site.xml, and mapred-site.xml).
Thorough understanding of mainframe file formats, relational databases (SQL Server, Oracle, MySQL), SQL scripting, SQL performance optimization, etc.
Experience in ETL, data ingestion, and migrating on-premises applications to cloud environments, as well as building data lakes.
Knowledge of Hadoop distributed architecture and HDFS.
Hands-on experience with Hadoop distribution platforms such as Cloudera, Hortonworks, MapR, etc.
Familiarity with message brokers such as Kafka, and with NoSQL databases such as HBase, Cassandra, and MongoDB.
Demonstrated ability to learn new technologies quickly.