Cloud Consultant - Big Data
Are you a Data Analytics specialist? Do you have Data Warehousing, Hadoop/Data Lake experience? Do you like to solve the most complex, high-scale (billions of records and beyond) data challenges in the world today? Do you like to work on-site in a variety of business environments, leading teams through high-impact projects that use the newest data analytics technologies? Would you like a career path that enables you to progress with the rapid adoption of cloud computing?
At AWS ProServe India LLP, we’re hiring highly technical cloud computing architects to collaborate with our internal customers and partners on key engagements. Our consultants will develop, deliver and implement AI, IoT, and data analytics projects that help our internal customers leverage their data to develop business insights. These professional services engagements will focus on solutions such as machine learning, IoT, batch/real-time data processing, and data and business intelligence.
You will be required to travel to client locations and deliver professional services when needed.
· Expertise – Collaborate with AWS field BD, pre-BD, training and support teams to help partners and internal customers learn and use AWS services such as Athena, Glue, Lambda, S3, DynamoDB, Relational Database Service (RDS), Amazon EMR and Amazon Redshift.
· Solutions – Deliver on-site technical engagements with partners and internal customers. This includes participating in pre-BD on-site visits, understanding requirements, and creating packaged Data & Analytics service offerings.
· Delivery – Engagements include short on-site projects proving the use of AWS services to support new distributed computing solutions that often span private cloud and public cloud services. Engagements will include migration of existing applications and development of new applications using AWS cloud services.
· Insights – Work with AWS engineering and support teams in India to convey partner and internal customer needs and feedback as input to technology roadmaps. Share real world implementations and recommend new capabilities that would simplify adoption and drive greater value from use of AWS cloud services.
· Innovate – Engage with the internal customer’s business and technology stakeholders to create a compelling vision of a data-driven enterprise in their environment.
Bachelor’s degree in Computer Science, Engineering, Mathematics or a related field, or equivalent professional or military experience
· 2+ years of experience in IT platform implementation in a technical and analytical role
· 2+ years of experience in Data Lake/Hadoop platform implementation
· Hands-on experience implementing and performance-tuning Hadoop/Spark deployments
· Experience with Apache Hadoop and the Hadoop ecosystem
· Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro)
· Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto)
· Experience developing software in one or more programming languages (Java, Python, etc.)
· Hands-on experience on large-scale global data warehousing and analytics projects
· Ability to collaborate effectively across organizations
· Understanding of database and analytical technologies in the industry, including MPP and NoSQL databases, data warehouse design, BI reporting and dashboard development
· Demonstrated industry experience in the fields of database, data warehousing or data science
· Experience implementing AWS services in a variety of distributed computing, enterprise environments
· Customer-facing skills to represent AWS well within the customer’s environment and drive discussions with senior personnel regarding trade-offs, best practices, project management and risk mitigation
· Desire and ability to interact with all levels of the organization, from developers to C-level executives