C2C
ankita.k@hirekeyz.com
We are seeking an experienced Hadoop Developer with 10+ years of hands-on expertise in Big Data technologies, the Hadoop ecosystem, and enterprise data processing solutions. The ideal candidate will have experience designing, developing, and optimizing scalable data pipelines and distributed systems in a fast-paced enterprise environment.
Key Responsibilities
- Design, develop, and maintain scalable Big Data solutions using Hadoop ecosystem tools.
- Develop and optimize ETL pipelines for large-scale data processing.
- Work with structured and unstructured data from multiple sources.
- Collaborate with cross-functional teams including Data Engineers, Architects, and Business Analysts.
- Ensure data quality and carry out performance tuning and system optimization.
- Implement data security and governance best practices.
- Troubleshoot production issues and provide timely resolutions.
- Participate in architecture discussions and technical design reviews.
Required Skills
- Strong experience with Hadoop ecosystem components such as HDFS, Hive, HBase, Sqoop, Oozie, Kafka, and Spark.
- Expertise in Spark SQL, PySpark, Scala, or Java.
- Hands-on experience with ETL development and data warehousing concepts.
- Strong SQL skills and Unix/Linux scripting knowledge.
- Experience working with cloud platforms such as AWS, Azure, or GCP is preferred.
- Good understanding of data modeling and distributed computing concepts.
- Experience with CI/CD pipelines and version control tools such as Git.
- Strong analytical and problem-solving skills.
