ETL-Big Data - Sr. Developer
Senior Developer - ETL / RDBMS / Big Data
The Senior Developer is responsible for the development, support, maintenance, and implementation of complex projects using data warehousing and ETL tools built on the Hadoop ecosystem. The candidate should have solid experience applying standard software development principles, work as an independent team member capable of exercising judgment to plan and execute tasks, respond to technical queries and requests from team members and customers, and coach, guide, and mentor junior members of the team.
- Minimum 8 years of experience in IT
- Minimum 3 years of hands-on experience with RDBMS technologies
- Minimum 3 years of experience in Java development
- Minimum 3 years of experience with Hadoop and NoSQL solutions
- Minimum 2 years of experience with ETL and ELT tools
- Minimum 1 year of experience as a lead developer
- Function as a leader within the Data Management Team to create scalable ETL and reporting solutions that meet the business needs of the organization
- Partner with Business Analysts, Subject Matter Experts from Business Units, and counterparts in IT to complete system requirements, technical specifications and process documentation for assigned projects
- Review, analyze, and evaluate the scope of the business problems presented and help identify viable technical solutions
- Develop ETL and data warehousing solutions for the customer's products and services
- Drive data warehouse architectural decisions and development standards
- Create detailed technical specifications and release documentation for assigned ETL projects
- Ensure data integrity, performance quality, and resolution of data load failures
- Multi-task across several ongoing projects and daily duties of varying priorities as required
- Provide input into each phase of the system development lifecycle as required
- Ensure adherence to published development standards and Hadoop best practices, resulting in consistent and efficient implementation of project components
- Ability to exercise sound judgment, make decisions, negotiate, and solve problems effectively
- Experience with HDFS, HBase, YARN, Spark, Oozie, and shell scripting
- Background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture
- At least 5 years of design and development experience in Java/Core Java and related technologies
- At least 3 years of hands-on design and development experience with Big Data technologies: Hadoop, Pig, Hive, MapReduce, and web services
- Strong understanding of RDBMS concepts, including solid SQL-writing skills and experience interacting programmatically with RDBMS and NoSQL databases such as HBase
- Hands-on experience working with Teradata, Netezza, Oracle, and DB2 databases
- Strong understanding of Hadoop file formats such as ORC and Parquet
- Experience working with large data sets, including performance tuning and troubleshooting
- Knowledge of Spark, Spark Streaming, Spark SQL, and Kafka is a plus.
- Experience in the retail domain is a plus
- Experience in, and a desire to work in, a fast-paced delivery environment
- Strong communicator, able to work independently with minimal involvement from client SMEs
- Regular and expeditious travel throughout the United States is required to meet client needs and timetables
- Available to be stationed at and work from an out-of-town client site as needed