Your key responsibilities
- Create and maintain optimal data pipeline architecture; build batch and real-time ETL data pipelines and the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources
- Own solution design, integration, data sourcing, transformation, database design, and implementation of complex data warehousing solutions
- Develop, support, maintain, and implement complex project modules
- Provide domain expertise and advanced applications-programming knowledge, and ensure application design adheres to the overall architecture blueprint
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
- Resolve a variety of high-impact problems and projects through in-depth evaluation of complex business processes, system processes, and industry standards
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support, and deliver complete reporting solutions
- Prepare high-level design (HLD) documents describing the application architecture
- Prepare low-level design (LLD) documents covering job design, job descriptions, and detailed job-level information
- Prepare and execute unit test cases
- Provide technical guidance and mentoring to application development teams throughout all the phases of the software development life cycle
Skills and attributes for success
- Strong SQL experience: proficient in writing and debugging complex, performant SQL against large data volumes
- Strong experience with Microsoft Azure data services, including hands-on experience with Azure Data Factory
- Strong grasp of data warehousing concepts, with experience in large-scale data warehousing architecture and data modelling
- Sufficient hands-on experience with PowerShell scripting
- Able to guide the team through the development, testing, and implementation stages and effectively review completed work
- Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation
- Act as primary owner of delivery and timelines; review code written by other engineers
- Maintain the highest standards of development practice, including technical design, solution development, systems configuration, test documentation and execution, and issue identification and resolution; write clean, modular, self-sustaining code with repeatable quality and predictability
- Solid understanding of business intelligence development in the IT industry
- Outstanding written and verbal communication skills
- Adept in the SDLC process: requirements analysis, time estimation, design, development, testing, and maintenance
- Hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools
- Able to orchestrate and automate data pipelines
- Good to have: knowledge of distributed systems such as Hadoop, Hive, and Spark
To qualify for the role, you must have
- Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
- More than 6 years of experience in ETL development projects
- Proven experience in delivering effective technical ETL strategies
- Microsoft Azure project experience
- Technologies: ETL with Azure Data Factory (ADF), SQL, and Azure components (must have); Python (nice to have)
Ideally, you’ll also have
Skills: ETL, ETL management, SQL, and Azure Data Factory