Data Engineer

Aidetic

Posted: 11 months ago

Company Website
https://cutshort.io/jo...
Position type
Full-time
Job source
Cutshort
Category
Programming
Remote
No
Salary
4 - 15 lacs/annum
Job location
Bengaluru (Bangalore)
About

Responsibilities:


● Design, build, and maintain efficient, reusable, and reliable architecture and code

● Participate in architecture and system design discussions

● Independently perform hands-on development/coding and unit testing of applications

● Collaborate with the development and AI teams to build individual components into complex enterprise web systems

● Work in a team environment with product, frontend design, production operations, QE/QA, and cross-functional teams to deliver projects across the whole software development lifecycle

● Architect and implement the CI/CD strategy for EDP

● Implement high-velocity streaming solutions using Amazon Kinesis, SQS, and Kafka (preferred)

● Ensure the best possible performance and quality of high-scale web applications and services

● Identify and resolve performance issues

● Keep up to date with new technologies and their implementation

● Participate in code reviews to ensure standards and best practices are met

● Migrate data from traditional relational database systems, file systems, and NAS shares to AWS relational databases such as Amazon RDS, Aurora, and Redshift

● Migrate data from AWS DynamoDB to relational databases such as PostgreSQL (a minimal sketch follows this list)

● Migrate data from APIs to an AWS data lake (S3) and to relational databases such as Amazon RDS, Aurora, and Redshift

● Work closely with the Data Science leads, CTO, Product, Engineering, DevOps, and other members of the AI Science teams

● Collaborate with the product team, share feedback from project implementations and influence the product roadmap.

● Be comfortable in a highly dynamic, agile environment without sacrificing the quality of work products.
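As an illustration of the DynamoDB-to-PostgreSQL migration mentioned above, here is a minimal Python sketch. It assumes a hypothetical "orders" table on both sides and placeholder connection details; it is not the team's actual tooling, only one plausible approach using boto3 and psycopg2.

# Minimal sketch: copy a DynamoDB table into PostgreSQL.
# Assumptions: hypothetical "orders" table, placeholder credentials.
import boto3
import psycopg2

dynamodb = boto3.resource("dynamodb", region_name="ap-south-1")
source_table = dynamodb.Table("orders")  # hypothetical source table

conn = psycopg2.connect(host="example-rds-host", dbname="analytics",
                        user="etl_user", password="change-me")  # placeholders
cur = conn.cursor()

# Paginate through the DynamoDB table with scan() and upsert each item.
scan_kwargs = {}
while True:
    page = source_table.scan(**scan_kwargs)
    for item in page["Items"]:
        cur.execute(
            "INSERT INTO orders (order_id, customer_id, total) "
            "VALUES (%s, %s, %s) ON CONFLICT (order_id) DO NOTHING",
            (item["order_id"], item["customer_id"], item.get("total")),
        )
    conn.commit()
    if "LastEvaluatedKey" not in page:
        break
    scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]

cur.close()
conn.close()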


Position Requirements:


● Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience

● 5+ years of experience as a data application developer

● AWS Solutions Architect or AWS Developer Certification preferred

● Experience implementing software applications supporting data lakes, data warehouses, and data applications on AWS for large enterprises

● Solid programming experience with Python, shell scripting, and SQL

● Solid experience with AWS services such as CloudFormation, S3, Athena, Glue, EMR/Spark, RDS, Redshift, DataSync, DMS, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.

● Solid experience implementing solutions on AWS-based data lakes

● Experience with AWS data lake, data warehouse, and business analytics solutions

● Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS (a minimal sketch follows this list)

● Knowledge of ETL/ELT

● Experience delivering end-to-end data solutions (ingestion, storage, integration, processing, access) on AWS

● Experience developing business applications using NoSQL/SQL databases

● Experience working with object stores (S3) and JSON is a must-have

● Should have good experience with AWS services: Glue, Lambda, Step Functions, SQS, DynamoDB, S3, Redshift, RDS, CloudWatch, and ECS

● Should have hands-on experience with Python and Django

● Strong knowledge of data science models

● Knowledge of Snowflake is a plus
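As an illustration of the "ingestion" step of an AWS data lake pipeline noted above, here is a minimal Python sketch of a Lambda-style handler that pulls JSON from a hypothetical REST endpoint and lands it in a date-partitioned S3 prefix. The bucket name, URL, and prefix layout are placeholders, not the employer's actual setup.

# Minimal sketch: ingest records from an API into an S3 data lake.
# Assumptions: hypothetical endpoint, bucket, and prefix layout.
import json
import urllib.request
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake"                     # placeholder bucket
API_URL = "https://api.example.com/v1/events"    # placeholder endpoint

def handler(event, context):
    # Pull the latest records from the upstream API.
    with urllib.request.urlopen(API_URL) as resp:
        records = json.loads(resp.read())

    # Land raw JSON in a date-partitioned prefix so Glue/Athena can pick it up.
    now = datetime.now(timezone.utc)
    key = f"raw/events/dt={now:%Y-%m-%d}/events-{now:%H%M%S}.json"
    s3.put_object(Bucket=BUCKET, Key=key,
                  Body=json.dumps(records).encode("utf-8"))
    return {"records": len(records), "s3_key": key}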


Nice to have:


● Solid experience with AWS AI services such as Rekognition, Comprehend, and Transcribe

● Python, Node.js, .NET Core, C#, React.js, REST APIs, microservices, Postman, GraphQL, MongoDB, Linux, JavaScript, HTML5, CSS, Django


To apply directly, fill in the form: https://forms.gle/z1Zhz32oHkNmANFV8

Skills: Elasticsearch, MongoDB, SQL, Big Data, Apache HBase, Spark, Apache Kafka, Tableau, Windows Azure, and ETL
