Job Description:
• Excellent skills in writing complex SQL queries
• Very good understanding of Data Migration and ETL processes
• Hands-on experience with the GCP cloud data implementation suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam (Java/Python), Airflow/Cloud Composer, Cloud Storage, etc.
• Must have Google Cloud BigQuery experience, including datasets, objects, and IAM
• Strong experience with relational and non-relational databases in the cloud with billions of records (structured and unstructured data)
• Ability to design and develop data flow pipelines from scratch
• Excellent problem-solving and debugging skills
• Regulatory and compliance work in data management
• Showcase your GCP data engineering experience when communicating with clients on their requirements, turning these into technical data solutions
• Identify internal/external data sources to design and implement table structures, data products, ETL strategy, automation frameworks, and scalable data pipelines

Must Haves:
• Working knowledge of cloud architecture (Google Cloud Platform) and relational databases (Microsoft SQL Server, MySQL, Oracle)
• Working knowledge of RDBMS, GCP data pipelines, GCP storage, Python and SQL, DMS, GCP ETL, Sqoop, and similar ETL tools
• Must have worked on at least one ETL project
• Working experience in developing data pipelines
• Data engineering experience, with hands-on, deep experience working with Google data products
• Work with Agile and DevOps techniques and implementation approaches in delivery
• Build and deliver data solutions using GCP products and offerings
• Excellent communication skills

Job Title: GCP Data Engineer – On-Prem Database Migration

Job Summary: We are looking for an experienced GCP Data Engineer to join our team and play a key role in migrating our on-prem database to GCP. In this role, you will be responsible for the successful migration of our database to GCP and for optimizing the database environment to meet our business requirements.
You will work closely with our migration team, architects, and stakeholders to design and implement scalable, secure, and high-performance solutions.

Responsibilities:
• Work with stakeholders to understand the business requirements for the database migration to GCP
• Design and implement scalable, secure, and high-performance database solutions in GCP
• Collaborate with the migration team to ensure the successful migration of the on-prem database to GCP
• Develop and maintain data pipelines and ETL processes to ensure the integrity and accuracy of data
• Optimize the database environment to meet performance and availability requirements
• Identify and resolve database performance and reliability issues
• Develop and implement database monitoring and alerting solutions
• Work with security teams to ensure the security and compliance of the database environment
• Develop and maintain technical documentation related to the database environment and solutions

Qualifications:
• Bachelor's or Master's degree in Computer Science or a related field
• At least 3 years of experience in GCP data engineering, including database migration
• Experience with database design, optimization, and performance tuning
• Experience with ETL and data pipeline development and maintenance
• Strong understanding of GCP services and architecture, including BigQuery and Cloud SQL
• Proficiency in programming languages such as Python and SQL
• Familiarity with database security and compliance requirements
• Excellent communication and collaboration skills
• Ability to work independently and as part of a team in a fast-paced, dynamic environment

We offer competitive salary and benefits packages, including opportunities for career growth and development. If you are passionate about data engineering and are interested in being part of an exciting on-prem database migration project to GCP, we encourage you to apply for this position.