Our client is looking for a QA Data Manager. The detailed job description is below for your reference; if interested, kindly reply with your updated resume and contact information.
Duration: Long Term Contract
Location: San Ramon, CA (Remote OK)
- Strong knowledge of relational databases, their objects, and data warehouse models.
- Experience writing complex SQL queries, creating/modifying DB objects, performance tuning, and table partitioning.
- Proven experience in data migration projects, with a complete understanding of STM, DM scope, and onboarding concepts in the data migration path.
- Knowledge of ETL tools, preferably SSIS, Boomi, MuleSoft, etc., and their ETL data validations.
- Hands-on experience with data automation frameworks such as Cucumber, Airflow, ETL Validator, etc.
- Minimum working experience of XXX with the Kafka streaming platform and its streaming processes.
- Experience in data streaming automation, API automation, and their CI/CD implementation utilizing source control and build applications.
- Experience with Google Cloud Platform cloud storage and data services: cloud migration, BigQuery, GCP buckets, Dataflow, Data Fusion, Dataproc, etc.
- Knowledge of containers and microservices would be a great plus.
- Must have knowledge of at least one advanced reporting tool such as MicroStrategy, QlikView, Looker, or Tableau, as well as report automation.
- Significant understanding of Splunk and New Relic for proactive reporting and alerting in a QA organization.
- Must know at least one OOP language such as Java, Python, .NET, etc.
WE ARE AN EQUAL OPPORTUNITY EMPLOYER.