Key Skills: Ab Initio, Oracle, Unix/Shell Scripting
Detailed Job Description for Ab Initio Developer at Irving, TX:
- 8 years of experience in development projects
- Very strong communication and interpersonal skills
- Ability to work in a fast-paced atmosphere
- Self-motivated; demonstrates initiative in tackling work while following software development best practices and company guidelines
- Ability to communicate clearly and logically, and to present developed features from time to time
- Resources need to b
In this role, you will be responsible for building data interface solutions. You will write APIs and ETL scripts to load data into the Snowflake cloud data warehouse and integrate it with other systems, working alongside the Data Warehouse team. You will help create and integrate new applications and services that enhance core business operations; design, develop, and implement Python services within AWS Lambda; and foster productive collaboration among d
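The Lambda-plus-Snowflake duty described above can be sketched minimally. Everything here is illustrative (the event shape, field names, and target schema are hypothetical), and the actual write to Snowflake, which would typically go through the snowflake-connector-python library, is stubbed out so the transform logic is self-contained and can be exercised locally:

```python
# Minimal sketch (all names hypothetical): an AWS Lambda handler that
# reshapes incoming records into rows ready to load into Snowflake.
# The load step is left as a comment so the transform stands alone.

def transform_record(record: dict) -> tuple:
    """Normalize one source record into an (id, name, amount) row."""
    return (
        int(record["id"]),
        record.get("name", "").strip().upper(),
        round(float(record.get("amount", 0)), 2),
    )

def handler(event, context=None):
    """Lambda entry point: transform every record in the event payload."""
    rows = [transform_record(r) for r in event.get("records", [])]
    # In production: write `rows` to Snowflake, e.g. via executemany()
    # on a snowflake.connector cursor (connection details omitted).
    return {"row_count": len(rows), "rows": rows}
```

For example, `handler({"records": [{"id": "1", "name": " ada ", "amount": "3.456"}]})` returns `{"row_count": 1, "rows": [(1, "ADA", 3.46)]}`.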
We have an immediate requirement for a Data Analyst for our client. Please review the job description below and respond with your updated resume if you're interested.
Job Title: Data Analyst
Location: Texas (5 days onsite per month until August; 2 weeks onsite from September)
Experience: 8+ years
Requirements: health care domain, claims
Required Skills: A high-level understanding of all of the below is important; more of an analytical background than an engineering background. ETL data m
Hi, please share your updated resume if you are fine with the role below.
Role: Data Analyst
Location: Westlake, TX (onsite; only local candidates will work for this)
Special Instructions: Business Unit: Workplace Investing (but a health care product); location: TX only
Requirements: health care experience
Required Skills:
- ETL data movement knowledge
- Snowflake pipeline experience
- AWS cloud experience
- Some CI/CD
- Claims data experience
Nice to have: Azure could work instead of AWS
High level under
Innova Solutions is immediately hiring a Data Analyst.
Hybrid role: 3 days onsite, 2 days remote
Job Role: Data Analyst
Job Duration: Long term
Job Location: Plano, TX
Role and Responsibilities: The Claims Technology team is looking for a mid-level Data Architect/Analyst. As a Data Architect/Analyst, the candidate will work closely with business and technology partners to deliver solutions: a data consultant with deep programming knowledge and an understanding of complex data rel
Hi,
Position Title: Software Developer (Multiple Openings)
Location: Plano, TX
Duration: Full Time
Duties: Under supervision, will be responsible for the following:
- Requirement analysis, design, development, and testing activities
- High-level and low-level design documentation
- Develop functional specification documentation
- Perform data processing using ETL tools
- Code coverage using a TDD approach and unit testing
- Monitor project progress
- Code and test case reviews
- Work with development and testing te
About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 9
Role: ETL Python Developer
Location: Westlake, TX (Onsite)
Duration: Full Time
Job Description:
Skills: Python, IICS/PowerCenter, Google Cloud Platform, SQL
- Expert in Python scripting
- Good knowledge of ETL tools like IICS/PowerCenter
- Google Cloud Platform data engineer with SQL, Python, and strong communication skills
- Good knowledge of Python scripting and decent knowledge of UNIX/Bash scripting
- SQL queries are a mandatory skill
- Must have cloud exposure (Google Cloud Platform) and cloud ETL tools such as Dataflow
Qualifications:
- 15+ years of experience (including 7+ years in Azure) in Data Engineering, ETL, and Data Warehousing
- Excellent leadership, communication, and client-facing skills
- In-depth knowledge of Azure Data Factory, Databricks, SSIS, Azure Functions, Logic Apps, Azure Event Hubs, PySpark, Spark Streaming, ADLS Gen2, Azure Blob, Azure Synapse, Azure SQL, and SQL Server, plus good working knowledge of Tableau
- Must have executed at least 2 Azure Cloud Data Warehousing projects
- Shoul
The Expertise and Skills You Bring:
- Bachelor's degree or higher in Computer Science
- Minimum of 6-9 years of software testing experience
- 2-4 years of ETL process validation experience
- Ability to create and modify complex SQL queries
- 3-5 years of Java/Spring coding experience
- Experience with API automation (REST Assured, Postman)
- Familiarity with Behavior-Driven Development (BDD)/Cucumber
- A strong analytical mindset; ability to cut through vast det
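The ETL process validation called out above often comes down to reconciliation queries between source and target tables. A minimal sketch (the table and column names are hypothetical, and Python's built-in sqlite3 stands in for the warehouse database) of a row-count and missing-row check:

```python
# Sketch of an ETL validation check (hypothetical schema): compare row
# counts and detect rows present in the source but missing from the
# target. sqlite3 stands in for the warehouse here.
import sqlite3

def validate_load(conn: sqlite3.Connection) -> dict:
    """Return row counts and any source ids missing from the target."""
    cur = conn.cursor()
    src = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    missing = cur.execute(
        """SELECT s.id FROM src_orders s
           LEFT JOIN tgt_orders t ON t.id = s.id
           WHERE t.id IS NULL"""
    ).fetchall()
    return {"src": src, "tgt": tgt, "missing": [m[0] for m in missing]}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        "CREATE TABLE src_orders (id INTEGER);"
        "CREATE TABLE tgt_orders (id INTEGER);"
        "INSERT INTO src_orders VALUES (1), (2), (3);"
        "INSERT INTO tgt_orders VALUES (1), (2);"
    )
    print(validate_load(conn))  # row 3 was dropped by the load
```

The same anti-join pattern scales from this toy example to the complex SQL the posting asks for; only the join keys and filters change.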
Job Title: Machine Learning Architect
Location: Dallas, TX (Hybrid; 2 days work from home)
Key Responsibilities:
- Independently design, develop, and deploy NLP algorithms and models, focusing on text classification, entity recognition, sentiment analysis, and language generation
- Utilize Azure OpenAI services, including Azure Language Understanding (LUIS) and Azure Text Analytics, to tackle complex language-related challenges effectively
- Develop and maintain robust data processing pipelines for lar
Our client is looking for a Snowflake Data Management Specialist with a minimum of 8 years of experience specializing in analytical data warehousing. The ideal candidate will have at least 3 years of hands-on experience with the Snowflake cloud warehouse and its features, along with a good understanding of AWS cloud architecture and its services.
Key Responsibilities:
- Develop and analyze SQL and PL/SQL procedures for data integration
- Build ETL pipelines in and out of the data warehouse using Python and
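The "ETL pipelines in and out of the data warehouse" duty can be sketched as a small extract-transform-load loop. All names here are hypothetical, and sqlite3 again stands in for Snowflake so the pattern runs anywhere; against Snowflake itself the same executemany() batch-insert shape would go through a snowflake-connector-python cursor:

```python
# Minimal ETL sketch (hypothetical schema): extract rows from a CSV
# feed, transform them, and batch-load them into a warehouse table.
import csv
import io
import sqlite3

FEED = "id,amount\n1,10.5\n2,20.25\n"  # stand-in for an extracted file

def run_pipeline(conn: sqlite3.Connection, feed: str) -> int:
    """Load the feed into fact_sales; return the number of rows loaded."""
    rows = [
        (int(r["id"]), round(float(r["amount"]), 2))  # transform step
        for r in csv.DictReader(io.StringIO(feed))
    ]
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(run_pipeline(conn, FEED))  # prints 2
```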
Title: ETL & Data Warehousing Lead
Location: Dallas
Duration: 12 months
Qualifications:
- 15+ years of experience in ETL and Data Warehousing
- Excellent leadership and communication skills
- In-depth knowledge of the SSIS ETL tool and good working knowledge of Power BI
- Experience with data sources such as SAP and Salesforce
- Very good knowledge of SSIS (ETL tool), Azure Cloud, ADF, Azure Synapse Analytics, and Azure Event Hubs
- Must have executed at least 2 Azure Cloud Data
Realize your vision and join this company, which has been a leader in human health for the last 60 years. In our mind's eye, we all have a picture of what the future holds; this is your opportunity to see it in real life! This industry leader is going through a complete redesign of its systems and is expanding its team with the addition of an Enterprise Data Architect to support the data needs of the overall system redesign effort. You will work with the data from an enterprise perspective in
About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90
Senior Power BI Engineer / Alteryx / SQL
Dallas/Plano, TX; 12-month contract
Responsibilities / Job Description:
- Resources are requested to be in the Dallas/Plano, TX area; they will need to come into the client office once every 2 weeks
- Resources will work on data analytics and visualization projects, developing workflows and quality checks in Alteryx
- The team currently uses an ETL data cleansing tool called Alteryx, so knowledge of Alteryx is a must
- Ideally, resources will have some healthcare experience
Senior BI Develo
We are looking to hire a Sr. DevOps Lead/Architect for Dallas, TX / Pittsburgh, PA.
Required skill set:
Experience: 10-15 years, DevOps Lead in an IC role
Languages: Python, shell scripts, SQL, PySpark, Java (optional), React or AngularJS
Databases: HDFS, Oracle, Teradata, big data, Parquet files, ELK, graph DB
OS: Linux, Unix, Windows; working on secured VM setup and clusters
Tools: OpenShift Container Platform, Jenkins, Kubernetes, Bitbucket, PuTTY, SecureCRT, ETL tools like Teradata, info
- 7+ years relevant and progressive data engineering experience
- Deep technical knowledge and experience in Databricks, Python, Scala, and the Microsoft Azure architecture and platform, including Synapse, ADF (Azure Data Factory) pipelines, and Synapse stored procedures
- Hands-on experience working with data pipelines using a variety of source and target locations (e.g., Databricks, Synapse, Data Lake, file-based, SQL database)
- Experience in engineering practices such as development, code refactoring, and l
Title: Google Cloud Data Architect
Job Description: Capgemini is seeking a Google Cloud Data Architect who will be responsible for a large-scale cloud migration program for a financial services client.
Responsibilities: The focus is on migrating data and analytical workloads from on-premises systems such as SQL Server and SSIS to Google Cloud Platform services including BigQuery, IAM, GCS, Dataflow, Composer, Dataproc, etc