Bilingual Project Manager (Japanese & English) | Remote | Long Term
Business-level fluency in both Japanese and English (verbal and written) to collaborate effectively with global teams and business users. Hands-on experience with relational databases (RDBMS) and working knowledge of Enterprise Application Integration. Strong understanding of high-level database architectures, with the ability to support data modeling and collaborate with data analysts, architects, and ETL/BI developers.
Best Regards, Praveen
Title: Project Manager (Bilingual English / Japanese) | Duration: 12 months | Location: Remote | Mode of Interview: Video
Description: Bilingual Project Manager / Production Support (Bilingual English / Japanese). The Project Manager is part of the Treasury & HOMIS group within JRIA. This role plans and implements the Head Office MIS (HOMIS) application in coordination with other technical teams in JRIA, business departments in SMBC, and Tokyo application development/support teams. The role includes ongoing production support.
Data Analyst Juno Beach, FL/onsite 12+ months contract Proven experience as a Data Engineer or in a similar role. Strong knowledge of data warehousing concepts and best practices. Proficiency in SQL, Python, and ETL tools. Experience with AWS Redshift and DBT. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.
Job Title: Azure Data Engineer | Location: Remote
Job Description / Responsibilities: Design, develop, and maintain scalable data pipelines and architectures on Azure Cloud. Implement and manage Azure Data Lake solutions to store and process large volumes of data. Utilize Databricks for data processing, transformation, and analytics. Develop and maintain ETL processes using Azure Data Factory. Write efficient and optimized SQL queries to extract, transform, and load data. Work with relational databases.
Candidates should be able to carry out the job below and have 3+ years of experience in the following technical skills: 1. Advanced SQL 2. ETL / data warehousing 3. Programming (Python/Java) 4. Agile methodologies 5. BDD/TDD. * Design and develop automated data quality tests: create and maintain automated test scripts and frameworks for data quality validation using tools such as Python, SQL, and data quality testing frameworks. * Develop data profiling and validation rules to identify data quality issues.
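To illustrate the kind of automated data quality validation this posting describes, here is a minimal sketch in plain Python. The rule names, column names, and sample data are illustrative assumptions, not part of any specific framework the employer uses.

```python
# Sketch of rule-based data quality checks: each rule returns the indices
# of rows that fail, and run_checks() collects failures per rule name.
# Column names ("customer_id", "age") and thresholds are hypothetical.

def check_not_null(rows, column):
    """Return row indices where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_in_range(rows, column, lo, hi):
    """Return row indices where a numeric `column` falls outside [lo, hi]."""
    return [
        i for i, row in enumerate(rows)
        if row.get(column) is not None and not (lo <= row[column] <= hi)
    ]

def run_checks(rows):
    """Apply all validation rules and report failing row indices per rule."""
    return {
        "customer_id_not_null": check_not_null(rows, "customer_id"),
        "age_in_range": check_in_range(rows, "age", 0, 120),
    }

if __name__ == "__main__":
    sample = [
        {"customer_id": 1, "age": 34},
        {"customer_id": None, "age": 29},   # fails the not-null rule
        {"customer_id": 3, "age": 150},     # fails the range rule
    ]
    print(run_checks(sample))
```

In practice these rules would typically run inside a scheduled test framework (e.g., pytest or a dbt test suite) against query results rather than in-memory dicts, but the structure — named rules producing per-row failures — is the same.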
Job Title: Azure Data Architect (Microsoft Fabric) | Duration: Full-time hire | Location: Columbus, Ohio (local candidates preferred, or East Coast)
Educational Qualification: Bachelor's degree or higher in Information Systems, Computer Science, or equivalent experience.
Description of Role: a) Acts as the single point of contact for the client for technical delivery. b) Design and develop a robust data architecture that supports the organization's data needs.
Hi, this is Rehan from Feuji. I have the below requirement with one of our clients; if you are comfortable with the job description, kindly share your updated resume with contact details. Job Title: SSIS Engineer | Type: 8-Month Contract to Start | Location: 100% Remote. Description: Strong proficiency (5+ years of experience) with Microsoft SQL Server Integration Services (SSIS) and MS SQL to extract and transform client data into our data pipeline, utilizing the SSIS ETL tooling. Should you be interested, please reach out.
Qlik Developer/Architect | Location: Remote | Very Long Term Contract
* Qlik system admin tasks
* Provide technical expertise for all related Qlik hardware and software
* Requirements gathering
* Analysis of data sources
* Creation of data models
* Dashboard design
* Dashboard development
* ETL development
* Testing (unit, user acceptance, and quality assurance)
* Creation of end-user documentation
* Creation of technical documentation
* Perform cutover activities
* Creation and delivery of training and training materials
* End-user training
**No Employer** Must have 15 to 20+ years of experience. Mandatory skills: Data Warehousing, Data Architecture, Data Modeling, solutioning, and consulting.
Job Description: Hands-on expertise with Snowflake and related technologies in the Snowflake ecosystem (dbt, Fivetran, etc.) for both batch and streaming use cases. Must have consulting and solutioning experience and play the role of an SME on the Snowflake ecosystem.
Roles & Responsibilities: Design and develop ETL pipelines and data warehouses using Snowflake.
Required Skills & Experience: 5+ years of experience in SQL development and data engineering. 3+ years of hands-on experience with Snowflake. 2+ years of experience with Control-M job scheduling. Experience with cloud platforms (AWS, Azure, or Google Cloud Platform). Proficiency in scripting languages (Python, Shell). Understanding of data modeling, ETL best practices, and performance tuning. Strong troubleshooting and debugging skills. Familiarity with DevOps tools (Git, Jenkins, CI/CD pipelines).
Sensiple Inc is a New Jersey corporation with over two decades of expertise in technology-driven solutions, specialising in Customer Experience, Contact Center Solutions, Digital Transformation, Cloud Computing, and Independent Testing. With an expert team that has enriched experience in executing and developing sustainable IT strategies in Healthcare, Technology, Retail, Logistics, Education, Telecommunications, Government, and Media, we help our diverse customers envision the future.
Client located in South Tampa is looking for a BI Engineer. This role will be on-site every day for the first 90 days and then move to 1-2 days remote (hybrid). Summary: The BI Engineer will be responsible for BI & Analytics design, development, testing, and problem resolution of software and web applications. The scope of work ranges from small system enhancements to major system projects. Applications may include custom-developed software, commercial packaged software, or open-source software.
Position: Qlik Developer/Architect | Location: Remote | Duration: 8 Months
About R Systems: R Systems is a leading digital product engineering company that designs and develops chip-to-cloud software products, platforms, and digital experiences that empower its clients to achieve higher revenues and operational efficiency. Our product mindset and engineering capabilities in Cloud, Data, AI, and CX enable us to serve key players in the high-tech industry, including ISVs, SaaS, and Internet companies.
Requirements: 5+ years of experience in Master Data Management (MDM) development. Strong expertise in ETL tools such as Informatica and MuleSoft. Strong knowledge of data modeling concepts, including entity-relationship modeling and dimensional modeling. Experience with data governance, including policies, processes, and standards. Experience with data quality management, including profiling, cleansing, and standardization. Strong problem-solving and analytical skills, with the ability to troubleshoot.
We are seeking a dedicated and highly skilled Power BI Developer to join our team. The primary responsibility of this role is to develop and maintain business intelligence solutions that convert raw data into actionable insights. The ideal candidate will be proficient with Power BI tools, have a strong understanding of data analysis, and be passionate about creating meaningful visualizations. Knowledge of member enrollment in a payor setting will be an advantage. Key Responsibilities: Data Analysis: Interact with stakeholders.
**No Employer** Must have 15 to 20+ years of experience. Mandatory skills: Spark, Scala, Kafka, streaming, technical architecture, Databricks, data engineering, data quality, data governance.
Job Description: Hands-on with Spark (DataFrame), Spark SQL, Databricks, AWS Glue, Scala/Spark or PySpark, and Kafka or another streaming technology. Good learning and cross-skilling ability. Design and architecture of big data systems. Experience in ETL, data governance, and data quality. Good understanding of data operations.
MDM Developer | Remote
Key responsibilities: Design and develop MDM solutions that meet business requirements and align with architectural standards. Collaborate with cross-functional teams to identify, capture, and maintain high-quality master data. Develop and maintain data integration processes using ETL tools such as Informatica and MuleSoft. Monitor and maintain data quality, ensuring that data is accurate, complete, and up to date. Develop and maintain data governance policies, processes, and standards.
Role: AFS to ACBS Conversion | Location: Remote
Description:
* Experience with data migration, ETL processes, and integration frameworks (e.g., Informatica)
* Understanding of the data models of AFS Level and FIS ACBS
* Experience with databases (Oracle, SQL Server) and SQL scripting
* Experience performing data mapping, transformation-rules documentation, and implementation
Note: This is a full-time, remote opportunity. Role: Integration Lead (IICS, Spark, Azure Data Lake) | Work location: Remote
Job Description:
* 10+ years of experience in Informatica ETL design and architecture, data analysis, and data management
* Demonstrated expertise in designing and implementing data architecture solutions using Informatica Cloud and Azure
* Implemented an Informatica-based ETL solution fulfilling stringent performance requirements
* Experience in designing and implementing ETL processes
Role: Xactly Developer | Location: Remote | Duration: 6 to 12 months
Responsibilities: Develop, implement, and maintain Xactly Connect integrations. Collaborate with business analysts and stakeholders to gather requirements and translate them into technical specifications. Design and develop data integration solutions using Xactly Connect. Ensure data accuracy and integrity in all Xactly Connect processes. Troubleshoot and resolve issues related to Xactly Connect integrations. Perform regular system maintenance.