HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global you'll collaborate with experts. You'll join successful teams contributing to our clients' success. You'll work side by side with our clients and have long-term opportunities to advance your career with the latest emerging technologies. At HTC Global Services our consultants have access to a comprehensive benefits package. Benefits can include Paid-Time-Off, Paid Holidays, 401K matching, Life an
Job Title: Lead Google Cloud Platform Data Engineer
Location: Minneapolis, MN (Onsite)
Position type: Contract
Key Role Requirements:
- Bachelor's degree in computer science or a related field
- 10+ years of experience, with proven experience in data platform and big data engineering
- Experience with ETL (DataStage) and Control-M
- Hands-on experience in SQL is mandatory, with the ability to quickly adapt to Google Cloud Platform's cloud-native SQL (BigQuery)
- Hands-on experience in Python is mandatory
- Experience
- Bachelor’s Degree in Computer Science or a related discipline
- 5+ years of applicable engineering experience
- Experience in Google Cloud Platform, BigQuery
- Strong proficiency in Python with an emphasis on building data pipelines
- Ability to write complex SQL to perform common types of analysis and aggregations
- Experience with Apache Airflow or Google Cloud Composer
- Detail-oriented; documents all work
- Ability to work with others from diverse skill sets and backgrounds
Good to have: Google Cloud Platform
Onebridge is a consulting firm with an HQ in Indianapolis and clients dispersed throughout the United States and beyond. We have an exciting opportunity for a highly skilled Google Cloud Platform Data Engineer to join an innovative and dynamic group of professionals at a company rated among the top “Best Places to Work” in Indianapolis since 2015.
Google Cloud Platform Data Engineer | About You
As a Google Cloud Platform Data Engineer, you are responsible for designing, building, and maintaining
Role: Google Cloud Platform Data Engineer
Duration: 6 months (contract-to-hire)
Location: Dallas, TX (F2F interview)
Must-have skills:
- Google Cloud Platform
- Python and SQL
- Ability to test their own code (production environment)
- CI/CD
- Data warehousing
Looking to hire a Senior Data Engineer for a contract-to-hire position. This position will be hybrid, based in Dallas, TX, requiring in-office work 3 times a week.
Position Summary: The Senior Data Engineer serves as a developer for data extract, transform, and load
Role: Data Engineer – Google Cloud Platform (Full Time)
Location: Dearborn, MI (100% Remote)
- Overall experience: 5+ years
- Google Cloud Platform: at least 1 year of hands-on experience
- BigQuery: hands-on experience; a good understanding of BigQuery is a must
- Dataflow: good understanding of the basics
- SQL: solid understanding of writing SQL queries
- Kafka: experience with both publishing and subscribing
- Data ingestion and transformation: hands-on experience
- Java / Spring Boot: mid-level experience
Google Cloud Platform Data Engineer – Remote
Need a super strong candidate: Google Cloud Platform is a must, and PySpark and Python are must-haves. 12+ years of experience required.
- Bachelor’s degree and 9+ years of industry experience in software development, data engineering, business intelligence, data science, or a related field, with a track record of manipulating, processing, and extracting value from large datasets
- 9+ years of experience as a Data Engineer or
Role: Google Cloud Platform Data Engineer
Location: Atlanta, GA or Tampa, FL
Description:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
- Proven experience with Google Cloud Dataflow and Apache Kafka in a professional setting
- Proficiency in programming languages like Java, Python, or similar languages
- Strong understanding of data streaming concepts and distributed computing
- Familiarity with cloud computing platforms, particularly
We have an immediate requirement with our direct client. If you are interested, please send me your updated resume along with the following details:
- Expected hourly rate:
- Availability:
- Current location:
- Visa status:
- Primary contact details:
Role: Sr. Google Cloud Platform Data Engineer with Machine Learning
Location: Irving, TX (Hybrid – need to go to the office 2–3 times/week on an as-needed basis)
Duration: 12+ Months
Required Skills: Strong experience in Spark; need to be able to tune Spark
Need a super strong candidate: Google Cloud Platform is a must, along with PySpark and Python.
SUMMARY OF ESSENTIAL JOB FUNCTIONS
- Design and develop analytical models and be the face to the data consumers
- Perform data curation to meet the business requirements
- Build batch and streaming data pipelines
- Develop processes for automating, testing, and deploying your work
- Identify risks and opportunities of potential logic and data issues within the data environment
- Collaborate effectively with the global team
Role: Google Cloud Platform Data Engineer with Spark and Scala
Location: Remote
Skills required:
- Spark, Scala – 50%
- Google Cloud Platform, Dataproc, GCS, data lake – 25%
- Airflow, Hive, BigQuery – 25%
Description: UST Global® is looking for a highly energetic and collaborative Senior Data Engineer (10+ yrs) for a 12-month engagement.
Responsibilities: As a Senior Data Engineer, you will:
• Design and develop big data applications using the latest open source technologies.
• Working in an offshore model with managed outcomes is desired.
• Develop logical and physical data models for big data platforms.
• Automate workflows using Apache Airflow.
• Create data pipelines using Apache Hive, Apache Spark, Scala, and Apache Kafka.
• Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
• Learn our business domain and technology infrastructure
Hello, we are looking for:
Position: Data Services Lead – Enterprise Analytics and Reporting
Location: Remote
Duration: 12-month long-term contract with possible extension
Skills Required:
- Minimum 8 years of recent experience in database development, relational databases, data lakes, data analytics solutions, and data warehousing, including cloud-native databases and modern cloud-based data warehouse systems
- Strong hands-on experience in Oracle SQL and PL/SQL (or other leading RDBMS systems
Role: Data Engineer + Google Cloud Platform + any compliance technology + Actimize Compliance Technology (AML Transaction Monitoring, CDD, Sanctions Screening, etc.)
Type: Contract
# of positions: 3
Experience: 7+ years
- Domain: Actimize on Cloud
- Experience in any compliance technology (AML Transaction Monitoring, CDD, Sanctions Screening, etc.)
- Experience with Actimize SAM on Cloud is a BIG PLUS, or Actimize on Prem
- Experience with any of the core banking platforms – FIS, Mission Lane, Fircosoft
- Experience
Job Title: Google Cloud Platform Data Engineer Lead
Location: Remote
Duration: Long Term
Job Description: We are seeking a highly skilled and experienced Data Engineer to join our team. As a Data Engineer, you will play a crucial role in designing, building, and maintaining our data infrastructure to support our data analytics and business intelligence needs. You will work with Google Cloud Platform, BigQuery (BQ), Python, and Azure Data Factory (ADF) to ensure that
Title: Google Cloud Platform Data Engineer
Location: Remote (occasional onsite visits to the New Jersey office may be required)
Type of position: Contract
Duration: 6 months with possible extensions
Job Description: SUMMARY OF ESSENTIAL JOB FUNCTIONS
- Design and develop analytical models and be the face to the data consumers.
- Perform data curation to meet the business requirements.
- Build batch and streaming data pipelines.
- Develop processes for automating, testing, and deploying your work.
- Identify risks and opportunities of potential logic and data issues within the data environment.
We are seeking experienced Software Engineers to join our team in helping with our clients' IT needs. Reach out to pavanatvaluesoftcorpdotcom.
Job Title: Data Engineer - Druid
Location: Remote
Primary Skills: Apache Druid, Google Cloud Platform, Linux, Apache Airflow, Google Compute Engine, GitLab, BigQuery
Secondary Skills: Python, PySpark, PL/SQL, Shell Script
Requirements:
- Apache Druid: Proficiency in Apache Druid for data ingestion, storage, and querying.
- Google Cloud Platform: Experience
Position: GCP Cloud Data Engineer (W2 - Remote)
Location: Dearborn, MI (Remote)
Duration: 12+ months
MOI: Phone & WebEx
Direct Client: FORD MOTORS
Note: 1. U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are NOT ABLE to sponsor H1-B at this time. H1B consultants who are willing to WORK ON OUR W2 (H1B transfer) are welcome.
Job Title: GCP Full Stack Engineer
Serve as a GCP Cloud Data Services Engineer in support of the Multi-Cloud vision - "Expanding Our Real Estate