Tredence is an advanced global analytics solutions company. We have been one of the fastest-growing private companies in the country for three straight years, according to the Inc. 5000. Our capabilities span Data Visualization, Data Management, Big Data, and Machine Learning. Our uniqueness lies in building scalable Big Data solutions, on-prem and in the cloud, in a highly cost-effective manner for our clients. We also bring IP and pre-built analytics solutions in data mining, BI, and Big Data. To learn more, please visit our website at www.tredence.com.
You will be responsible for partnering with internal teams such as finance, marketing, sales, and operations to build next-generation Business Intelligence solutions using Big Data technologies such as Hadoop, Scala, and Spark. Tredence offers a highly collaborative, agile work environment, so the ideal candidate will demonstrate excellent interpersonal skills and solid business knowledge in addition to deep technical expertise.
What you'll do:
- Support our clients in designing next-gen, future-state architectures/solutions for BI, ML, and analytics using cloud and on-prem technologies
- Design, build, and maintain optimal data pipelines that move data from sources to downstream systems such as data lakes, data warehouses, and other storage solutions
- Develop platforms that support business intelligence and analytics capabilities using a range of database, data warehousing, and other solutions
- Implement large-scale, automated machine learning pipelines and model management capabilities
- Advise on and implement solutions for data management, lineage, metadata management, data lifecycle policies, cataloging, and security
- Set up, implement, and support cloud infrastructure for data engineering and analytics use cases
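The pipeline work described above can be sketched in miniature. The following is a toy extract-transform-load example in plain Python (the `run_pipeline` helper, the column names, and the sample data are all hypothetical illustrations; pipelines at the scale this role describes would use Spark, Kafka, or similar):

```python
# Toy ETL sketch: extract rows from a CSV source, transform them,
# and load the result as newline-delimited JSON (a common lake format).
# Column names and data are hypothetical, for illustration only.
import csv
import io
import json

def run_pipeline(source_csv: str) -> str:
    """Read CSV text, keep valid rows, and emit newline-delimited JSON."""
    reader = csv.DictReader(io.StringIO(source_csv))
    out_lines = []
    for row in reader:
        # Transform step: cast amount to float; skip malformed records.
        try:
            row["amount"] = float(row["amount"])
        except (KeyError, ValueError):
            continue
        out_lines.append(json.dumps(row))
    return "\n".join(out_lines)

sample = "id,amount\n1,10.5\n2,not_a_number\n3,7\n"
print(run_pipeline(sample))
```

The same extract/transform/load shape carries over to distributed engines; only the execution substrate changes.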
What we're looking for:
- Minimum 4 years of experience on large-scale IT analytics projects.
- BS in Computer Science or Engineering.
- Strong experience with Java 7 or above.
- Experience with one or more of the following data processing technologies: Hadoop MapReduce, Spark, Kafka Streams, Flink, Storm, Apache Beam.
- Experience with Kafka (preferred) or one of the mainstream message queue systems: ZeroMQ, RabbitMQ, ActiveMQ.
- Experience with some of the following AWS tools (or their GCP/Azure equivalents): EMR, Kinesis, Firehose, Redshift, RDS, S3 API, Lambda, SQS.
- Experience with Presto, Hive, Impala, or a similar SQL-based engine for Big Data.
- Experience with Redis, Cassandra, MongoDB, or similar NoSQL databases.
- Experience with any of the following file/serialization formats: Parquet, Avro, Protocol Buffers.
Extra perks we offer:
- Take time for yourself: 21 days of PTO plus paid federal holidays.
- Stay physically and mentally healthy: choose from a variety of low-cost medical, dental, and vision plans to cover you and your loved ones, plus HSA/FSA options.
- Plan for your future: 401(k) with a 5% match, yearly salary increases, relocation assistance, and business travel reimbursement.