An EC (Enterprise Content) Data Engineer with Spotfire expertise builds robust data pipelines (ETL/ELT) and develops advanced, interactive visualizations to enable data-driven decisions. They manage big data infrastructure, optimize data architecture, and create actionable, high-performance dashboards with TIBCO Spotfire on AWS, typically using SQL, Python, and web development technologies.
Key Responsibilities
Data Pipeline Development: Design, construct, and maintain scalable ETL/ELT pipelines to ingest, transform, and load data from diverse sources into data lakes or warehouses.
Spotfire Visualization: Develop interactive, high-performance dashboards and reports in TIBCO Spotfire to visualize complex datasets, configuring visualization properties to create customized charts.
Data Architecture Optimization: Optimize big data systems such as AWS Redshift for maximum scalability, performance, and efficiency.
Data Quality & Governance: Implement data integrity checks, security measures, and data governance strategies to ensure data privacy and reliability.
Cross-functional Collaboration: Partner with data scientists, analysts, and business stakeholders to understand reporting needs and provide actionable, analytical data products.
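The pipeline and data-quality responsibilities above can be sketched minimally in Python. This is an illustrative example only: the table name, columns, and sample data are hypothetical, and a production pipeline would target a warehouse such as Amazon Redshift rather than the in-memory SQLite used here for demonstration.

```python
# Minimal ETL sketch: extract rows from a CSV source, apply a simple
# data-quality gate in the transform step, and load into a SQLite table.
# All names (orders, order_id, region, amount) are illustrative.
import csv
import io
import sqlite3

RAW_CSV = """order_id,region,amount
1,EMEA,120.50
2,AMER,
3,APAC,89.99
"""

def extract(source: str) -> list[dict]:
    """Read raw records from a CSV source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast types and drop records failing a basic integrity check."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # reject incomplete records; a real pipeline might quarantine them
        clean.append({"order_id": int(row["order_id"]),
                      "region": row["region"],
                      "amount": float(row["amount"])})
    return clean

def load(rows: list[dict], conn: sqlite3.Connection) -> int:
    """Insert transformed records into the target table; return the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :region, :amount)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 2 rows pass the quality check; the incomplete record is rejected
```

The same extract/transform/load separation scales up: in practice each stage would be a scheduled task reading from source systems and writing to the warehouse tables that Spotfire dashboards query.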
Required Skills and Qualifications
Technical Skills: Advanced proficiency in SQL and Python.
Visualization Tools: In-depth knowledge of TIBCO Spotfire, including DXP file configuration and data visualization, or comparable tools such as Tableau or Power BI.
Data Infrastructure: Experience with cloud platforms such as AWS, particularly Amazon Redshift.
Education: Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
Typical Experience Requirements
7+ years of experience in data engineering or analytics.
Proven experience in designing and implementing data warehouses.
Strong analytical, problem-solving, and communication skills.