Job Details
Big Data Application Developer with Databricks
Columbia, SC (Day 1 onsite - Locals only)
Must have a LinkedIn profile and 10+ years of experience.
Experience:
10+ years in application development, systems testing, or similar roles
Technical Proficiency:
Advanced understanding of development, QA, and integration methodologies
Strong command of programming languages, systems analysis, and software design
Experience working across mainframe, midrange, or PC/LAN environments
Familiarity with project management practices
Excellent analytical, problem-solving, and communication skills
Strong team collaboration and leadership capabilities
Required Technologies:
Cloud & Big Data (AWS):
AWS Step Functions (state machines), CDK, Glue, Lambda, CloudFormation, CloudWatch
S3, Glacier Archival Storage, DataSync, Lake Formation, AppFlow
RDS PostgreSQL, Aurora, Athena, Amazon MSK
Apache Iceberg, Spark, Python, TypeScript
Nice to Have:
AWS Redshift
Databricks (Delta Lake, Unity Catalog, Data Engineering)
AI/ML Tools: Amazon Bedrock, Amazon SageMaker (Unified Studio), RStudio / Posit
Kafka, Hive, Hue, Oozie, Sqoop
Git, GitHub Actions, IntelliJ, Scala
Work Environment:
Fast-paced, customer-focused, multi-platform development team
Project-oriented with potential 24/7 support based on business needs
Emphasis on collaboration, innovation, and continuous improvement
Thanks,
KK