**This position is on-site only. This is NOT a remote position.**
DISH Wireless is building a next-gen 5G network to disrupt the wireless industry and fuel innovation in transportation, health care, education, sustainability, city management, and agriculture.
We're driven by curiosity, pride, adventure, and a desire to win - and we're looking for people with boundless energy, intelligence, and an overwhelming need to achieve. Join us as we embark on our greatest adventure of all.
Opportunity is here. We are DISH Wireless.

**What you will be doing**
DISH intends to radically disrupt the wireless industry landscape and is in search of a Data Engineer who can help us make data-driven decisions to change the way the world communicates. This position will work as part of our Wireless Deployment team, responsible for the input and output of large amounts of data for systems and tools that help build DISH's greenfield 5G network faster and more cost-efficiently than has ever been done in the industry. This team is a key component of the success of DISH's wireless buildout.
- Acquire big data input from numerous partners. Key technologies may include Python, Elastic Logstash, Kafka, and NiFi.
- Normalize complicated data sources to convert potentially unusable data into a format that can be efficiently used by software and/or employees. Key technologies may include Spark, Lambda, Beam, Glue, and Flink.
- Aggregate data from multiple sources into a single location and format where correlation is possible. Key technologies may include SQL Server, MySQL, PostgreSQL, Cassandra, Impala, Kudu, and Athena.
- Validate large amounts of data to ensure data quality in a variety of different ways depending on the data and its consumer. Key technologies may include Python and Excel.
- Garner key insights from data and communicate these findings to key stakeholders to help them make data-driven decisions. Key technologies may include Tableau, Grafana, Kibana, and R.
- Maintain a CI/CD pipeline for our data software to ensure we keep quality high and time to market low. Key technologies may include GitLab.
- Learn industry standards and best practices surrounding data engineering, and implement them to continuously improve our team and systems.
- Create data that is valuable for the organization, either operationally to help drive decisions or financially to gain revenue. Key technologies may include Google Analytics and Qualtrics.
- Maintain Local, Dev, Staging, and Production environments for our data software. Key technologies may include AWS, GCP, VMware, Docker, and Kubernetes.
**Skills and experience**
Successful Senior Data Engineer candidates have:
- Strong communication skills to work with internal and external partners to manage data flow into our infrastructure.
- Ability to read, analyze, and interpret common metrics used to measure and monitor operational performance, define problems, collect data, establish facts, draw valid conclusions, and provide clear and concise communication with a wide audience of internal departments.
- Growth mindset: Proven ability to quickly learn new concepts, processes, software, and development ideas.
- Bachelor's degree in Computer Science, Computer Engineering, Applied Math, Statistics, or a related technical degree.
- At least two years of experience using ETL (Extract, Transform, and Load) concepts and techniques on messy data sets and large databases.
- At least two years of experience with SQL-like query language and table design.
Ideal Senior Data Engineer candidates have:
- Master's and/or Ph.D. in Computer Engineering, Computer Science, Applied Math or Statistics, Telecommunications, or a related field.
- Proven ability to implement data-driven solutions in a production environment using tools such as Hadoop, Impala, Hive, NiFi, Athena, Redshift, Elasticsearch, Bigtable, or Airflow.
- Experience using Cloud Native tools such as Kubernetes and Docker in private, public, and hybrid clouds.
- Experience applying machine learning and statistical modeling using common data science techniques such as clustering, logistic and linear regression, confidence intervals, and pattern recognition.
- At least two years of experience using Tableau.
- Ability to travel and be on-call 24/7.
Compensation: $85,900.00/Yr. - $136,190.00/Yr.
From versatile health perks to new career opportunities, check out our benefits on our careers website.
Employment is contingent on successful completion of a pre-employment screen, which may include a drug test.