A remote contract opportunity is available for a Data Engineer who will be responsible for understanding, preparing, processing, and analyzing data to make it valuable and useful for operations decision support.
- Minimum 3 years of designing, building, and operationalizing large-scale enterprise data solutions and
applications using one or more GCP data and analytics services in combination with third-party tools –
Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, etc.
- Minimum 1 year of hands-on experience analyzing, re-architecting, and re-platforming on-premises data
warehouses to data platforms on GCP using GCP or third-party services
- Minimum 1 year of designing and building production data pipelines from ingestion to consumption within
a hybrid big data architecture, using Java, Python, Scala etc.
- Minimum 1 year of designing and implementing data engineering, ingestion, and curation functions on
GCP using GCP-native services or custom programming
- Minimum 1 year of experience performing detailed assessments of current-state data platforms and
creating an appropriate transition path to GCP
- Hands-on GCP experience with a minimum of 1 solution designed and implemented at production scale
- 1 year of hands-on experience architecting and designing data lakes on GCP cloud serving analytics and
BI application integrations
- Minimum 1 year of experience designing and optimizing data models on GCP using GCP data
stores such as BigQuery and Bigtable
- Minimum 1 year of experience integrating GCP or third-party KMS/HSM with GCP data services to
build secure data solutions
- Minimum 1 year of experience introducing and operationalizing self-service data preparation tools (e.g.
Trifacta, Paxata) on GCP
- Minimum 1 year of experience architecting and implementing metadata management on GCP
- Architecting and implementing data governance and security for data platforms on GCP
- Designing operations architecture and conducting performance engineering for large-scale data lakes a plus
- Ability to craft and lead client design workshops and provide tradeoff analysis and recommendations
- 2+ years of experience writing complex SQL queries, stored procedures, etc.
- Google Cloud Platform certification is a plus
- Experience with CI/CD pipelines such as Concourse or Jenkins
- Experience with AtScale, Airflow (DAGs) a plus
Don’t meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every qualification. At Revel IT, we are dedicated to building a diverse, inclusive, and authentic workplace, so if you’re excited about this role, but your experience doesn’t align perfectly with every qualification in the description, we encourage you to apply anyway. You might be the right candidate for this or our other open roles!
ABOUT REVEL IT:
Revel IT (formerly known as Fast Switch) is one of the fastest-growing, privately held, IT Staffing companies in the nation. Our client base includes 32% of the Fortune 25. We have major offices in Dublin, OH, Phoenix, AZ, Los Angeles, CA, and Austin, TX and are rapidly expanding into new markets from coast to coast.
WHY REVEL IT:
- In addition to standard health and 401k benefits, we offer referral bonuses and training/continuing education opportunities.
- 5-year client retention: 99%
- No. 1 supplier with customers: 53%
- Top 3 supplier with customers: 77%
- Consultant retention: 94%
Revel IT is an Equal Opportunity Employer. Revel IT does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
To apply for this job please visit www.revelit.com.