Python Developer with GCP, Cloud, Data - Client W2

UNICOM Technologies Inc


City: Chandler, AZ
Contract type: Contractor

Duration: 24-month contract with possible conversion to FTE

Location: Chandler, AZ only - 3 days a week in office

Pay rate: $53.13/hr W2

Interviews: 1st round - a 60-minute MS Teams call in which Lead Engineers evaluate candidates; since this is an engineering role, it includes a technical aptitude review. 2nd round - a 30-minute MS Teams meeting.

In this contingent resource assignment you may:

  • Consult on or participate in moderately complex initiatives and deliverables within Software Engineering, and contribute to large-scale planning related to Software Engineering deliverables.
  • Review and analyze moderately complex Software Engineering challenges that require an in-depth evaluation of variable factors.
  • Contribute to the resolution of moderately complex issues and consult with others to meet Software Engineering deliverables, leveraging a solid understanding of the function's policies, procedures, and compliance requirements.
  • Collaborate with client personnel in Software Engineering.

Required Qualifications: 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, or education.

Job Description: Minimum 4 years of hands-on experience with:

  • Building data pipelines using the big-data stack: Hadoop, Hive, PySpark, Python
  • Amazon AWS S3: object storage, security, and data service integration with S3
  • Data modeling and database design
  • Job scheduler: Autosys
  • PowerBI, Dremio
  • Unix/shell scripting, CI/CD pipelines
  • Exposure to GCP cloud data engineering is a plus

Manager Notes:

  • The contractors need to be proactive; they can't wait to be told what to do.
  • Must be accountable, in addition to having the technical skills.
  • The tech stack mentioned above covers the technologies being used to build data pipelines.
  • They need to model and design the data, build pipelines, apply logic to transform the data, and troubleshoot.
  • They should have a strong understanding of Autosys and experience implementing it.
  • Ability to automate using Spark, Python, and Hadoop/Hive.
  • Should have a fundamental background in database design (MySQL or any standard database).
  • Exposure to cloud data engineering is a big plus, but not required.
  • Financial services experience is a plus but not required; having domain knowledge is helpful.

Technical Assessment:

  • We need a clear understanding of the candidate's technical work experience; they need to be able to describe the work they have done.
  • Overall problem solving: given a problem, how efficiently does their thought process drive toward a solution?

Skill Highlights - please indicate the number of years of experience with each of the following skills:

  • Hadoop
  • Hive
  • PySpark
  • Python
  • Amazon AWS S3
  • Data modeling
  • Database design
  • Job Scheduler - Autosys
  • PowerBI, Dremio
  • Unix/shell scripting
  • CI/CD pipeline
  • GCP cloud data engineering (a plus)
