Develop Python software in a fun, startup-like environment. You will develop and manage large data pipelines using Python, PostgreSQL, and AWS services.
- Drive the advancement of data infrastructure by developing and implementing underlying logic and structure for how data is set up, cleansed, QA’d and stored.
- Architect, design and develop resilient pipelines using a variety of different technologies including Airflow.
- Automate, test, and harden all data workflows using tools such as AWS triggers and Lambda handlers.
- Bachelor’s Degree (or equivalent experience) in Computer Science or Data Science.
- 3+ years of experience processing data with Python, SQL, and other data tools.
- 2+ years of experience with back-end data systems that create, process, and clean data.
- 2+ years of experience developing automated ETL solutions.
- Experience using Python and relevant packages (e.g., Pandas) for data engineering tasks.
- Amazon Web Services (AWS) experience: RDS, S3, Lambda, Glue. Spark/Glue or similar experience is an important skill for this position.
- Experience maintaining and updating data pipelines.
- Ability to guide and provide technical leadership to other engineers working on data pipelines.
- Experience accessing data via Web Services and external/non-SQL datasets.
Nice-to-have skills (PLUS):
- Familiarity with wrangling data frames in tools such as Spark or Pandas
- Experience with data pipeline tooling such as Apache Airflow or similar
We offer amazing benefits for all our employees and try to make the process as easy as possible for any candidate interested in joining this amazing company.
- Remote work
- Telework cost support toward electricity and internet.
- Permanent, full-time position
- Work Monday through Friday. WE LEAVE EARLY ON FRIDAYS!
- Benefits above those required by Mexican law, including medical insurance with dental discounts.
- 100% Payroll scheme
- Career Path
- Cool offices and a collaborative environment
- American Company culture
- Health & Wellness program for employees