
Senior Data Engineer - Python, SQL, EC2

Remote, Oregon
Mavensoft is seeking applications for a Senior Data Engineer position from qualified, interested, and available candidates for a 12-month contract assignment with a leading company in Beaverton, OR. If you are interested and meet the requirements below, please email or apply online for immediate consideration, including your salary expectations.

Job Title: Senior Data Engineer - Python, EC2 & SQL
Duration: 12 Months
Location: 100% Remote

Key Skills: Python, SQL, Relational Database, Databases, Amazon EC2

Data Engineering Team:
The Consumer Services Data Engineering Team has a new data warehouse and needs a talented engineer to help drive the team across milestones and support our product roadmap.

Role responsibilities:
  • Lead the design and implementation of features in collaboration with team engineers, product owners, data analysts, and business partners using Agile/Scrum methodology
  • Contribute to overall architecture, frameworks and patterns for processing and storing large data volumes
  • Design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the EMR ecosystem
  • Build utilities, user defined functions, and frameworks to better enable data flow patterns
  • Research, evaluate and utilize new technologies/tools/frameworks centered around high-volume data processing
  • Define and apply appropriate data acquisition and consumption strategies for given technical scenarios
  • Build, incorporate, and drive teams to write automated unit tests and integration scripts using tools like the Python unittest framework
  • Work with architecture/engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and adhered to
  • Work across teams to resolve operational and performance issues and to deliver data for reporting in Tableau
  • Implement security control around sensitive data

Requirements:
  • MS/BS in Computer Science, or related technical discipline
  • 5+ years of industry experience and 5+ years of relevant big data/ETL data warehouse experience building data pipelines
  • 2+ years leading a team of data engineers, i.e., mentoring, improving processes, peer training, long-term best practices, coding standards
  • 5+ years of experience in Python and Snowflake, with strong programming experience in Python
  • Professional experience working with APIs to extract data for data pipelines
  • Extensive experience working with Hadoop and related processing frameworks such as Spark, Hive, Sqoop, etc.
  • Ability to architect, design, and implement solutions with AWS Virtual Private Cloud, EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, Amazon Simple Storage Service (S3), EMR and other AWS products, Hive, Athena
  • Ability to troubleshoot production issues and perform on-call duties as needed
  • Working knowledge of workflow orchestration tools like Apache Airflow
  • Hands on experience with performance and scalability tuning
  • Professional experience in Agile/Scrum application development using JIRA
  • Experience working in a public cloud environment, particularly AWS
  • Professional experience with source control, merging strategies and coding standards, specifically Bitbucket/Git
  • Professional experience in data design and modeling
  • Demonstrated experience developing in a continuous integration environment using tools like Jenkins, Bamboo, or TeamCity CI Frameworks.
  • Demonstrated ability to maintain the build and deployment process through the use of build integration tools
  • Working experience communicating with business stakeholders and architects
  • Demonstrated experience implementing security around sensitive data
  • Experience designing instrumentation into code and using and integrating with logging and analysis tools like log4Python, SignalFx, and/or Splunk
Required Soft Skills:
  • Desire to lead collaboratively with your teammates to come up with the best solution to a problem
  • Demonstrated experience and ability to deliver results on multiple projects in a fast-paced, agile environment
  • Excellent problem-solving and interpersonal communication skills
  • Strong desire to learn and share knowledge with others
  • Passion for data and a drive for excellence
  • Desire to learn and understand the business and communicate with business stakeholders to accomplish business rules transformations and data validation while coding
  • Understands the importance of data security and privacy
Nice to Have:
  • Experience with call center data at a global level


    Email your resume to: usjobs@mavensoft.com
    To learn more about Mavensoft visit us online at http://www.mavensoft.com/
